
SQLite as a cache store for static copies of dynamic pages - is it a good idea?

We're running our blog from a shared hosting account. Our host limits the allowed inodes/number of files on the account to 150,000. We've implemented our own caching mechanism that caches every page in full as soon as it is first accessed, so that subsequent requests are served from the cache. Unfortunately, the inode limit means we will soon be unable to store any more cached pages.

Fortunately, we have SQLite on our server. We have MySQL too, but our shared hosting account only allows a maximum of 25 concurrent connections from the Apache web server to the MySQL server, which is a major constraint. SQLite is said to be "serverless", so I believe it won't have that kind of limitation.

Given that, can I (and should I) use an SQLite table to store full cached copies of all the dynamic pages of our blog? The average cached page is around 125 KB, and I have around 125,000 cached pages and growing.

Will that introduce any bottlenecks that slow down delivery of cached pages out of the SQLite database file?

Will I be able to write new pages to the cache_table in the SQLite database while simultaneously delivering requested pages from the cache_table to site visitors?
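
For reference, this is roughly the access pattern I have in mind, as a minimal sketch in Python (the cache_table schema, column names, and helper functions here are just illustrative, not an existing implementation). It enables WAL journal mode, which is the SQLite feature that allows readers to keep serving cached pages while a single writer inserts new ones:

```python
import sqlite3
import time

# Hypothetical schema for the page cache; names are illustrative only.
SCHEMA = """
CREATE TABLE IF NOT EXISTS cache_table (
    url        TEXT PRIMARY KEY,
    body       BLOB NOT NULL,
    created_at INTEGER NOT NULL
);
"""

def open_cache(path="page_cache.sqlite"):
    conn = sqlite3.connect(path, timeout=10)
    # WAL mode lets concurrent readers serve pages while one writer
    # appends new cache entries, instead of blocking on the journal.
    conn.execute("PRAGMA journal_mode=WAL;")
    conn.execute(SCHEMA)
    return conn

def get_page(conn, url):
    # Return the cached HTML for a URL, or None on a cache miss.
    row = conn.execute(
        "SELECT body FROM cache_table WHERE url = ?", (url,)
    ).fetchone()
    return row[0] if row else None

def put_page(conn, url, body):
    # Insert or refresh the cached copy of a rendered page.
    conn.execute(
        "INSERT OR REPLACE INTO cache_table (url, body, created_at) "
        "VALUES (?, ?, ?)",
        (url, body, int(time.time())),
    )
    conn.commit()

if __name__ == "__main__":
    conn = open_cache()
    put_page(conn, "/posts/hello-world", b"<html>...rendered page...</html>")
    print(get_page(conn, "/posts/hello-world") is not None)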


It's not a good idea, because using SQLite may impact your website's performance (at least its response time). I'd recommend Memcached, or a NoSQL database as a last resort (you would need to test for any rise in response time).

But if you have no choice, SQLite will be better than MySQL, because its SELECT operations are faster.
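
If you do end up testing Memcached, a minimal sketch of the same page-cache pattern could look like the following (this assumes the pymemcache client and a memcached instance on the default local port; the key and TTL values are illustrative):

```python
from pymemcache.client.base import Client

# Assumed local memcached instance on the default port.
client = Client(("127.0.0.1", 11211))

def cache_page(url, html, ttl=3600):
    # Store the rendered page under its URL for `ttl` seconds.
    # Note: memcached limits values to 1 MB by default, which is
    # comfortable for ~125 KB pages.
    client.set(url, html, expire=ttl)

def get_cached(url):
    # Returns the cached page, or None on a miss.
    return client.get(url)

if __name__ == "__main__":
    cache_page("/posts/hello-world", b"<html>...rendered page...</html>")
    print(get_cached("/posts/hello-world") is not None)
```

Keep in mind that memcached is an in-memory cache, so entries can be evicted under memory pressure and are lost on restart; you would still need to be able to regenerate pages on a miss.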


I haven't measured that, because there has never been a need to calculate the maximum page generation time. I manage all pages statically in full, and that has always been a stable process without any trouble.

Our server load varies from 400 to 8,000 page requests per hour.

