Delayed MySQL queries, then bulk execution
I have a widget which displays several articles. Each article takes about 2 selects and 1 insert/update.
The worst-case scenario is a widget displaying 300 articles, which means about 900 queries plus the other setup queries for the widget, and this really slows the widget down. Now imagine the widget sometimes being loaded by 5 users simultaneously.
I was wondering whether I should collect all the queries for each widget and write them to a file, then have a cron job run periodically, executing those queries every 30 seconds.
How much difference would this make to performance? Also, does anyone have a better system or idea?
What do websites like YouTube use for caching video view counts?
I cannot optimize the widget any more. Let's say I want a solution for a delayed page counter.
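To illustrate, something like this is what I have in mind (a rough sketch; the log path, the `articles` table, and its `views` column are just placeholders):

```php
<?php
// On each page view: append the article id to a log file instead of
// hitting MySQL. This is one cheap filesystem write per view.
function record_view($articleId) {
    // FILE_APPEND + LOCK_EX keeps concurrent writers from interleaving lines.
    file_put_contents('/tmp/view_log.txt', $articleId . "\n", FILE_APPEND | LOCK_EX);
}

// Cron script, run every 30 seconds: aggregate the log and apply one
// bulk UPDATE per article instead of one UPDATE per page view.
function flush_views(PDO $pdo) {
    $log = '/tmp/view_log.txt';
    if (!file_exists($log)) {
        return;
    }
    // Rename first so new views go to a fresh file while this one is processed.
    $work = $log . '.processing';
    rename($log, $work);

    // Count how many times each article id appears in the log.
    $counts = array_count_values(file($work, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));

    $stmt = $pdo->prepare('UPDATE articles SET views = views + :n WHERE id = :id');
    foreach ($counts as $id => $n) {
        $stmt->execute([':n' => $n, ':id' => $id]);
    }
    unlink($work);
}
```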
Using a cron job to do this kind of heavy work is not a bad idea: just have your cron job generate the HTML output, and use that HTML from your PHP code when displaying the page (see the sketch after the list below).
This way:
- The page will be fast to display: just one static HTML file to include/read
- There will never be more than 1 user (the cron job) doing all the heavy querying.
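As a rough sketch of the pattern (the cache path and `render_widget_html()` are placeholders for your own widget code, not a real API):

```php
<?php
// cron_build_widget.php -- run from cron, e.g. once a minute:
//   * * * * * php /path/to/cron_build_widget.php
// Runs all the heavy queries once and writes the rendered HTML to disk.
$html = render_widget_html(); // placeholder for your existing widget rendering
file_put_contents('/var/cache/widget.html.tmp', $html);
rename('/var/cache/widget.html.tmp', '/var/cache/widget.html'); // atomic swap

// Then, in the page that displays the widget: no queries at all.
readfile('/var/cache/widget.html');
```

The write-to-temp-file-then-rename step is there so a visitor never reads a half-written cache file.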
But, seriously, 900 queries for a widget? 3 queries for a single article?
There is really something that should be improved here!