Processing Multiple RSS Feeds in PHP

I have a table of more than 15,000 feeds, and it's expected to grow. What I am trying to do is fetch new articles using SimplePie, synchronously, and store them in a DB.

Now I have run into a problem: since the number of feeds is high, my server stops responding and I am no longer able to fetch feeds. I have also implemented some caching, and I fetch odd and even feeds at different time intervals.

What I want to know is: is there any way of improving this process? Maybe fetching feeds in parallel? Or maybe someone can suggest a pseudo-algorithm for it.


15,000 Feeds? You must be mad!

Anyway, a few ideas:

  1. Increase the Script Execution time-limit - set_time_limit()
    Don't go overboard, but ensuring you have a decent amount of time to work in is a start.
  2. Track Last Check against Feed URLs
    Maybe add a field to each feed, last_check, and set it to the date/time of the last successful pull for that feed.
  3. Process Smaller Batches
    Better to run smaller batches more often. Think of it as being the PHP equivalent of "all of your eggs in more than one basket". With the last_check field above, it is easy to identify the feeds that have gone longest since their last update, and also to set a threshold for how often to process them (see the sketch after this list).
  4. Run More Often
    Set up a cronjob and process, say, 100 records every 2 minutes or something like that.
  5. Log and Review your Performance
    Keep logfiles and record stats: how many records were processed, how long since they were last processed, how long the script took. These metrics will let you tweak the batch sizes, cronjob settings, time limits, etc. to ensure the maximum number of checks is performed in a stable fashion.
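Here is a minimal sketch tying ideas 2–5 together. The table and column names (feeds, last_check), the one-hour threshold, and the batch size are all illustrative, and it assumes SimplePie is already loaded:

```php
<?php
// batch_fetch.php -- run from cron, e.g. 100 feeds every 2 minutes:
//   */2 * * * * php /path/to/batch_fetch.php

set_time_limit(110); // idea 1: a decent but bounded execution window

$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Ideas 2 + 3: pick the 100 feeds that have waited longest,
// skipping anything checked within the last hour (the threshold).
$stmt = $db->query(
    "SELECT id, url FROM feeds
     WHERE last_check < NOW() - INTERVAL 1 HOUR
     ORDER BY last_check ASC
     LIMIT 100"
);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $feed) {
    $start = microtime(true);

    $sp = new SimplePie();
    $sp->set_feed_url($feed['url']);
    $sp->init();
    // ... store $sp->get_items() in your articles table ...

    $db->prepare("UPDATE feeds SET last_check = NOW() WHERE id = ?")
       ->execute([$feed['id']]);

    // Idea 5: log timings so batch size and cron interval can be tuned.
    error_log(sprintf("feed %d: %.2fs", $feed['id'], microtime(true) - $start));
}
```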

Setting all this up may sound like a lot of work compared to a single process, but it will allow you to handle increased volumes, and it forms a strong foundation for any further maintenance tasks you might be looking at down the track.


fetch new articles using simplepie, synchronously

What do you mean by "synchronously"? Do you mean consecutively in the same process? If so, this is a very dumb approach.

You need a way of sharding the data to run across multiple processes. Doing this statically based on, say, the modulus of the feed ID or a hash of the URL is not a good solution - one slow URL would cause multiple feeds to be held up.

A better solution would be to start up multiple threads/processes which would each:

  1. lock list of URL feeds
  2. identify the feed with the oldest expiry date in the past which is not flagged as reserved
  3. flag this record as reserved
  4. unlock the list of URL feeds
  5. fetch the feed and store it
  6. remove the reserved flag on the list for this feed and update the expiry time

Note that if there are no expired records at step 2, the table should still be unlocked. What happens next depends on how you run the workers: as daemons, implement an exponential back-off (e.g. sleep for 10 seconds, doubling up to 320 seconds on consecutive empty iterations); as batches, simply exit.
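A rough sketch of one such worker, using a short transaction with SELECT ... FOR UPDATE in place of an explicit table lock; the reserved and expires_at columns and the one-hour re-check interval are illustrative:

```php
<?php
// One worker; start several of these in parallel.
// Assumed columns: id, url, reserved (0/1), expires_at (DATETIME).

$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$sleep = 10; // exponential back-off starts at 10 seconds

while (true) {
    // Steps 1-4: reserve one expired feed inside a short transaction.
    $db->beginTransaction();
    $row = $db->query(
        "SELECT id, url FROM feeds
         WHERE reserved = 0 AND expires_at < NOW()
         ORDER BY expires_at ASC
         LIMIT 1 FOR UPDATE"
    )->fetch(PDO::FETCH_ASSOC);

    if ($row) {
        $db->prepare("UPDATE feeds SET reserved = 1 WHERE id = ?")
           ->execute([$row['id']]);
    }
    $db->commit();

    if (!$row) {
        // No expired feeds: back off 10s, 20s, ... up to 320s
        // (or exit here instead, if running as a batch job).
        sleep($sleep);
        $sleep = min($sleep * 2, 320);
        continue;
    }
    $sleep = 10; // reset back-off after a successful claim

    // Step 5: fetch and store, outside the lock.
    $sp = new SimplePie();
    $sp->set_feed_url($row['url']);
    $sp->init();
    // ... store items ...

    // Step 6: release the reservation and schedule the next check.
    $db->prepare("UPDATE feeds
                  SET reserved = 0, expires_at = NOW() + INTERVAL 1 HOUR
                  WHERE id = ?")
       ->execute([$row['id']]);
}
```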


Thank you for your responses. I apologize for replying a little late; I got busy with this problem and later forgot about this post.

I have been researching this a lot and faced a lot of problems. You see, 15,000 feeds every day is not easy.

Maybe I am MAD! :) But I did solve it.

How?

I wrote my own algorithm. And YES! It's written in PHP/MySQL. I basically implemented a simple weighted machine-learning algorithm: it learns the posting times of a feed and then estimates the next polling time for that feed, which I save in my DB.

And since it's a learning algorithm, it improves with time. Of course, there are 'misses', but these misses are at least better than crashing servers. :)
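I can't paste the real thing here, but the core idea looks roughly like this. A simplified sketch only: the feed_stats table, the 0.3 weight, and the caps are illustrative, not my actual values:

```php
<?php
// Sketch: an exponentially weighted moving average of the gap
// between new items predicts when to poll the feed next.

const ALPHA = 0.3; // weight given to the newest observation

function updateSchedule(PDO $db, int $feedId, int $newItems, int $secondsSincePoll): void
{
    $stmt = $db->prepare("SELECT avg_interval FROM feed_stats WHERE feed_id = ?");
    $stmt->execute([$feedId]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    $avg = $row ? (float)$row['avg_interval'] : 3600.0; // default: 1 hour

    if ($newItems > 0) {
        // New items appeared: blend the observed posting gap into the average.
        $observed = $secondsSincePoll / $newItems;
        $avg = ALPHA * $observed + (1 - ALPHA) * $avg;
    } else {
        // A 'miss': nothing new, so stretch the estimate instead of retrying hard.
        $avg = min($avg * 1.5, 86400); // never slower than once a day
    }

    $db->prepare("UPDATE feed_stats
                  SET avg_interval = ?, next_poll = NOW() + INTERVAL ? SECOND
                  WHERE feed_id = ?")
       ->execute([$avg, (int)$avg, $feedId]);
}
```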

I have also written a paper on this, which got published in a local computer science journal.

Also, regarding the performance gain, I am getting a 500% to 700% improvement in speed compared to sequential polling.

How is it going so far?

I have a DB that has grown to terabytes in size. I am using MySQL. Yes, I am facing performance issues on MySQL, but they aren't severe. Most probably, I will move to some other DB or add sharding to my existing one.

Why did I choose PHP?

Simple, because I wanted to show people that PHP and MySQL are capable of such things! :)
