
Need to speed up my feed parsing and processing in PHP

I'm keeping myself busy working on an app that gets a feed from the Twitter search API, then extracts all the URLs from each status in the feed, and finally, since many of the URLs are shortened, checks the response headers of each URL to get the real URL it leads to. For a feed of 100 entries this process can take more than a minute! (I'm still working locally on my PC.) I'm initializing one cURL resource per feed and keeping it open until I've finished all the URL expansions. Though this helped a bit, I'm still worried I'll be in trouble when going live.

any ideas how to speed things up?
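For reference, the extraction step looks roughly like this (a sketch; the real feed structure and regex may differ from what's shown):

```php
<?php
// Sketch of the extraction step: pull every URL out of each status text.
// The $statuses shape (array of plain strings) and the regex are assumptions.
function extract_urls(array $statuses) {
    $urls = array();
    foreach ($statuses as $status) {
        if (preg_match_all('#https?://[^\s"<>]+#i', $status, $m)) {
            $urls = array_merge($urls, $m[0]);
        }
    }
    // Deduplicate so each short URL is only expanded once.
    return array_values(array_unique($urls));
}
```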


The issue is, as Asaph points out, that you're doing this in a single-threaded process, so all of the network latency is being serialized.

Does this all have to happen inside an http request, or can you queue URLs somewhere, and have some background process chew through them?
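If it helps, here's a minimal sketch of that queue idea using a flat file as the queue (a real deployment would more likely use a database table or a job queue such as beanstalkd; the file-based approach is just the simplest thing that demonstrates the shape):

```php
<?php
// Minimal flat-file queue sketch. enqueue_url() is called from the web
// request; a cron/CLI worker calls drain_queue() in the background.
function enqueue_url($queueFile, $url) {
    // FILE_APPEND + LOCK_EX so concurrent requests don't clobber each other.
    file_put_contents($queueFile, $url . "\n", FILE_APPEND | LOCK_EX);
}

function drain_queue($queueFile, $callback) {
    if (!file_exists($queueFile)) {
        return 0;
    }
    $urls = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($queueFile, '');   // truncate: the worker owns these now
    foreach ($urls as $url) {
        $callback($url);                 // e.g. expand the URL, store result
    }
    return count($urls);
}
```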

If you can do the latter, that's the way to go.

If you must do the former, you can do the same sort of thing.

Either way, you want to look at ways to chew through the requests in parallel. You could write a command-line PHP script that forks to accomplish this, though you might be better off writing such a beast in a language that supports threading, such as Ruby or Python.
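Before reaching for another language, note that PHP's curl_multi extension can run all the HEAD requests concurrently from a single process, so total time is roughly the slowest request rather than the sum of them all. A sketch (the timeout and redirect limits are illustrative choices):

```php
<?php
// Expand shortened URLs in parallel with curl_multi: every HEAD request
// is in flight at once instead of being serialized.
function expand_urls_parallel(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD: headers only
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // chase the redirects
        curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }
    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);  // wait for activity, don't busy-loop
        }
    } while ($running && $status === CURLM_OK);

    $resolved = array();
    foreach ($handles as $url => $ch) {
        // Final URL after redirects; fall back to the original on failure.
        $final = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
        $resolved[$url] = $final ? $final : $url;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $resolved;
}
```

This keeps everything in one process and one request, which fits the "must happen inside an http request" case.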


You may be able to get significantly increased performance by making your application multithreaded. Multi-threading is not supported directly by PHP per se, but you may be able to launch several PHP processes, each working on a concurrent processing job.
