
Reduce cURL Processor Usage

I'm working on a task that involves sending 6 sets of 8 requests per user, for a total of about 2000 users. They are all GET requests, used to send commands.

To speed up the sending, I've constructed 4 curl multi-handles, each holding 8 requests, firing them off one after the other and then moving on to the next user. The problem is that this eats 99% of my CPU while using only about 5 KB/s of my bandwidth. There are no leaks or anything, but sending all 96,000 requests lags badly, taking a good 3 hours on my dual-core AMD Phenom.

Are there any methods by which I can possibly speed this up? Using file_get_contents() instead of cURL ends up being 50% slower, but cURL uses only 5 KB/s and eats my CPU.
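For context, a pegged CPU with curl_multi is most often caused by a tight loop around curl_multi_exec(). The post doesn't show its loop, so this is an assumption, but a select-based version like the sketch below (PHP, with a hypothetical $urls array of the 8 command URLs for one user) lets the process sleep while transfers are in flight instead of spinning:

```php
<?php
// A minimal sketch of a select-based curl_multi loop, assuming PHP's
// cURL extension. curl_multi_select() blocks until a socket is ready,
// so the process sleeps instead of spinning while transfers are slow.

$urls = [/* ... 8 command URLs for one user ... */];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

$running = 0;
do {
    curl_multi_exec($mh, $running);
    if ($running > 0 && curl_multi_select($mh, 1.0) === -1) {
        // Some PHP builds return -1 from select immediately;
        // back off briefly so the loop does not busy-wait anyway.
        usleep(100000);
    }
} while ($running > 0);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```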


Have you tried using fopen() for your requests instead of cURL? This could also be putting a load on wherever you are sending the requests, and fopen() won't return until the web server finishes the request. Do you need the data back to present to the user? If not, can you run the queries in the background? The real question is why you are sending so many requests; it would be far better to consolidate them into fewer requests. You have a lot of variables in this setup that can contribute to the speed.
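If the responses really aren't needed, one way to run the requests "in the background" is a raw fire-and-forget socket write. A minimal sketch, assuming plain HTTP on port 80; fire_and_forget() is a hypothetical helper name, and note that some servers abort processing when the client disconnects early, so this needs testing against the receiving end:

```php
<?php
// A fire-and-forget GET: write the request and close without waiting
// for (or reading) the response, so PHP never blocks on the server.

function fire_and_forget(string $host, string $path, int $port = 80): void
{
    $fp = @fsockopen($host, $port, $errno, $errstr, 5);
    if ($fp === false) {
        return; // could not connect; inspect $errno/$errstr if needed
    }
    fwrite($fp, "GET {$path} HTTP/1.1\r\n"
              . "Host: {$host}\r\n"
              . "Connection: Close\r\n\r\n");
    fclose($fp); // deliberately skip reading the response
}

fire_and_forget('example.com', '/command?user=123&action=ping');
```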
