
cURL: opening multiple URLs

$query = 'SELECT * FROM `chat` LIMIT 0, 24334436743;';
$result = mysql_query($query);
while ($row = mysql_fetch_array($result)) {
    $URL = $row['url'];
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $URL);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    //curl_setopt($ch, CURLOPT_POSTFIELDS, "user=unrevoked clarity&randominfo=hi");
    curl_exec($ch);
    curl_close($ch);
}

Alright, the above snippet pulls a whole bunch of URLs from a database, and I am trying to send data to each of them. But it seems to gum the page up (even with only one or two URLs). Is there a built-in system to handle this, or something similar?


You can initialize multiple requests using the curl_multi_*() functions and then have them sent all at once. There is probably a limit to how many requests can be pooled, and the overall processing time will take as long as the slowest connection/server.

So your approach (many, many URLs at once) is still problematic. Maybe you can rewrite it to do the processing in the browser instead, starting multiple AJAX requests with some visual feedback.
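A minimal sketch of the curl_multi_* approach, assuming the URLs have already been read out of the database into an array (the fetch_all() helper name and the $timeout parameter are illustrative, not part of the original code):

```php
<?php
// Fetch many URLs concurrently with curl_multi_* instead of one
// blocking curl_exec() call per URL.
function fetch_all(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for socket activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    // Collect each response body, then tear the handles down.
    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

The total wall-clock time is roughly that of the slowest single transfer rather than the sum of all of them, which is the whole point of pooling.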


Requesting a URL over the network is an expensive operation, and even downloading a few will noticeably increase the latency of your page. Can you cache the contents of the pages in a database? Do you have to download the URLs server-side at all, or can you make the client do it, e.g. with an iframe?
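The caching idea could be sketched like this; the `url_cache` table, the cached_fetch() helper, and the injectable $fetch callback are assumptions for illustration, not part of the original code:

```php
<?php
// Cache downloaded page bodies in a table so the page only pays the
// network cost when an entry is missing or stale.
function cached_fetch(PDO $db, string $url, callable $fetch, int $ttl = 300): string
{
    $stmt = $db->prepare('SELECT body, fetched_at FROM url_cache WHERE url = ?');
    $stmt->execute([$url]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row && (time() - (int)$row['fetched_at']) < $ttl) {
        return $row['body']; // fresh enough: skip the network entirely
    }

    $body = $fetch($url); // only hit the network on a miss or stale entry
    $db->prepare('REPLACE INTO url_cache (url, body, fetched_at) VALUES (?, ?, ?)')
       ->execute([$url, $body, time()]);

    return $body;
}
```

With a scheme like this, the slow downloads can also be moved out of the page request entirely (e.g. into a cron job that refreshes the cache), so the page itself only ever reads from the database.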

