
PHP function file_get_contents stops execution for huge HTML page

I'm using the file_get_contents function to fetch and parse the HTML of a huge page (32,000 rows). It works for small/normal pages, but with this one it just stops.

It doesn't give any error. It doesn't reach the next line in my PHP script; it just stops. I tried increasing the timeout and using cURL instead, but it still does not work.

It works on my local XAMPP installation, but it does not work when I upload it to my hosting. Does anyone know which setting is misconfigured on my PHP hosting? I think it is some buffer issue.


It is possible that this is caused by the memory_limit directive.

Here is some additional information regarding the directive.
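
A quick way to test this theory is to raise the limit for this script alone and see whether the page then loads. A minimal sketch, assuming the page is fetched over HTTP; the URL and the 256M value are placeholders, so adjust them to what your host allows:

ini_set('memory_limit', '256M'); // per-script override; the value is an assumption

$html = file_get_contents('https://example.com/huge-page.html'); // placeholder URL
if ($html === false) {
    die('file_get_contents failed');
}

// Compare peak usage against the configured limit to confirm the theory.
printf("Peak memory: %.1f MB (limit: %s)\n",
    memory_get_peak_usage(true) / 1048576,
    ini_get('memory_limit'));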

To determine the cause, you should prepend code such as the following to your script (leave out the log statements if you wish to display errors in the browser instead):

// Show errors in the browser and also write them, at all error levels,
// to a log file next to this script.
ini_set('display_errors', 1);
ini_set('log_errors', 1);
ini_set('error_log', dirname(__FILE__) . '/error_log.txt');
error_reporting(E_ALL);
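
If memory_limit is indeed the culprit, the log will contain a fatal error along the lines of "Allowed memory size of N bytes exhausted". In that case, an alternative is to stream the page line by line instead of loading it all at once. A sketch, again with a placeholder URL (fopen on a URL needs allow_url_fopen enabled, just like file_get_contents):

$handle = fopen('https://example.com/huge-page.html', 'r'); // placeholder URL
if ($handle === false) {
    die('Could not open the URL');
}
// fgets() returns one line at a time, so only a single line is held in memory.
while (($line = fgets($handle)) !== false) {
    // parse $line here instead of the whole document
}
fclose($handle);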