Downloading large files (~400MB) stops prematurely in production, works fine on development server

Recently I ran into a problem with larger file downloads in PHP. PHP is running as CGI on a Zeus server. I tried everything, but all in vain, for example:

set_time_limit(0);                // remove the script execution time limit
ini_set('max_execution_time', 0); // the same limit, set via ini

The problem is that after downloading about 4-5 MB, the download stops without any warning. However, when I run the code locally, everything works like a charm. Please help me solve this.


Look in your php.ini file on the Zeus server and on your local box. Check the

upload_max_filesize = ??

Or the:

post_max_size = ??

values on both servers. See if they are different.
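
A quick way to compare is to dump the relevant limits from PHP itself on each machine. This is a minimal sketch (not from the original answer): run it once on each server and diff the output; the list of keys is just a reasonable guess at the settings that commonly cap large transfers.

&lt;?php
// Print ini values that commonly limit large transfers.
// Run on both the Zeus server and the local box, then compare.
foreach ([
    'upload_max_filesize',
    'post_max_size',
    'max_execution_time',
    'memory_limit',
    'output_buffering',
] as $key) {
    printf("%-20s = %s\n", $key, ini_get($key));
}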


This might be a memory limitation of the CGI process or some other limitation in the response delivery chain.

  • don't load the whole file into memory; echo file_get_contents(<file>) does exactly that (see the sketch after this list)
  • disable output compression for this request (in both PHP and the web server)
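
A minimal sketch of a download handler along those lines, assuming a plain file on disk ($path is a placeholder): it turns off PHP-level compression, drops any output buffers, and streams the file in fixed-size chunks instead of loading it all at once.

&lt;?php
$path = '/path/to/large.file';  // placeholder: your actual file

// Disable PHP-level output compression for this request.
ini_set('zlib.output_compression', 'Off');

// Drop any output buffers so each chunk reaches the client immediately.
while (ob_get_level() > 0) {
    ob_end_clean();
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . basename($path) . '"');

// Stream in 8 KB chunks instead of echo file_get_contents($path).
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();  // push the chunk out to the web server
}
fclose($fh);

Any web-server-side compression or buffering (e.g. in the Zeus configuration) would still need to be disabled separately.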

I suggest you also read this page.

Could you paste the code that sends the file?


I would also double-check:

post_max_size = ?

Best of luck!

