File download in PHP, memory limit problem?
A client of mine decided to move their website from a somewhat nice server to a... let's call it less nice server.
The problem is that there's a 40 MB file to be downloaded, and the memory limit on the server is 32 MB. To make it even more difficult for me, they don't allow fopen()...
Also, if I reduce the file size to 20 MB, it works fine.
So, my question is: what can I do, besides reducing the file size, to make this work?
Thank you
EDIT:
$fsize = filesize($file_path);
$path_parts = pathinfo($file_path);
$ext = strtolower($path_parts["extension"]);
switch ($ext) {
    case "pdf": $ctype = "application/pdf"; break;
    case "exe": $ctype = "application/octet-stream"; break;
    case "zip": $ctype = "application/zip"; break;
    case "doc": $ctype = "application/msword"; break;
    case "xls": $ctype = "application/vnd.ms-excel"; break;
    case "ppt": $ctype = "application/vnd.ms-powerpoint"; break;
    case "gif": $ctype = "image/gif"; break;
    case "png": $ctype = "image/png"; break;
    case "jpeg":
    case "jpg": $ctype = "image/jpeg"; break;
    default: $ctype = "application/force-download";
}
header("Pragma: public"); // required
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private", false); // required for certain browsers
header("Content-Type: $ctype");
header("Content-Disposition: attachment; filename=\"" . basename($file_path) . "\";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . $fsize);
ob_clean();
flush();
readfile($file_path);
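A note on the code above: readfile() itself streams the file to output in small internal chunks, so the memory limit is usually hit not by readfile() but by PHP's output buffering accumulating the whole response in memory. A sketch of the common fix, which closes every buffering level before streaming (the helper name send_file_unbuffered is mine, not from the original code):

```php
<?php
// Hypothetical helper: stream a file while making sure no output
// buffer can accumulate it in memory. Assumes headers are already set.
function send_file_unbuffered(string $file_path): int
{
    // Close every active output buffer so readfile() writes
    // straight to the client instead of into PHP memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    return readfile($file_path); // returns the number of bytes sent
}
```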
The code I saw on php.net:
// If it's a large file, readfile might not be able to do it in one go, so:
$chunksize = 1 * (1024 * 1024); // how many bytes per chunk
if ($size > $chunksize) {
    $handle = fopen($realpath, 'rb');
    $buffer = '';
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
} else {
    readfile($realpath);
}
Use readfile() instead. It'll stream the file in small chunks and handle all the background work to keep memory usage minimal.
You really want fpassthru(), but if the host disabled fopen(), they probably disabled fpassthru() too. In that case you either negotiate with the host or start looking for a better host. Otherwise you can't distribute big files at all.
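For completeness, this is roughly what the fpassthru() route looks like when the host does permit fopen() (a sketch, not from the original answer; the wrapper function passthru_file is mine):

```php
<?php
// Sketch: stream a file with fpassthru(), which hands the open handle
// to PHP's output layer instead of reading it into a PHP variable.
// Only works where fopen() is permitted.
function passthru_file(string $file_path): int
{
    $handle = fopen($file_path, 'rb');
    if ($handle === false) {
        return 0;
    }
    $sent = fpassthru($handle); // streams remaining bytes to output
    fclose($handle);
    return $sent;
}
```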
It sounds to me like the host is intentionally crippling the available features to reduce bandwidth costs and/or CPU usage. If they allow any files to be read at all, it's not about security rules.