Problem serving large (image?) files to Safari
Server setup: Apache 2.2.14, PHP 5.3.1
I use a PHP script to serve files of all types as part of an application with complex access permissions. Things have worked out pretty well so far, but then one of our beta users took a picture with a 10-megapixel digital camera and uploaded it. It's somewhere north of 9 MB, 9785570 bytes.
For some reason, in Safari (and thus far ONLY in Safari, I've reproduced this on 5.0.5) the download will sometimes hang partway through and never finish. Safari just keeps on merrily trying to load forever. I can't consistently reproduce the problem - if I reload over and over sometimes the download will complete and sometimes it won't. There's no apparent pattern.
I'm monitoring the server access logs, and in the cases where Safari hangs I see a 200 response with the appropriate filesize logged only after I navigate away from the page or cancel the page load, not before.
Here's the code that serves the file, including headers. When the download succeeds and I inspect the headers browser-side I see the content type and size have been set correctly. Is it something in my headers? Something in Safari? Both?
header('Content-Type: ' . $fileContentType);
header('Content-Disposition: filename=' . basename($fpath));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($fpath));
ob_clean();
flush();
session_write_close();
readfile($fpath);
exit;
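As an aside, the Content-Disposition line above omits a disposition type; RFC 6266 expects the form `attachment; filename="..."`. Here is a minimal sketch of the same headers built as plain strings so they can be inspected (the `buildDownloadHeaders()` helper is my own name, not part of the original script):

```php
<?php
// Sketch: build the response headers as strings; in the real script
// each one would be passed to header(). buildDownloadHeaders() is a
// hypothetical helper, not from the application above.
function buildDownloadHeaders($fpath, $contentType)
{
    return array(
        'Content-Type: ' . $contentType,
        // explicit disposition type, per RFC 6266
        'Content-Disposition: attachment; filename="' . basename($fpath) . '"',
        'Content-Length: ' . filesize($fpath),
    );
}

// usage:
// foreach (buildDownloadHeaders($fpath, $fileContentType) as $h) {
//     header($h);
// }
```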
FURTHER BULLETINS AS EVENTS WARRANT:
By artificially throttling the download speed to 256k/s -- that is, by chunking the file into 256k pieces and pausing one second between serving them, as follows --
$chunksize = 1 * (256 * 1024); // how many bytes per chunk
$size = filesize($fpath);
if ($size > $chunksize) {
    $handle = fopen($fpath, 'rb');
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    fclose($handle);
} else {
    readfile($fpath);
}
I was able to guarantee a successful display of the image file in Safari under arbitrary conditions.
A chunksize of 512k does not guarantee a successful display.
I am almost certain that the problem here is that Safari's image renderer can't handle data coming in any faster, but:
- I would like to know for certain
- I would also like to know if there's some other kind of workaround, like a special CSS webkit property or whatever, to handle large images, because 256k/second is kind of dire for a 10 MB file.
And just to pile on the weird, setting up a finer-grained sleep with usleep() results in problems at a sleep time of 500 ms but not 750 ms.
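For reference, the finer-grained usleep() variant looks like this. `serveThrottled()` is my own name for the helper, and the 750,000-microsecond figure is just the threshold observed above; the function returns the number of bytes echoed:

```php
<?php
// Sketch of the finer-grained throttle: the same chunk loop as above,
// but pausing with usleep() between chunks. serveThrottled() is a
// hypothetical helper name, not part of the original script.
function serveThrottled($fpath, $chunkBytes, $delayMicros)
{
    $sent = 0;
    $handle = fopen($fpath, 'rb');
    while (!feof($handle)) {
        $buffer = fread($handle, $chunkBytes);
        echo $buffer;
        $sent += strlen($buffer);
        if (ob_get_level() > 0) { // avoid a notice when no buffer is active
            ob_flush();
        }
        flush();
        usleep($delayMicros); // e.g. 750000 = the 750 ms that worked above
    }
    fclose($handle);
    return $sent;
}
```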
I did a little digging and found little that was specific, but I do see a lot of people asserting that Safari has issues honoring cache-control directives. One person asserts:
You don't need all those Cache Controls, just a max-age with Expires set in the past does everything all those headers you're using do [...] many of those Cache-Control headers you're using cause problems for Safari [...] Lastly, some browsers don't understand filename, they only understand name, which must be included in the Content-Type header line, never in the Content-Disposition line. [...]
( see last post in thread: http://www.codingforums.com/archive/index.php/t-114251.html OLD info, but you never know... )
So it may be worth commenting out some of your headers and seeing if there is an improvement.
(anecdotal) I also saw some older posts complaining about Safari both resuming an interrupted download by appending the whole file to the end of the partial one, and downloading endlessly, apparently counting bytes beyond the file length being sent.
You might want to try to "chunk" the file while reading it in.
There are numerous posts on PHP.net that explain ways to do that: http://php.net/manual/en/function.readfile.php
Try:
ob_start();
readfile($fpath);
$buffer = ob_get_clean();
echo $buffer;
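Note that this buffers the entire file in memory before sending it. If instead a stray output buffer is what's interfering with the transfer (a guess on my part, not something confirmed), the opposite approach is to close every active buffer before streaming, so nothing sits between readfile() and the client:

```php
<?php
// Sketch: close all active output buffers so readfile() streams
// directly to the client. dropAllOutputBuffers() is a hypothetical
// helper name; any buffered-but-unsent output is discarded.
function dropAllOutputBuffers()
{
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
}

// usage (hypothetical):
// dropAllOutputBuffers();
// readfile($fpath);
```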