Getting a 2GB file inside PHP?
I need to download a very large file via PHP. The last time I did it manually over HTTP, it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow.
Previously I have used
file_put_contents($filename, file_get_contents($url));
Will this be OK for such a large file? After downloading, I will want to untar the file and then analyse the various files inside the tarball.
file_get_contents() is handy for small files, but it's totally unsuitable for large ones. Since it loads the entire file into memory, you would need over 2 GB of RAM for each script instance!
You should resort to plain old fopen() + fread() instead, copying the stream in small chunks so memory usage stays constant regardless of the file size.
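A minimal sketch of the idea (the URL and destination path here are placeholders, not from your question):

<?php
// Stream the remote file to disk in 1 MB chunks; memory stays flat.
$url      = 'http://example.com/huge.tar';  // hypothetical source URL
$filename = '/tmp/huge.tar';                // hypothetical destination

$in  = fopen($url, 'rb');
$out = fopen($filename, 'wb');
if ($in === false || $out === false) {
    die('Could not open source or destination stream');
}
while (!feof($in)) {
    fwrite($out, fread($in, 1024 * 1024));
}
fclose($in);
fclose($out);

stream_copy_to_stream($in, $out) would perform the same chunked copy in a single call, if you prefer.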
Also, don't rule out a third-party download tool like wget (installed by default on many Linux systems) driven by a cron task. It's possibly the best way to automate a daily download.
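For instance, a crontab entry along these lines (the directory and URL are assumptions; -c lets wget resume an interrupted download):

0 2 * * * cd /data/downloads && wget -c http://example.com/huge.tar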
You will also have to adapt your php.ini: raise (or disable) max_execution_time, since a multi-hour download will otherwise be killed, and review memory_limit, although with chunked reads memory usage stays far below the file size. The upload-related settings only matter if the file is pushed to your server rather than pulled by it.
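If you'd rather adjust the limits at runtime than edit php.ini, something like this at the top of the script works (the values are illustrative):

set_time_limit(0);               // no execution time cap for a multi-hour download
ini_set('memory_limit', '256M'); // chunked reads need nowhere near 2 GB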