Uploading to Amazon S3 exceeds maximum execution time
I have a fairly simple problem and a very annoying error. I want to allow some users to upload images, which I will store on Amazon S3. I have written an upload script that works fine when I feed it small images, but when the images are large (~1 MB), the script stops.
I think the script waits for a response from Amazon and then times out: the image gets uploaded, but the rest of the upload script (inserting into the DB) is skipped.
I have come across this question, How to sequence events in PHP for uploading files to amazon S3, which is somewhat similar to my problem, but mine is a bit simpler (I hope).
I use Jumploader and the Amazon S3 class for PHP.
This is the line where the script stops and goes no further:
S3::putObject($full, 'bucket_name', $path, S3::ACL_PRIVATE)
Is there maybe some way of just initiating the upload from my server to S3 and then executing the rest of the code (so the upload is asynchronous)?
Increase the time limit:
set_time_limit($seconds);
http://www.php.net/set_time_limit
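For example, a minimal sketch of raising the limit just before the slow S3 call, reusing the names from the question ($full, $path, 'bucket_name'); the 600-second value and the placement of the DB insert are assumptions:

```php
<?php
require_once 'S3.php'; // the Amazon S3 class for PHP mentioned in the question
// S3 credentials assumed to be configured elsewhere, e.g. S3::setAuth(...)

// Allow this request up to 10 minutes; pick a value that comfortably
// covers your largest expected upload.
set_time_limit(600);

if (S3::putObject($full, 'bucket_name', $path, S3::ACL_PRIVATE)) {
    // Only reached once the transfer finishes within the limit:
    // insert the image record into the database here.
}
```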
If the delay is too long for the user to reasonably wait, add the job to a queue and use a scheduled task (cron job) to periodically run a PHP script that processes the pending uploads, as sketched below.
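A minimal sketch of that approach, assuming a hypothetical pending_uploads table and the same variable names as the question; the DSN, credentials, and table layout are illustrative only:

```php
<?php
// upload.php — runs during the user's request: just record the job and return quickly.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO pending_uploads (local_path, s3_path) VALUES (?, ?)');
$stmt->execute([$full, $path]); // $full and $path as in the question
```

```php
<?php
// process_uploads.php — run from cron, e.g. * * * * * php /path/to/process_uploads.php
require_once 'S3.php'; // credentials assumed configured, e.g. S3::setAuth(...)
set_time_limit(0);     // no limit for the CLI worker

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query('SELECT id, local_path, s3_path FROM pending_uploads') as $row) {
    if (S3::putObject($row['local_path'], 'bucket_name', $row['s3_path'], S3::ACL_PRIVATE)) {
        // Upload succeeded: remove the job and do the DB insert from the original script here.
        $pdo->prepare('DELETE FROM pending_uploads WHERE id = ?')->execute([$row['id']]);
    }
}
```

This keeps the user-facing request fast, while the cron worker can take as long as it needs to push large files to S3.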