
Processing large CSV files using PHP/MySQL

I am developing a PHP/MySQL application that involves processing CSV files, but the script always stops before the entire file is processed.

How can I optimize the system so that it reliably processes the whole file?

Note: I won't be doing the web hosting for this system, so I won't be able to extend the PHP maximum execution time.

Thanks


A couple of ideas.

  1. Break the file down into row sets that you know you can process in one shot, then launch multiple processes (a sketch follows the list).

  2. Break down the work so that it can be handled in several passes.
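
For idea 1, here is a minimal sketch of splitting the source CSV into fixed-size chunk files that each fit within one execution; the paths and the chunk size are assumptions for illustration, and each chunk can then be handed to a separate worker process:

```php
<?php
// Split a large CSV into chunk files of $chunkSize rows each.
$source    = '/path/to/data.csv';   // placeholder path
$chunkSize = 5000;                  // rows per chunk, tuned to one run's capacity

$in    = fopen($source, 'r');
$part  = 0;
$count = 0;
$out   = null;

while (($line = fgets($in)) !== false) {
    // Start a new chunk file every $chunkSize rows.
    if ($count % $chunkSize === 0) {
        if ($out) {
            fclose($out);
        }
        $out = fopen(sprintf('/path/to/chunk_%03d.csv', $part++), 'w');
    }
    fwrite($out, $line);
    $count++;
}

if ($out) {
    fclose($out);
}
fclose($in);
// Each chunk_NNN.csv can now be processed by its own process or cron run.
```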


Check out LOAD DATA INFILE. It's a pure MySQL solution.

You could issue this SQL from a PHP script, and the load can continue running after the script stops or times out. Or, better yet, schedule it as a cron job.
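
A minimal sketch of issuing LOAD DATA INFILE from PHP via PDO; the database credentials, table name (imports), and column list are placeholders:

```php
<?php
// Enable LOCAL INFILE so MySQL will read the client-side file.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);

$sql = "LOAD DATA LOCAL INFILE '/path/to/data.csv'
        INTO TABLE imports
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (col_a, col_b, col_c)";

// MySQL parses and inserts the rows server-side, which is far faster
// than inserting row by row from PHP.
$pdo->exec($sql);
```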


You don't need control over the config files to extend the maximum execution time. You can still call set_time_limit(0) in your code to make it run to the end. The only catch is if you are calling this from the browser: the browser may time out and leave the page orphaned. I have a site that generates CSV files which take a long time, and I push the process into the background by ending the session with the browser (flushing the output buffer), then send an email notification when the job is finished. A sketch of that pattern follows.
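
This is a minimal sketch of that pattern, assuming the email address is a placeholder; the exact flushing behaviour can vary with server configuration (e.g. output compression):

```php
<?php
set_time_limit(0);        // lift the execution-time limit for this script
ignore_user_abort(true);  // keep running even if the browser disconnects

ob_start();
echo 'Import started; you will be emailed when it finishes.';

// Tell the browser the response is complete so it closes the connection
// instead of waiting for the long-running job.
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// ... long-running CSV processing goes here ...

mail('you@example.com', 'CSV import finished', 'All rows processed.');
```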


Suggestion one: after you insert a row, remove it from the CSV file.

Suggestion two: record the last inserted CSV row in a file or in MySQL, and on the next run skip all entries up to that row.

Also, you can add a limit of 30 seconds per execution, or 100/1000/X rows per execution (whichever works best before the script terminates). That will work with either suggestion; suggestion two is sketched below.
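
A minimal sketch of suggestion two combined with a per-run row limit; the file paths, table, columns, and batch size are assumptions for illustration:

```php
<?php
$csvFile    = '/path/to/data.csv';       // placeholder path
$offsetFile = '/path/to/import.offset';  // stores last processed row number
$batchSize  = 1000;                      // rows per run, tuned to the time limit

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Read the row number we stopped at on the previous run (0 on the first run).
$offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$handle = fopen($csvFile, 'r');
$row = 0;

// Skip rows already imported by earlier runs.
while ($row < $offset && fgetcsv($handle) !== false) {
    $row++;
}

$stmt = $pdo->prepare('INSERT INTO imports (col_a, col_b) VALUES (?, ?)');

// Process at most one batch, then stop before the script is killed.
$processed = 0;
while ($processed < $batchSize && ($data = fgetcsv($handle)) !== false) {
    $stmt->execute([$data[0], $data[1]]);
    $row++;
    $processed++;
}

fclose($handle);

// Record progress so the next run (e.g. via cron) resumes where we left off.
file_put_contents($offsetFile, (string) $row);
```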
