Working with a large amount of data from MySQL
I am having trouble allowing users to download a large report from the database.
Users can browse the report on the website; I have set up a limit and pagination for those pages, so there is no problem there.
But now I am adding the functionality to download the whole report at once as a CSV file. I get a memory error saying I have reached the maximum memory allowed, and I don't have permission on the server to increase it. Does anybody know how I can make this possible? So far, just running the query, which returns 700k rows, stops my script.
PS: I have searched around Stack Overflow and so far can't find an answer to this. I am using PHP and MySQL to get the data.
Thanks in advance
Do not export all the data in a single request.
Apply a limit and offset when exporting the data from the database.
For example:
- On the first pass, export rows 0 to 9,999, append them to the CSV, and redirect to the same URL with offset = 10000 and limit = 10000.
- On the second pass, export rows 10,000 to 19,999, append them to the CSV, and redirect to the same URL with offset = 20000 and limit = 10000.
- And so on.
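A minimal sketch of that batched loop in PHP, assuming a PDO connection, a hypothetical `report` table, and an `export.php` / `export.csv` pair the script can write to (names and paths are placeholders, not from the question):

```php
<?php
// Sketch: export one batch per request, appending to a CSV on disk,
// then redirect to the same script for the next batch.
$pdo    = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$limit  = 10000;
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$stmt = $pdo->prepare('SELECT * FROM report LIMIT :lim OFFSET :off');
$stmt->bindValue(':lim', $limit, PDO::PARAM_INT);
$stmt->bindValue(':off', $offset, PDO::PARAM_INT);
$stmt->execute();

// Append this batch to the file instead of holding everything in memory.
$fh   = fopen('export.csv', 'a');
$rows = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($fh, $row);
    $rows++;
}
fclose($fh);

if ($rows === $limit) {
    // A full batch was written, so more rows may remain: fetch the next one.
    header('Location: export.php?offset=' . ($offset + $limit));
} else {
    // Last (partial) batch written: hand the finished file to the user.
    header('Location: export.csv');
}
```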
As per my comment on Gaurav's answer, you can't fetch that much data from the database in a single go; it doesn't matter how clever you are in how you manipulate it later.
You must either
- restrict the amount of data fetched from MySQL in a single operation, and potentially keep repeating this until you have the full data set. As Gaurav suggests, you'll need to keep the data outside of PHP's memory space in a file and then redirect to it (see the sketch below this answer). But given that the process is going to be slow already, this is not a scalable solution either;
How can I redirect if I have already sent ...?
This is trivial; if you're having problems understanding how to do this, then you're already massively out of your depth.
- alternatively, you could generate the report outside of the webserver's PHP, but if you don't have permission to configure the webserver then you probably won't be able to run programs either.
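As a rough illustration of keeping the rows out of PHP's memory space while the file is written, here is a sketch using an unbuffered mysqli query; the credentials, table name, and file path are assumptions, not taken from the question:

```php
<?php
// Sketch only: an unbuffered query streams rows from MySQL one at a time,
// so the 700k-row result set never has to fit in PHP's memory at once.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// MYSQLI_USE_RESULT = unbuffered result set.
$result = $db->query('SELECT * FROM report', MYSQLI_USE_RESULT);

$fh = fopen('report.csv', 'w');
while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);
}
fclose($fh);
$result->free();
$db->close();

// Redirect the browser to the finished file.
header('Location: report.csv');
```

The request still runs for as long as the whole export takes, which is exactly the slowness and scalability problem described above.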
Just like @symbcbean said, the process of creating a file and appending the data with each query offset is very slow, so I came up with an alternative that I thought I should share in case someone else is facing the same issue.
I have set up a cron job that creates the file with all the necessary data in it at night, since the data only changes a few times a week. So now I've removed the overhead of generating a new CSV every time someone needs to access it; instead, I simply redirect to the pre-made file and the work is done! A sketch of this setup is below.
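For what it's worth, a minimal sketch of that kind of cron-driven export; the script path, table name, and output location are hypothetical:

```php
<?php
// build_report.php -- intended to be run by cron, not through the webserver.
// Example crontab entry (rebuild the CSV every night at 02:00):
//   0 2 * * * php /path/to/build_report.php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// Stream the rows straight to disk with an unbuffered query.
$result = $db->query('SELECT * FROM report', MYSQLI_USE_RESULT);

// Write to a temporary file first, then rename it into place, so visitors
// never download a half-built CSV.
$tmp = '/var/www/html/reports/report.csv.tmp';
$fh  = fopen($tmp, 'w');
while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);
}
fclose($fh);
$result->free();
$db->close();

rename($tmp, '/var/www/html/reports/report.csv');
```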
Thank you @Gaurav & @symbcbean!
You can append the data to a file in batches of records. Something like:
- Get the first 10,000 records from MySQL.
- Write the first 10,000 records to the file.
- Get the second 10,000 records.
- Write the second 10,000 records to the file.
- Then create a link to the file and send it to the user.