Deal with big arrays in PHP [closed]
In the application I am developing, I need to get information from various sources (a MySQL database, Facebook, and in the future other social networks) and join it all into the same structure.
What I am doing right now is getting the data from my database and from Facebook as arrays, then using array_merge to join both arrays, and then iterating over the merged array to build a custom array formatted with the fields I want.
The structure itself is working, but I fear that a growing number of posts in my app and on the associated Facebook page will make the application really slow. I only show 10 records at a time (using array_slice), but before that I always need to fetch all the data, and because it comes from different sources with different formats, I don't see any way to limit the amount of data returned from each call.
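Roughly, the flow looks like this (getDbPosts(), getFacebookPosts() and the field names are placeholders, not my real code):

<?php
// Placeholder fetchers standing in for the real MySQL and Facebook calls.
$dbPosts = getDbPosts();        // all posts from the database
$fbPosts = getFacebookPosts();  // all posts from the Facebook page

// Join both sources, then reformat every record into one structure.
$all = array_merge($dbPosts, $fbPosts);

$formatted = array();
foreach ($all as $post) {
    $formatted[] = array(
        'id'      => $post['id'],
        'message' => $post['message'],
        'created' => $post['created'],
    );
}

// Only 10 records are shown, but everything was fetched and merged first.
$page = array_slice($formatted, $offset, 10);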
Any tips?
Is the data specific per call? You could preprocess the data: on the first request, obtain it and store it in a temporary table so that subsequent requests can load straight from that table. You could define a refresh rate and do a full reload once the refresh interval has passed. Also, if you have access to things like cron jobs, they might be the solution you're looking for.
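A minimal sketch of the refresh-rate idea, assuming a cache table like posts_cache (source, message, created, fetched_at), a 15-minute refresh interval, and a placeholder helper fetchAndNormalizeAllSources() for the expensive MySQL + Facebook work; all of these names and values are assumptions:

<?php
$refreshRate = 15 * 60; // seconds; pick whatever staleness you can tolerate

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Age of the newest cached row, in seconds (NULL if the cache is empty).
$age = $pdo->query(
    "SELECT TIMESTAMPDIFF(SECOND, MAX(fetched_at), NOW()) FROM posts_cache"
)->fetchColumn();

if ($age === null || $age > $refreshRate) {
    // Full reload: fetch from all sources, normalize, repopulate the cache.
    $pdo->exec("TRUNCATE posts_cache");
    $stmt = $pdo->prepare(
        "INSERT INTO posts_cache (source, message, created, fetched_at)
         VALUES (?, ?, ?, NOW())"
    );
    foreach (fetchAndNormalizeAllSources() as $post) { // placeholder helper
        $stmt->execute(array($post['source'], $post['message'], $post['created']));
    }
}

// Every request now pages straight from the cache table.
$rows = $pdo->query(
    "SELECT source, message, created FROM posts_cache
     ORDER BY created DESC LIMIT 10 OFFSET 0"
)->fetchAll(PDO::FETCH_ASSOC);

A cron job could run the reload branch on a schedule instead, so no user request ever pays for the slow fetch.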
I would do some unit testing and profiling first to find out whether the growing arrays are actually a problem.
If they are, I would divide the application in two parts: one that gets the data and parses it into the right format, and another that uses the data and shows the 10 records to the user. You could make a background app, for example in Python, for the data processing, and a PHP front end for the user-facing visualization.
HTH, regards.
Don't load all your data into memory. Use MySQL's LIMIT clause and your database wrapper's fetch_row method to keep down the number of records that you need to load into your script's memory.
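For the MySQL side of the data, that could look like this (a sketch using mysqli; the table and column names are made up):

<?php
// LIMIT keeps the result set to one page, and fetch_row() pulls one
// record at a time, so only 10 rows ever enter PHP memory.
$mysqli = new mysqli('localhost', 'user', 'pass', 'app');

$perPage = 10;
$page    = isset($_GET['page']) ? max(0, (int) $_GET['page']) : 0;
$offset  = $page * $perPage;

$result = $mysqli->query(
    "SELECT id, message, created FROM posts
     ORDER BY created DESC
     LIMIT $perPage OFFSET $offset"
);

while ($row = $result->fetch_row()) {
    // $row is a numerically indexed array: id, message, created
    echo $row[1], "\n";
}

The Facebook side can be limited the same way with the paging parameters of its API, so neither source ever has to be fetched in full.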