Large PHP serialized cache file over 100 MB, how to handle?
I have a problem I can't work around. I cache data in serialized form in a file, to be read by PHP 5 and unserialized when requested. Some of the data files are now large, 100 MB to 300 MB, and they contain serialized objects.
The problem is that retrieving this data takes forever, if it finishes at all; currently it fails by hitting the max execution time of 90 seconds, and raising that limit is not a good solution. I know this is definitely a "code better" situation, so I'm asking here.
I thought of retrieving the serialized data in small portions, but unfortunately serialized data has no line endings to split on. Maybe it's a noob question, but I am exhausted.
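To illustrate the dead end: a partial read of a single serialize() blob can't be unserialized on its own (a minimal sketch; the file name and the 1 MB figure are just placeholders):

$fh = fopen('bigcache.ser', 'rb');
$partial = fread($fh, 1024 * 1024);      // only the first 1 MB of the blob
fclose($fh);
var_dump(unserialize($partial));         // bool(false) plus a notice: a truncated blob is unusable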
Thank you.
EDIT: I solved this myself. Thanks everyone for your replies. This is how I did it; it's not the best, but it's a temporary workaround until 2020 A.D.
I "paged " the cache file.
The mutant code is roughly this:

// split the cached array into pages of 10,000 rows each, keeping the keys
// (the cache_page_N.ser names are just an example)
foreach (array_chunk($mydata, 10000, true) as $n => $page) {
    file_put_contents("cache_page_{$n}.ser", serialize($page));
}
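Reading it back is the reverse: load only the page that holds the rows you need instead of the whole 100-300 MB blob (a sketch, assuming the cache_page_N.ser naming above):

$n = 3;                                                        // whichever page holds the requested rows
$page = unserialize(file_get_contents("cache_page_{$n}.ser")); // a 10,000-row slice, not the whole cache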
EDIT 2: The basic idea was to reduce the file size in order to speed things up. I cut down the serialize/unserialize time by splitting the files by year/month. The serialized data is email headers, so I changed the serialization to split the data that way and read it back the same way. With small file sizes everything is at least operational (which is my job). Eventually it will be easier for me to migrate to a database and redo this properly.

Regardless of the holier-than-thou "you are wrong" comments (which are common on IRC/Usenet as well), sometimes you have to work with what you are given and make the best of it.
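A rough sketch of that year/month split (the 'date' field and the headers_YYYY-MM.ser names are assumptions, not my exact code):

// bucket the email headers by year/month, then write one small cache file per bucket
$buckets = [];
foreach ($headers as $id => $header) {
    $ym = date('Y-m', strtotime($header['date']));   // assumes each header row carries a parsable date
    $buckets[$ym][$id] = $header;
}
foreach ($buckets as $ym => $bucket) {
    file_put_contents("headers_{$ym}.ser", serialize($bucket));
}

A read then only unserializes the months that were actually requested.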
Hope it helps someone else.
An exact case where the cure turned out to be much worse than the disease, and a perfect example of why premature optimization is evil.
Thank you for providing the community with such an excellent example.
Please refer to my original question for the answer. It's resolved for my case.