simplexml_load_file() speed factors
I'm working on a web app in which each user has multiple non-local XML files that are downloaded and parsed via SimpleXML on each page (re)load. Each request takes a little less than a second on average, but with more than five or six files (which is likely), the load time becomes quite noticeable. So my question is: what factors go into the speed of the function, and are there any ways I could control and speed them up?
As for purely improving efficiency, HTTP headers that report the last-updated time are unreliable, and I don't think cron jobs would give me the 'live' results I'm looking for.
So do the factors lie mainly with the server I'm fetching the files from, or on my side? Is it the function itself? Obviously the size of each file affects the speed. Is there any way to compress the files before transferring them, and so speed up the downloads?
What are your reasons for having them absolutely live? Could you have them updated every 15 minutes?
Could you Cron them and cache them for 15 minutes at a time?
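A minimal sketch of what that cache could look like, assuming a writable `./cache` directory and a made-up helper name (`fetchAndCacheXml`); the TTL and layout are just illustrative:

```php
<?php
// Serve the XML from a local file cache and only hit the remote
// server when the cached copy is older than the TTL (15 minutes here).
function fetchAndCacheXml(string $url, int $ttl = 900): SimpleXMLElement
{
    $cacheFile = __DIR__ . '/cache/' . md5($url) . '.xml';

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return simplexml_load_file($cacheFile);
    }

    // Cache miss or stale: refresh from the remote server.
    $xml = file_get_contents($url);
    if ($xml !== false) {
        file_put_contents($cacheFile, $xml);
    }

    return simplexml_load_string($xml);
}
```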
You could make sure that whatever method you use to fetch them from the remote server sends an Accept-Encoding: gzip header (and decompress the response on your end).
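One way to do that, sketched with cURL (the function name is made up): passing an empty string to `CURLOPT_ENCODING` makes cURL advertise every encoding it supports and decompress the response automatically.

```php
<?php
// Fetch a remote XML document with gzip/deflate enabled and parse it.
function loadRemoteXmlGzipped(string $url): SimpleXMLElement
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_ENCODING       => '',  // sends Accept-Encoding: gzip, deflate
        CURLOPT_TIMEOUT        => 5,
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    return simplexml_load_string($body);
}
```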
You may want to try curl_multi_init() if you have cURL installed. It will perform any number of requests in parallel, so a page that makes 5 or 6 requests shouldn't be much slower than one making only 1 or 2, especially if you arrange your code to start those requests as early in your script as possible.
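Roughly like this (a sketch, not a drop-in solution; the function name and the assumption that `$urls` is a plain array of feed URLs are mine):

```php
<?php
// Fetch several XML feeds in parallel with curl_multi, then parse each one.
function loadXmlFeeds(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_ENCODING, '');  // accept gzip as well
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Run all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = simplexml_load_string(curl_multi_getcontent($ch));
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```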
With that said, it's still much worse than not performing any remote requests at all, obviously.