Reading 200 URLs automatically with varying intervals between requests
By calling the example URL below, my Java servlet caches one file at a time (identified by the corresponding "fileKey").
http://www.example.org/JavaServlet?fileKey=phAwcNAxsdfajsdoj&action=cache
Now I have 200 files I'd like to cache. Instead of calling all these URLs manually, I'd like to use fopen, curl or something else to go through all these calls automatically. Every call takes 3-8 seconds, so the script either has to wait until the previous call has returned the "cache read message", or make the calls concurrently.
Any ideas of how these dynamic URLs can be read automatically?
Thanks.
You don't provide much information about where the files are cached from. Is it a script that takes a file as input over POST? Have a look at the PHP cURL library. If they're local files, you could use PHP to iterate over the files in the directory of your choice and use cURL to upload them and retrieve the fileKey.
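If shell access is available, one concurrent option (a sketch of the idea, not necessarily what the answerer had in mind) is to drive the calls with the curl command-line tool via xargs. The key list, the parallelism level of 5, and the temp-file path below are all assumptions; only the first fileKey comes from the question's example URL:

```shell
#!/bin/sh
# Hypothetical key list: the first key is from the question's example URL,
# the rest are placeholders for the remaining fileKeys.
cat <<'EOF' > /tmp/filekeys.txt
phAwcNAxsdfajsdoj
EOF

# -P 5: run at most five curl processes at a time (assumed limit);
# -n 1: one fileKey per curl invocation; -I {}: substitute the key
# into the URL. --max-time 30 caps each 3-8 s call.
xargs -P 5 -n 1 -I {} \
    curl -s --max-time 30 "http://www.example.org/JavaServlet?fileKey={}&action=cache" \
    < /tmp/filekeys.txt || true
```

Because several transfers run at once, the total wall-clock time drops roughly by the parallelism factor compared to waiting 3-8 seconds per call in sequence.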
Is this what you mean?
Put the URLs into an array, loop over the array and fetch the URLs ...
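That loop could look roughly like this with the curl command-line tool (a minimal sketch; the single fileKey shown is the one from the question's example URL and stands in for the full list of 200, and the `build_cache_url` helper is just an illustrative name):

```shell
#!/bin/sh
# Build the servlet URL for one fileKey (hypothetical helper).
build_cache_url() {
    echo "http://www.example.org/JavaServlet?fileKey=$1&action=cache"
}

# Placeholder key list; substitute the real 200 fileKeys here.
for key in phAwcNAxsdfajsdoj; do
    # curl blocks until the servlet responds, so the next request only
    # starts after the previous "cache read message" has come back.
    curl -s --max-time 30 "$(build_cache_url "$key")" || true
done
```

Since each curl call blocks until it finishes, this naturally satisfies the "wait for the previous call" requirement without any explicit sleep between requests.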