PHP asynchronous multiple file write and read
I am using a cURL-based PHP application to make requests to another webserver that handles them asynchronously. What I am doing is creating files named <databaseid>.req that hold the info I will need when the response comes back; the same id is sent as the identification in the request. The requests are made as HTTP XML POSTs. The file is written using:
file_put_contents("reqs/<databaseid>.req", $data, FILE_APPEND);
What happens is that while the requests are being generated in bulk (about 1,500 per second), the responses start coming back from the webserver. Each response is caught by another script, which pulls the <databaseid> out of the response and opens the corresponding request file using:
$aResponse = file("reqs/<databaseid>.req");
Now what happens is that for about 15% of requests, the file() call fails and generates an entry in the Apache error log like this:
file(reqs/<databaseid>.req): failed to open stream: No such file or directory in <scriptname> on line <xyz>
A cleanup script that runs later has verified that the file did exist.
Any ideas?!!!
There are functions to handle simultaneous file access, such as flock(), but it's normally easier to simply use a database. Any decent DBMS has already solved this problem for you.
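For what it's worth, here is a minimal flock() sketch along those lines (the $databaseId and $payload variables are hypothetical, not taken from the question):

$path = "reqs/" . $databaseId . ".req";

// Writer: append under an exclusive lock so concurrent writers don't interleave.
$fp = fopen($path, "a");
if ($fp !== false) {
    if (flock($fp, LOCK_EX)) {
        fwrite($fp, $payload . "\n");
        fflush($fp);              // flush before releasing the lock
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}

// Reader: take a shared lock before reading the lines back.
$fp = fopen($path, "r");
if ($fp !== false) {
    if (flock($fp, LOCK_SH)) {
        $aResponse = array();
        while (($line = fgets($fp)) !== false) {
            $aResponse[] = $line;
        }
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}

Note that locking only guards against partial reads and writes; if the reader can fire before the writer has created the file at all, you still have to handle the failed fopen() somehow, which is another point in favour of a database.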