concurrent file read/write
What happens when many requests come in to read and write the same file in PHP? Do the requests get queued, or is only one accepted and the rest discarded?
I'm planning to use a text-based hit counter.
You can run into a race condition.
To avoid this, if you only need to append data, you can use
file_put_contents($file, $data, FILE_APPEND | LOCK_EX);
and not worry about your data integrity.
If you need a more complex operation, you can use flock() (suited to the classic reader/writer problem).
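As a rough illustration (the 'c+' open mode and the $fp and $count names are my assumptions, not part of the original answer), a locked read-modify-write could look like this:
//> A minimal sketch of a flock()-guarded read-modify-write counter
$fp = fopen($file, 'c+');                      //> open for read/write, create if missing
if ($fp !== false && flock($fp, LOCK_EX)) {    //> block until we hold an exclusive lock
    $count = (int) stream_get_contents($fp);   //> read the current value
    ftruncate($fp, 0);                         //> empty the file
    rewind($fp);
    fwrite($fp, (string) ($count + 1));        //> write the incremented value
    fflush($fp);
    flock($fp, LOCK_UN);                       //> release the lock
}
if ($fp !== false) {
    fclose($fp);
}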
For your PHP hit counter I suggest something like this:
//> Register this impression
file_put_contents( $file, "\n", FILE_APPEND|LOCK_EX );
//> Read the total number of impressions
echo count(file($file));
This way you don't have to implement a blocking mechanism yourself, and you keep both the system and your script lighter.
Addendum
To avoid having to count the array returned by file(),
you can make it even lighter:
//> Register this impression
file_put_contents( $file, '1', FILE_APPEND|LOCK_EX );
//> Read the total number of impressions
echo filesize($file);
Basically, to read the counter you just need the file size, since each impression adds exactly 1 byte to it.
No, the requests will not be queued: readers can get corrupted data, writers will overwrite each other, and the data will end up damaged.
You can try to use flock() and the 'x' mode of fopen().
It's not easy to code a good locking mutex, so try to find an existing implementation, or move the data from a file into a database.
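As a rough sketch of what an fopen('x')-based lock can look like (the lock-file path, retry interval, and variable names are assumptions, not from the answer):
//> 'x' mode fails if the file already exists, so only one process
//> can create the lock file at a time
$lockFile = '/tmp/counter.lock';               //> assumed path
$lock = @fopen($lockFile, 'x');
while ($lock === false) {                      //> someone else holds the lock
    usleep(10000);                             //> wait 10 ms and retry
    $lock = @fopen($lockFile, 'x');
}
//> ... critical section: read/update the data file here ...
fclose($lock);
unlink($lockFile);                             //> release the lock
Note that if the process dies inside the critical section, the lock file is left behind and has to be cleaned up by hand, which is one reason to prefer an existing implementation or a database.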
You can use flock() to get a lock on the file before reading from or writing to it. If other processes are holding a lock on the file, flock()
will wait until those locks are released.
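For the reading side, a shared-lock sketch might look like this (assuming $file is the counter file; the variable names are mine):
//> Acquire a shared lock so the read waits for any writer holding LOCK_EX
$fp = fopen($file, 'r');
if ($fp !== false && flock($fp, LOCK_SH)) {
    echo stream_get_contents($fp);             //> safe to read now
    flock($fp, LOCK_UN);
}
if ($fp !== false) {
    fclose($fp);
}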