
Concurrent Fwrite in PHP?

I have an fwrite call that writes a log entry for delayed insertion into the database.

We have a visit rate of 20,000 Visits per hour.

So that works out to roughly one fwrite every 0.18 seconds.

My question is: is it possible that PHP will miss some log entries when 2 or 3 visitors come in at the same time?

If so, how can I handle the concurrent writes?

My code is just a normal fopen / fwrite / fclose.
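
For reference, a minimal sketch of what such logging code might look like, assuming an append-mode open; the file path, delimiter, and logged fields are placeholders, not the asker's actual code:

```php
<?php
// Assumed shape of the "normal fopen / fwrite / fclose" logging described above.
function log_visit(string $line): void
{
    $fp = fopen('/var/log/app/visits.log', 'a'); // 'a' = append mode
    if ($fp === false) {
        return; // could not open the log file; the entry is silently dropped
    }
    fwrite($fp, $line . PHP_EOL);
    fclose($fp);
}

log_visit(date('c') . "\t" . ($_SERVER['REQUEST_URI'] ?? '-'));
```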


I would avoid this if I were you. If you must log something for each request, either use something like Gearman to queue the data for later processing, or store it in a database (or a NoSQL store) for later processing. Don't forget that databases were designed to solve this very problem; don't try to reinvent a database using log files.
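
As a rough illustration of the Gearman route, here is a hedged sketch that queues each visit as a background job instead of writing a file in the request; the server address, the job name `log_visit`, and the payload shape are assumptions, not anything from the question:

```php
<?php
// Queue the visit for later processing with the pecl/gearman extension.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730); // default gearmand port

$payload = json_encode([
    'time' => time(),
    'uri'  => $_SERVER['REQUEST_URI'] ?? '-',
]);

// doBackground() returns immediately; a separate GearmanWorker process
// registered for 'log_visit' would perform the real database insert later.
$client->doBackground('log_visit', $payload);
```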

Not to mention that 0.18s per write is likely a lot more expensive than an insert into a DBMS (MySQL for me typically completes writes within 0.01 seconds, depending on the table; MongoDB can be a LOT faster).
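
A minimal sketch of inserting the same data straight into MySQL with PDO, along the lines this answer suggests; the DSN, credentials, and the table/column names (`visit_log`, `visited_at`, `uri`) are placeholders:

```php
<?php
// Insert the log entry directly instead of writing it to a file.
$pdo = new PDO(
    'mysql:host=127.0.0.1;dbname=app;charset=utf8mb4',
    'app_user',
    'secret',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$stmt = $pdo->prepare('INSERT INTO visit_log (visited_at, uri) VALUES (?, ?)');
$stmt->execute([date('Y-m-d H:i:s'), $_SERVER['REQUEST_URI'] ?? '-']);
```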


How are you firing off the writes? If each request (one per visit) has an associated fwrite, you won't miss any. Perhaps they could become slow(er) as your web server queues / handles requests at peak traffic, but PHP is request driven, so for every page load the fwrite will execute.

I say you won't miss any because, unless you're doing some non-blocking I/O, your script is going to block on the fwrite call. In other words, fwrite will execute.
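
A related sketch, not part of this answer: if the real worry is interleaved lines from simultaneous requests rather than missed writes, taking an exclusive lock on the append keeps each entry intact. The file path and entry format here are assumptions:

```php
<?php
// Build one log line per request.
$entry = date('c') . "\t" . ($_SERVER['REQUEST_URI'] ?? '-') . PHP_EOL;

// FILE_APPEND adds to the end of the file; LOCK_EX takes an exclusive
// advisory lock for the duration of the write, so concurrent requests
// queue up briefly instead of mixing their bytes together.
file_put_contents('/var/log/app/visits.log', $entry, FILE_APPEND | LOCK_EX);
```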

You're going to have trouble getting any sort of concurrency on a request driven app written in PHP without doing some really funky stuff.

