
Queues And Wait Handles in C#

I've had the following code in my application for some years and have never seen an issue from it.

while ((PendingOrders.Count > 0) || (WaitHandle.WaitAny(CommandEventArr) != 1))
{
    lock (PendingOrders)
    {
       if (PendingOrders.Count > 0)
       {
           fbo = PendingOrders.Dequeue();
       }
       else
       {
           fbo = null;
       }
    }

    // Do some work if fbo != null
}

Here, CommandEventArr is made up of NewOrderEvent (an AutoResetEvent) and ExitEvent (a ManualResetEvent).

But I'm not sure if this is thread-safe (assuming N producer threads that all lock the queue before enqueuing, and one consumer thread that runs the code above). Also, we can assume that the Queue.Count property just returns an instance Int32 field from the Queue class (without volatile, Interlocked, a lock, etc.).

What's the usual pattern used with a Queue and an AutoResetEvent to fix this and do what I'm trying to do with the code above?

(Edited to change the question slightly after it was correctly pointed out that Queue.Count could do anything and is implementation-specific.)


Looks quite thread-safe to me; the WaitAny() will simply complete immediately because the event is already set. That's not a problem.

Don't break threading sync that works. But if you want a better mousetrap, you could consider Joe Duffy's BlockingQueue from his magazine article. A more general version of it is available in .NET 4.0: System.Collections.Concurrent.BlockingCollection<T>, with ConcurrentQueue<T> as its default backing store.
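For reference, here is a minimal sketch of what that could look like on .NET 4.0 (the Order type and member names below are placeholders, not taken from the question):

using System.Collections.Concurrent;

class Order { /* placeholder for whatever fbo actually is */ }

class OrderPump
{
    // BlockingCollection<T> wraps a ConcurrentQueue<T> here, giving FIFO order.
    private readonly BlockingCollection<Order> pendingOrders =
        new BlockingCollection<Order>(new ConcurrentQueue<Order>());

    // Producer threads: Add is thread-safe, so no explicit lock or event is needed.
    public void Enqueue(Order order)
    {
        pendingOrders.Add(order);
    }

    // Call this instead of setting an ExitEvent.
    public void Shutdown()
    {
        pendingOrders.CompleteAdding();
    }

    // Single consumer thread: blocks while the collection is empty and returns
    // after Shutdown() once the remaining items have been drained.
    public void ConsumeLoop()
    {
        foreach (Order fbo in pendingOrders.GetConsumingEnumerable())
        {
            // Do some work with fbo here.
        }
    }
}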


You are correct. The code is not thread-safe. But not for the reason you think.

The AutoResetEvent is fine, though only because you acquire the lock and retest PendingOrders.Count. The real crux of the matter is that you are calling PendingOrders.Count outside of a lock. Because the Queue class is not thread-safe, your code is not thread-safe... period.

Now, in reality you will probably never have a problem with this, for two reasons. First, the Queue.Count property is almost certainly designed to never leave the object in a half-baked state. After all, it will probably just return an instance variable. Second, the lack of a memory barrier on that read will not have a significant impact in the broader context of your code. The worst thing that will happen is that you will get a stale read on one iteration of the loop, and then the acquired lock will implicitly create a memory barrier and a fresh read will take place on the next iteration. I am assuming here that there is only one thread queueing items. Things change considerably if there are two or more.

However, let me make this perfectly clear. You have no guarantee that PendingOrders.Count will not alter the state of the object during its execution. And because it is not wrapped in a lock, another thread could initiate an operation on it while it is still in that half-baked state.
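For illustration, here is roughly what the classic pre-4.0 pattern looks like when every access to the queue, including Count, stays inside the lock, using Monitor.Wait/Pulse instead of an AutoResetEvent (the Order type and the shuttingDown flag are assumptions standing in for the real item type and the ExitEvent):

// Producer side: enqueue and signal under the same lock.
lock (PendingOrders)
{
    PendingOrders.Enqueue(obj);
    Monitor.Pulse(PendingOrders);   // wake one waiting consumer
}

// Consumer side: Count is only ever read while the lock is held.
while (true)
{
    Order fbo;
    lock (PendingOrders)
    {
        while (PendingOrders.Count == 0 && !shuttingDown)
        {
            Monitor.Wait(PendingOrders);   // releases the lock while waiting
        }
        if (PendingOrders.Count == 0)
        {
            return;   // shutting down and nothing left to process
        }
        fbo = PendingOrders.Dequeue();
    }

    // Do some work with fbo outside the lock.
}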


Using manual events...

ManualResetEvent[] CommandEventArr = new ManualResetEvent[] { NewOrderEvent, ExitEvent };

while ((WaitHandle.WaitAny(CommandEventArr) != 1))
{
    lock (PendingOrders)
    {
        if (PendingOrders.Count > 0)
        {
            fbo = PendingOrders.Dequeue();
        }
        else
        {
            fbo = null;
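            // Reset while still holding the lock, so a producer's Enqueue + Set
            // cannot slip in between the empty check and the Reset.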
            NewOrderEvent.Reset();
        }
    }
}

Then you need to ensure a lock on the enqueue side as well:

    lock (PendingOrders)
    {
        PendingOrders.Enqueue(obj);
        NewOrderEvent.Set();
    }


You should rely on the WaitAny alone for that, and ensure that the event gets signaled on every new order added to the PendingOrders collection:

while (WaitHandle.WaitAny(CommandEventArr) != 1)
{
   lock (PendingOrders)
   {
      if (PendingOrders.Count > 0)
      {
          fbo = PendingOrders.Dequeue();
      }
      else
      {
          fbo = null;

          //Only if you want to exit when there are no more PendingOrders
          return;
      }
   }

   // Do some work if fbo != null
}
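One thing worth keeping in mind with an AutoResetEvent: Set calls that arrive while the event is already set collapse into a single signal, so dequeuing exactly one item per wake-up can leave items sitting in the queue. A hedged variant that drains the queue on each wake-up instead of returning when it is empty (Order again stands in for the real item type):

while (WaitHandle.WaitAny(CommandEventArr) != 1)
{
    while (true)
    {
        Order fbo;
        lock (PendingOrders)
        {
            if (PendingOrders.Count == 0)
            {
                break;   // queue drained; go back to waiting
            }
            fbo = PendingOrders.Dequeue();
        }

        // Do some work with fbo outside the lock.
    }
}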