Producer Consumer queue does not dispose

I have built a producer/consumer queue wrapping a .NET 4.0 ConcurrentQueue, with ManualResetEventSlim signaling between the producing side (Enqueue) and the consuming side (a while loop running on dedicated threads). The queue looks like this:

public class ProducerConsumerQueue<T> : IDisposable, IProducerConsumerQueue<T>
{
    private bool _IsActive=true;

    public int Count
    {
        get
        {
            return this._workerQueue.Count;
        }
    }

    public bool IsActive
    {
        get { return _IsActive; }
        set { _IsActive = value; }
    }

    public event Dequeued<T> OnDequeued = delegate { };
    public event LoggedHandler OnLogged = delegate { };

    private ConcurrentQueue<T> _workerQueue = new ConcurrentQueue<T>();

    private object _locker = new object();

    Thread[] _workers;

    #region IDisposable Members

    int _workerCount=0;

    ManualResetEventSlim _mres = new ManualResetEventSlim();

    public void Dispose()
    {
        _IsActive = false;

        _mres.Set();

        LogWriter.Write("55555555555");

        // Wait for the consumer threads to finish.
        for (int i = 0; i < _workerCount; i++)
        {
            _workers[i].Join();
        }
        LogWriter.Write("6666666666");
        // Release any OS resources.
    }
    public ProducerConsumerQueue(int workerCount)
    {
        try
        {
            _workerCount = workerCount;
            _workers = new Thread[workerCount];
            // Create and start a separate thread for each worker
            for (int i = 0; i < workerCount; i++)
                (_workers[i] = new Thread(Work)).Start();
        }
        catch (Exception ex)
        {
            OnLogged(ex.Message + ex.StackTrace);
        }

    }
    #endregion

    #region IProducerConsumerQueue<T> Members

    public void EnqueueTask(T task)
    {
        if (_IsActive)
        {
            _workerQueue.Enqueue(task);
            //Monitor.Pulse(_locker);
            _mres.Set();
        }
    }

    public void Work()
    {
      while (_IsActive)
      {
          try
          {
              T item = Dequeue();
              if (item != null)
                  OnDequeued(item);
          }
          catch (Exception ex)
          {
              OnLogged(ex.Message + ex.StackTrace);
          }              
      }
    }

    #endregion
    private T Dequeue()
    {
        try
        {
            T dequeueItem;
            //if (_workerQueue.Count > 0)
            //{
            _workerQueue.TryDequeue(out dequeueItem);
            if (dequeueItem != null)
                return dequeueItem;
            //}
            if (_IsActive)
            {
                _mres.Wait();
                _mres.Reset();
            }
            //_workerQueue.TryDequeue(out dequeueItem);
            return dequeueItem;
        }
        catch (Exception ex)
        {
            OnLogged(ex.Message + ex.StackTrace);
            T dequeueItem;
            //if (_workerQueue.Count > 0)
            //{
            _workerQueue.TryDequeue(out dequeueItem);
            return dequeueItem;
        }

    }


    public void Clear()
    {
        _workerQueue = new ConcurrentQueue<T>();
    }
}


When calling Dispose it sometimes blocks on the Join (with one consuming thread) and the Dispose method gets stuck. I guess it gets stuck on the Wait of the reset event, but that is exactly why I call Set in Dispose. Any suggestions?


Update: I understand your point about needing a queue internally. My suggestion to use a BlockingCollection<T> is based on the fact that your code contains a lot of logic to provide the blocking behavior. Writing such logic yourself is very prone to bugs (I know this from experience); so when there's an existing class within the framework that does at least some of the work for you, it's generally preferable to go with that.

A complete example of how you can implement this class using a BlockingCollection<T> is a little bit too large to include in this answer, so I've posted a working example on pastebin.com; feel free to take a look and see what you think.

I also wrote a small program demonstrating the above example here.

Is my code correct? I wouldn't say yes with too much confidence; after all, I haven't written unit tests, run any diagnostics on it, etc. It's just a basic draft to give you an idea how using BlockingCollection<T> instead of ConcurrentQueue<T> cleans up a lot of your logic (in my opinion) and makes it easier to focus on the main purpose of your class (consuming items from a queue and notifying subscribers) rather than a somewhat difficult aspect of its implementation (the blocking behavior of the internal queue).
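For reference, here is a heavily trimmed-down sketch of the same idea. It is not the pastebin code itself; I've substituted plain Action delegates for your Dequeued<T>/LoggedHandler events so that it stands alone:

using System;
using System.Collections.Concurrent;
using System.Threading;

public class ProducerConsumerQueue<T> : IDisposable
{
    // BlockingCollection<T> over its default ConcurrentQueue<T> supplies the
    // blocking and FIFO behavior, so no reset events or _IsActive flag are needed.
    private readonly BlockingCollection<T> _items = new BlockingCollection<T>();
    private readonly Thread[] _workers;

    public event Action<T> OnDequeued = delegate { };
    public event Action<string> OnLogged = delegate { };

    public ProducerConsumerQueue(int workerCount)
    {
        _workers = new Thread[workerCount];
        for (int i = 0; i < workerCount; i++)
        {
            _workers[i] = new Thread(Work);
            _workers[i].Start();
        }
    }

    public void EnqueueTask(T task)
    {
        // Throws InvalidOperationException if called after Dispose has marked the
        // collection complete; check IsAddingCompleted if you prefer to ignore
        // late producers instead.
        _items.Add(task);
    }

    private void Work()
    {
        // Blocks while the collection is empty and ends cleanly once
        // CompleteAdding has been called and the remaining items are drained.
        foreach (T item in _items.GetConsumingEnumerable())
        {
            try { OnDequeued(item); }
            catch (Exception ex) { OnLogged(ex.Message + ex.StackTrace); }
        }
    }

    public void Dispose()
    {
        _items.CompleteAdding();     // signal "no more items"
        foreach (Thread worker in _workers)
            worker.Join();           // each worker exits once the queue is drained
        _items.Dispose();
    }
}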


Question posed in a comment:

Any reason you're not using BlockingCollection<T>?

Your answer:

[...] i needed a queue.

From the MSDN documentation on the default constructor for the BlockingCollection<T> class:

The default underlying collection is a ConcurrentQueue<T>.

If the only reason you opted to implement your own class instead of using BlockingCollection<T> is that you need a FIFO queue, well then... you might want to rethink your decision. A BlockingCollection<T> instantiated using the default parameterless constructor is a FIFO queue.
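You can check this with a few lines in a console application:

using System;
using System.Collections.Concurrent;

class FifoCheck
{
    static void Main()
    {
        var queue = new BlockingCollection<int>();  // default backing store: ConcurrentQueue<int>
        queue.Add(1);
        queue.Add(2);
        queue.Add(3);

        Console.WriteLine(queue.Take());  // 1
        Console.WriteLine(queue.Take());  // 2
        Console.WriteLine(queue.Take());  // 3 -- strictly first-in, first-out
    }
}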

That said, while I don't think I can offer a comprehensive analysis of the code you've posted, I can at least offer a couple of pointers:

  1. I'd be very hesitant to use events the way you do here for a class that deals with such tricky multithreaded behavior. Calling code can attach any event handlers it wants, and these can in turn throw exceptions (which you don't catch), block for long periods of time, or possibly even deadlock for reasons completely outside your control--which is very bad in the case of a blocking queue (a defensive-invocation sketch follows this list).
  2. There's a race condition in your Dequeue and Dispose methods.
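On the first point, one defensive pattern is to invoke each subscriber individually, so a single faulty handler can neither tear down the worker thread nor skip the remaining subscribers. This is just a sketch against your existing members, assuming Dequeued<T> is a void delegate taking a T; it addresses thrown exceptions, not handlers that block:

private void RaiseDequeued(T item)
{
    // GetInvocationList lets us call each subscriber on its own,
    // so an exception in one handler doesn't abort the others.
    foreach (Dequeued<T> handler in OnDequeued.GetInvocationList())
    {
        try
        {
            handler(item);
        }
        catch (Exception ex)
        {
            OnLogged(ex.Message + ex.StackTrace);
        }
    }
}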

Look at these lines of your Dequeue method:

if (_IsActive) // point A
{
    _mres.Wait(); // point C
    _mres.Reset(); // point D
}

And now take a look at these two lines from Dispose:

_IsActive = false;

_mres.Set(); // point B

Let's say you have three threads, T1, T2, and T3. T1 and T2 are both at point A, where each checks _IsActive and finds true. Then Dispose is called, and T3 sets _IsActive to false (but T1 and T2 have already passed point A) and then reaches point B, where it calls _mres.Set(). Then T1 gets to point C, moves on to point D, and calls _mres.Reset(). Now T2 reaches point C and will be stuck forever since _mres.Set will not be called again (any thread executing Enqueue will find _IsActive == false and return immediately, and the thread executing Dispose has already passed point B).
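Laid out as a timeline (time flows downward), that interleaving looks like this:

T1 (consumer)               T2 (consumer)               T3 (Dispose)
-------------               -------------               ------------
A: _IsActive == true
                            A: _IsActive == true
                                                        _IsActive = false
                                                        B: _mres.Set()
C: _mres.Wait() returns
D: _mres.Reset()
                            C: _mres.Wait() blocks forever
                                                        _workers[i].Join() on T2
                                                        never returns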

I'd be happy to try and offer some help on solving this race condition, but I'm skeptical that BlockingCollection<T> isn't in fact exactly the class you need for this. If you can provide some more information to convince me that this isn't the case, maybe I'll take another look.


Since _IsActive isn't marked volatile and there's no lock around its accesses, each thread may keep working with a cached copy of the value, and that copy may never be refreshed. So setting _IsActive to false in Dispose may never be observed by the running worker threads.

http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/

private volatile bool _IsActive=true;
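The classic demonstration of this effect (the article linked above walks through it) is a loop on a non-volatile flag; the names below are only for illustration, and in a release build the worker may spin forever:

using System;
using System.Threading;

class StaleReadDemo
{
    // Without 'volatile' the JIT may cache this flag in a register,
    // so the worker's loop never observes the write from Main.
    private static bool _stop;            // fix: private static volatile bool _stop;

    static void Main()
    {
        var worker = new Thread(() =>
        {
            while (!_stop) { }            // may spin forever in a release build
            Console.WriteLine("worker stopped");
        });
        worker.Start();

        Thread.Sleep(1000);
        _stop = true;                     // the write the worker may never see
        worker.Join();                    // can hang here without volatile
    }
}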