C# put thread to sleep on dequeue?
I'm trying to use WebClient to download a bunch of files asynchronously. From my understanding, this is possible, but you need to have one WebClient
object for each download. So I figured I'd just throw a bunch of them in a queue at the start of my program, then pop them off one at a time and tell them to download a file. When the file is done downloading, they can get pushed back onto the queue.
Pushing stuff onto my queue shouldn't be too bad; I just have to do something like:
lock (queue) {
    queue.Enqueue(webClient);
}
Right? But what about popping them off? I want my main thread to sleep when the queue is empty (wait until another web client is ready so it can start the next download). I suppose I could use a Semaphore
alongside the queue to keep track of how many elements are in the queue, and that would put my thread to sleep when necessary, but it doesn't seem like a very good solution. What happens if I forget to decrement/increment my Semaphore every time I push/pop something on/off my queue and they get out of sync? That would be bad. Isn't there some nice way to have queue.Dequeue()
automatically sleep until there is an item to dequeue, and then proceed?
I'd also welcome solutions that don't involve a queue at all. I just figured a queue would be the easiest way to keep track of which WebClients are ready for use.
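Roughly, this is the flow I have in mind (just a sketch - it assumes some BlockingQueue<T> whose Dequeue blocks when empty, like the ones suggested in the answers below, and urls is a placeholder list of download URLs):

var pool = new BlockingQueue<WebClient>();
for (int i = 0; i < 4; i++)
{
    var client = new WebClient();
    // When a download finishes, put the client back in the pool for reuse.
    client.DownloadFileCompleted += (s, e) => pool.Enqueue((WebClient)s);
    pool.Enqueue(client);
}

foreach (string url in urls)
{
    // Blocks (sleeps) until a WebClient is free.
    WebClient client = pool.Dequeue();
    client.DownloadFileAsync(new Uri(url), Path.GetFileName(new Uri(url).LocalPath));
}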
Here's an example using a Semaphore. IMO it is a lot cleaner than using a Monitor:
using System;
using System.Collections.Generic;
using System.Threading;

public class BlockingQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    // The semaphore count always mirrors the number of items in the queue.
    private readonly Semaphore _sem = new Semaphore(0, Int32.MaxValue);

    public void Enqueue(T item)
    {
        lock (_queue)
        {
            _queue.Enqueue(item);
        }
        // Release after the item is in the queue so a waiting Dequeue can proceed.
        _sem.Release();
    }

    public T Dequeue()
    {
        // Blocks until Enqueue has released at least one count.
        _sem.WaitOne();
        lock (_queue)
        {
            return _queue.Dequeue();
        }
    }
}
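A minimal usage sketch (the item type and the worker body are just placeholders):

var queue = new BlockingQueue<string>();

// Consumer thread: blocks inside Dequeue whenever the queue is empty.
var consumer = new Thread(() =>
{
    while (true)
        Console.WriteLine("Processing " + queue.Dequeue());
});
consumer.IsBackground = true;
consumer.Start();

// Producer: each Enqueue wakes one waiting consumer.
queue.Enqueue("a");
queue.Enqueue("b");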
What you want is a producer/consumer queue.
I have a simple example of this in my threading tutorial - scroll about half way down that page. It was written pre-generics, but it should be easy enough to update. There are various features you may need to add, such as the ability to "stop" the queue: this is often done with a sort of "null work item" token; you inject as many "stop" items into the queue as you have dequeuing threads, and each thread stops dequeuing when it hits one (there's a sketch of that pattern below).
Searching for "producer consumer queue" may well provide you with better code samples - this was really just to demonstrate waiting/pulsing.
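As a rough sketch of the "stop token" idea (this assumes a BlockingQueue<T> like the Semaphore-based one shown earlier on this page, which happens to accept null items; the worker body is a placeholder):

var queue = new BlockingQueue<string>();
int workerCount = 2;

var workers = new List<Thread>();
for (int i = 0; i < workerCount; i++)
{
    var worker = new Thread(() =>
    {
        while (true)
        {
            string item = queue.Dequeue();
            if (item == null)    // hit a stop token: this worker is done
                break;
            Console.WriteLine("Handled " + item);
        }
    });
    worker.Start();
    workers.Add(worker);
}

queue.Enqueue("job 1");
queue.Enqueue("job 2");

// Shut down: inject one stop token per worker, then wait for them all to exit.
for (int i = 0; i < workerCount; i++)
    queue.Enqueue(null);
workers.ForEach(w => w.Join());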
IIRC, there are types in .NET 4.0 (as part of Parallel Extensions) which will do the same thing but much better :) I think you want a BlockingCollection wrapping a ConcurrentQueue.
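For what it's worth, a rough sketch of how that looks with the .NET 4.0 types (BlockingCollection<T> uses a ConcurrentQueue<T> by default, but you can pass one explicitly; CompleteAdding plays the role of the manual "stop token"):

using System;
using System.Collections.Concurrent;
using System.Threading;

class Example
{
    static void Main()
    {
        var queue = new BlockingCollection<string>(new ConcurrentQueue<string>());

        // Consumer: GetConsumingEnumerable blocks when the collection is empty
        // and ends once CompleteAdding has been called and the collection is drained.
        var consumer = new Thread(() =>
        {
            foreach (string item in queue.GetConsumingEnumerable())
                Console.WriteLine("Got " + item);
        });
        consumer.Start();

        queue.Add("first");
        queue.Add("second");

        queue.CompleteAdding();   // no more items are coming
        consumer.Join();
    }
}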
I use a BlockingQueue to deal with exactly this type of situation. You can call .Dequeue when the queue is empty, and the calling thread will simply wait until there is something to Dequeue.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Threading;

public class BlockingQueue<T> : IEnumerable<T>
{
    private int _count = 0;
    private Queue<T> _queue = new Queue<T>();

    public T Dequeue()
    {
        lock (_queue)
        {
            // Monitor.Wait releases the lock while waiting and reacquires it when pulsed.
            while (_count <= 0)
                Monitor.Wait(_queue);
            _count--;
            return _queue.Dequeue();
        }
    }

    public void Enqueue(T data)
    {
        if (data == null)
            throw new ArgumentNullException("data");
        lock (_queue)
        {
            _queue.Enqueue(data);
            _count++;
            // Wake one thread that is blocked in Dequeue, if any.
            Monitor.Pulse(_queue);
        }
    }

    // Enumerating the queue consumes it: each MoveNext blocks until an item is available.
    IEnumerator<T> IEnumerable<T>.GetEnumerator()
    {
        while (true)
            yield return Dequeue();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return ((IEnumerable<T>)this).GetEnumerator();
    }
}
Just use this in place of a normal Queue and it should do what you need.
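For example, because it implements IEnumerable<T>, a consumer thread can simply foreach over the queue; each iteration blocks until an item arrives (a sketch; the work-item type is just an example):

var work = new BlockingQueue<Action>();

// Worker thread: the foreach blocks inside Dequeue whenever the queue is empty.
var worker = new Thread(() =>
{
    foreach (Action job in work)
        job();
});
worker.IsBackground = true;
worker.Start();

work.Enqueue(() => Console.WriteLine("first job"));
work.Enqueue(() => Console.WriteLine("second job"));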