
Limiting the number of threadpool threads

I am using the ThreadPool in my application. I first set the limit of the thread pool using the following:

ThreadPool.SetMaxThreads(m_iThreadPoolLimit,m_iThreadPoolLimit);
m_Events = new ManualResetEvent(false);

and then queued up the jobs using the following:

WaitCallback objWcb = new WaitCallback(abc);
ThreadPool.QueueUserWorkItem(objWcb, m_objThreadData); 

Here abc is the name of the function that I am calling. After this I do the following so that all my threads come to one point and the main thread takes over and continues:

m_Events.WaitOne();

My thread limit is 3. The problem I am facing is that, in spite of the thread pool limit being set to 3, my application is processing more than 3 files at the same time, whereas it is supposed to process only 3 files at a time. Please help me solve this issue.


What kind of computer are you using?

From MSDN

You cannot set the number of worker threads or the number of I/O completion threads to a number smaller than the number of processors in the computer.

If you have 4 cores, then the smallest you can have is 4.

Also note:

If the common language runtime is hosted, for example by Internet Information Services (IIS) or SQL Server, the host can limit or prevent changes to the thread pool size.

If this is a web site hosted by IIS then you cannot change the thread pool size either.
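
You can check what actually happened on your machine; this is just a quick diagnostic sketch, not part of your code. SetMaxThreads returns false when the requested limit is rejected (for example, when it is below Environment.ProcessorCount), and GetMaxThreads shows the limit that is actually in effect:

int requested = 3; // the limit from the question
bool accepted = ThreadPool.SetMaxThreads(requested, requested);

int maxWorker, maxIo;
ThreadPool.GetMaxThreads(out maxWorker, out maxIo);

Console.WriteLine("Processors: " + Environment.ProcessorCount);
Console.WriteLine("SetMaxThreads accepted: " + accepted);
Console.WriteLine("Effective max worker threads: " + maxWorker);

If accepted comes back false, the pool is still running with its default (much higher) limit, which would explain more than 3 files being processed at once.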


A better solution involves the use of a Semaphore which can throttle the concurrent access to a resource.¹ In your case the resource would simply be a block of code that processes work items.

var finished = new CountdownEvent(1); // Used to wait for the completion of all work items.
var throttle = new Semaphore(3, 3); // Used to throttle the processing of work items.
foreach (WorkItem item in workitems)
{
  finished.AddCount();
  WorkItem capture = item; // Needed to safely capture the loop variable.
  ThreadPool.QueueUserWorkItem(
    (state) =>
    {
      throttle.WaitOne();
      try
      {
        ProcessWorkItem(capture);
      }
      finally
      {
        throttle.Release();
        finished.Signal();
      }
    }, null);
}
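// Signal once to remove the CountdownEvent's initial count of 1,
// then block until every queued work item has signaled.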
finished.Signal();
finished.Wait();

In the code above WorkItem is a hypothetical class that encapsulates the specific parameters needed to process your tasks.
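
If it helps to see the shape, a minimal placeholder could look like the following; the Path property and the body of ProcessWorkItem are just assumptions for illustration.

class WorkItem
{
  public string Path { get; set; } // placeholder for whatever data one task needs
}

void ProcessWorkItem(WorkItem item)
{
  // placeholder: process one file here
  Console.WriteLine("Processing " + item.Path);
}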

The Task Parallel Library makes this pattern a lot easier. Just use the Parallel.ForEach method and specify a ParallelOptions.MaxDegreeOfParallelism value that throttles the concurrency.

var options = new ParallelOptions();
options.MaxDegreeOfParallelism = 3;
Parallel.ForEach(workitems, options,
  (item) =>
  {
    ProcessWorkItem(item);
  });

¹ I should point out that I do not like blocking ThreadPool threads using a Semaphore or any blocking device. It basically wastes the threads. You might want to rethink your design entirely.


You should use a Semaphore object to limit concurrent threads.
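
A minimal sketch of that idea, using SemaphoreSlim (the lighter-weight managed variant) and a hypothetical DoWork method:

var throttle = new SemaphoreSlim(3, 3); // at most 3 callbacks inside DoWork at once

ThreadPool.QueueUserWorkItem(_ =>
{
  throttle.Wait();
  try
  {
    DoWork(); // hypothetical: process one file
  }
  finally
  {
    throttle.Release();
  }
});

As with the Semaphore version above, this limits concurrency but still ties up pool threads while they wait.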


You say the files are open: are they actually being actively processed, or just left open? If you're leaving them open: been there, done that! Relying on connections and resources (it was a DB connection in my case) to close at the end of scope should work, but it can take a while for the dispose / garbage collection to kick in.
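
If deterministic cleanup is the issue, a using block (sketched here with a file stream, since the question is about files; path is a placeholder) releases the handle at the end of the block instead of waiting for the garbage collector:

using (var stream = File.OpenRead(path))
{
  // ... read and process the file ...
} // Dispose runs here, releasing the file handle immediately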

