
.NET Concurrency Question: Can I yield a semaphore to another thread?

I have multiple threads that share use of a semaphore. Thread A holds the semaphore (using lock) and threads B and C are waiting on that same semaphore (also using lock). The threads share global variables, etc.

Is there a technique in C# that I can use to shut down thread B? I can set a flag in A and have thread B check that flag and exit as soon as it gets control of the semaphore, but I don't know of any technique to allow thread A to yield the semaphore to thread B (and get it back when thread B exits) without the risk of thread C seizing control.

Anyone have any suggestions how to address this design problem? I can rewrite the program as necessary if I am approaching this incorrectly.

[Edit] A commenter has pointed out that I am using the wrong terminology. The commenter is correct - I am using a critical section, but since everything runs in a single process, a critical section is functionally equivalent here to the more general 'semaphore'.
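To be concrete about the equivalence I mean, here is a rough illustration (not my actual code; the Semaphore here is just a stand-in): within one process, a lock block and a binary semaphore both give plain mutual exclusion.

using System.Threading;

// Rough illustration only. Both methods guard the same kind of critical section.
private static readonly object lockObject = new object();
private static readonly Semaphore binarySemaphore = new Semaphore(1, 1);

private void UsingLock()
{
    lock (lockObject)
    {
        // ... critical section ...
    }
}

private void UsingSemaphore()
{
    binarySemaphore.WaitOne();
    try
    {
        // ... critical section ...
    }
    finally
    {
        binarySemaphore.Release();
    }
}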

[Edit] Someone asked for more details, so here they are.

There can be multiple threads executing Code A. There's only ever one thread executing Code B.

Code A:

private static Thread workerThread = null;

lock (lockObject)
{
    ... do some work ...

    if (...condition...)
    {
        if (workerThread != null)
        {
            // Kill the worker thread and continue only after it is dead.
            quitWorkerThread = true;
            // Wait for the thread to die.
            while (workerThread.IsAlive)
            {
                Thread.Sleep(50);
            }
            workerThread = null;
            quitWorkerThread = false;
        } // if (workerThread != null)
    } // if (...condition...)

    ... do some more work ...

    if (...condition...)
    {
        if (workerThread == null)
        {
            // Start the worker thread.
            workerThread = new Thread(WorkerThread);
            workerThread.Start();
        } // if (workerThread == null)
    } // if (...condition...)

    ... do even more work ...

} // lock (lockObject)

Code B:

private void WorkerThread()
{
    while (true)
    {
        if (quitWorkerThread)
        {
            return;
        }

        Thread.Sleep (2000);

        if (quitWorkerThread)
        {
            return;
        }

        lock(lockObject)
        {
            if (quitWorkerThread)
            {
                return;
            }
            ... do some work ...
        } // lock(lockObject)
    } // while (true)
} // WorkerThread

I suspect that a variant of Aaron's solution will be what I use. I was mostly hoping a somewhat more elegant solution was available, but I suspect that like everything else about this project, it's all brute force and corner cases :-(.


I'm fairly certain that there's no way to yield control to a specific thread, which seems to be what you're trying to do. You can only yield, period - it's up to the Windows scheduler to decide which thread gets to run next.

The situation is that you have three threads, A, B, and C. A has the lock, B and C are waiting for it, and you want a way to guarantee that B gets to execute next.

The obvious solution is to use more than one lock and/or sync primitive. You can combine the semantics of lock with an event (the example below uses an AutoResetEvent to gate thread C). Make thread C wait for both the event and the critical section, but thread B only has to wait for the critical section. Under normal circumstances, you signal the event just before releasing the lock, which leaves it up to the OS to decide which thread to execute. In the special case, you don't signal the event at all, leaving thread B to execute while C is still blocked.

Once B is done, you signal the event to let C finish.


An (untested) example would be:

// Main thread is Thread A
object myLock = new Object();
AutoResetEvent myEvent = new AutoResetEvent(false);
ManualResetEvent completedEvent = new ManualResetEvent(false);

ThreadPool.QueueUserWorkItem(s =>
{
    for (int i = 0; i < 10000; i++)
    {
        lock (myLock)
        {
            // Do some work
        }
    }
    completedEvent.Set();
});  // Thread B

ThreadPool.QueueUserWorkItem(s =>
{
    for (int i = 0; i < 10000; i++)
    {
        myEvent.WaitOne();
        lock (myLock)
        {
            // Do some work
        }
    }
});  // Thread C

// Main loop for thread A
while (true)
{
    lock (myLock)
    {
        // Do some work
        if (SomeSpecialCondition)
            break;
        else
            myEvent.Set();
    }
}

completedEvent.WaitOne(); // Wait for B to finish processing
if (SomeSpecialCondition) // If we terminated without signaling C...
    myEvent.Set();        // Now allow thread C to clean up

This essentially puts Thread A in charge of when Thread C gets to execute. Threads A and B will compete normally but it's up to Thread A to signal the event for Thread C.


Take a look at using Monitor and Monitor.Pulse. This will not get you exactly what you want in terms of yielding to a specific thread, but it will allow you to transfer control between threads and critical sections.
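For example, a bare-bones (untested) sketch of handing control over with Monitor.Wait/Monitor.Pulse might look like this; the names are purely illustrative:

using System.Threading;

private static readonly object gate = new object();
private static bool handOff = false;

// The thread giving up control: finish the work, set the flag, pulse a waiter.
private void GiveControl()
{
    lock (gate)
    {
        // ... do work ...
        handOff = true;
        Monitor.Pulse(gate);   // wake one waiting thread; it runs once we release the lock
    }
}

// The thread taking control: wait inside the lock until pulsed and the condition holds.
private void TakeControl()
{
    lock (gate)
    {
        while (!handOff)
        {
            Monitor.Wait(gate);   // releases the lock while waiting, reacquires it on wake
        }
        handOff = false;
        // ... do work ...
    }
}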

It's not entirely clear to me what problem you are trying to solve, but you may also be able to solve it using ReaderWriterLockSlim.
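For example, a rough (untested) sketch with ReaderWriterLockSlim, assuming one kind of thread only reads shared state while another kind modifies it:

using System.Threading;

private static readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

private void ReadSharedState()
{
    rwLock.EnterReadLock();          // many readers may hold this at once
    try
    {
        // ... read shared state ...
    }
    finally
    {
        rwLock.ExitReadLock();
    }
}

private void WriteSharedState()
{
    rwLock.EnterWriteLock();         // waits for readers to finish, then runs exclusively
    try
    {
        // ... modify shared state ...
    }
    finally
    {
        rwLock.ExitWriteLock();
    }
}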

Lastly, your scenario sounds to me like an appropriate place to use .NET events instead.


(Disclaimer: If your only use case is that specific one, @Aaronaught's solution of simply using ManualResetEvents is probably the easiest.)

Edited for additional disclaimer: If you want to extend this concept, be very, very wary of deadlocks.

If there are situations where you might want C to do work regardless of whether B has done stuff, but always want B to go first if C is waiting, here's one simple solution:

    object lockObject = new object();
    int WaitCounter = 0;

    void B()
    {
        // Announce that a B thread wants the lock before actually taking it.
        System.Threading.Interlocked.Increment(ref WaitCounter);

        try
        {
            lock (lockObject)
            {
                // ... do B's work ...
            }
        }
        finally
        {
            System.Threading.Interlocked.Decrement(ref WaitCounter);
        }
    }

    void C()
    {
        while (true)
        {
            // always attempt to yield to other threads first
            System.Threading.Thread.Sleep(0);
            lock (lockObject)
            {
                // If any B thread is waiting, back off: release the lock and retry.
                if (WaitCounter > 0)
                    continue;

                // ...

                return;
            }
        }
    }

A bunch of extra code, but no one's ever claimed concurrency is easy. :)


Aaronaught's solution looks sound, but I think a simpler one would be to use a little more shared state and just a single lock.

Basically you pass control between the threads, and each thread decides whether it's time for it to work or not. If it's not, it simply calls PulseAll (moving all waiting threads into the ready queue) and then Waits to be pulsed and reacquire the lock. At some point one of the other threads sets the flag that tells ThreadC it's good to go.

public class ThreadsGettinCrazy {
  static readonly object _sync = new object();
  static bool _threadCReady = false;

  public void ThreadA() {
    while (true) {
      lock(_sync) {
        while(/* my condition not met */) {
          Monitor.PulseAll(_sync);
          Monitor.Wait(_sync);
        }
        // do work, possibly set _threadCReady = true
        Monitor.PulseAll(_sync);
      }
      if (/* i'm done */) break;
    }
  }

  public void ThreadB() {
    while (true) {
      lock(_sync) {
        while(/* my condition not met */) {
          Monitor.PulseAll(_sync);
          Monitor.Wait(_sync);
        }
        // do work, possibly set _threadCReady = true
        Monitor.PulseAll(_sync);
      }
      if (/* i'm done */) break;
    }
  }

  public void ThreadC() {
    lock(_sync) {
      while (!_threadCReady) {
        Monitor.PulseAll(_sync);
        Monitor.Wait(_sync);
      }
      // do work
    }
  }
}