Can I improve the resolution of Thread.Sleep?
Thread.Sleep() resolution varies from 1 to 15.6ms
Given this console app:
using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    static void Main()
    {
        int outer = 100;
        int inner = 100;
        Stopwatch sw = new Stopwatch();
        for (int j = 0; j < outer; j++)
        {
            sw.Restart();
            for (int i = 0; i < inner; i++)
                Thread.Sleep(1);
            sw.Stop();
            Console.WriteLine(sw.ElapsedMilliseconds);
        }
    }
}
I expected the output to be 100 numbers close to 100. Instead, I get something like this:
99 99 99 100 99 99 99 106 106 99 99 99 100 100 99 99 99 99 101 99 99 99 99 99 101 99 99 99 99 101 99 99 99 100 99 99 99 99 99 103 99 99 99 99 100 99 99 99 99 813 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1560 1559 1559 1559 1559 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1559 1558 1558 1558 1558 1558
But sometimes I won't get any accurate results; it will be ~1559 every time.
Why isn't it consistent?
Some googling taught me that 15.6ms is the length of a timeslice, so that explains the ~1559 results: 100 sleeps of one timeslice each is roughly 100 × 15.6ms ≈ 1560ms. But why is it that I sometimes get the right result, and other times I just get a multiple of 15.6? (e.g. Thread.Sleep(20) will usually give ~31.2ms)
How is it affected by hardware or software?
I ask this because of what led me to discover it:
I had been developing my application on a 32-bit dual-core machine. Today my machine was upgraded to a 64-bit quad-core with a complete OS re-install. (Windows 7 and .NET 4 in both cases; however, I can't be sure the older machine had W7 SP1, while the new one does.)
Upon running my application on the new machine, I immediately noticed my forms take longer to fade out. I have a custom method to fade my forms which uses Thread.Sleep() with values varying from 10 to 50. On the old system this seemed to work perfectly every time. On the new system it's taking much longer to fade than it should.
Why did this behavior change between my old system and my new system? Is this related to hardware, or software?
Can I make it consistently accurate? (~1ms resolution)
Is there something I can do in my program to make Thread.Sleep() reliably accurate to about 1ms? Or even 10ms?
The answer is not to use Thread.Sleep and instead to use a high-resolution timer. You'll need to do your fade in a busy loop, but it sounds like that would be no problem. You simply cannot expect high resolution from Thread.Sleep; it is notorious for behaving differently on different hardware.
You can use the Stopwatch class in .NET, which uses high-resolution performance counters if they are supported on the hardware.
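For illustration, a minimal busy-wait sketch built on Stopwatch might look like the following (the PreciseDelay name is mine, not part of this answer):

using System.Diagnostics;

static class PreciseDelay
{
    // Busy-waits for roughly the requested number of milliseconds using the
    // high-resolution Stopwatch counter. Burns CPU while waiting, so it is
    // only suitable for short intervals such as a single fade step.
    public static void Wait(double milliseconds)
    {
        // Stopwatch.Frequency is in ticks per second; Stopwatch.IsHighResolution
        // tells you whether a performance counter is actually backing it.
        long target = (long)(milliseconds * Stopwatch.Frequency / 1000.0);
        var sw = Stopwatch.StartNew();
        while (sw.ElapsedTicks < target)
        {
            // spin
        }
    }
}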
“The Sleep function suspends the execution of the current thread for at least the specified interval.”
-> http://social.msdn.microsoft.com/Forums/en/clr/thread/facc2b57-9a27-4049-bb32-ef093fbf4c29
I can answer one of my questions: Can I make it consistently accurate? (~1ms resolution)
Yes, it seems that I can, using timeBeginPeriod() and timeEndPeriod().
I've tested this and it works.
Some things I've read suggest that calling timeBeginPeriod(1) for the duration of the application is a bad idea. However, calling it at the start of a short method and then clearing it with timeEndPeriod() at the end of the method should be okay.
Nevertheless I will also investigate using timers.
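For reference, a minimal P/Invoke sketch of that pattern (the wrapper class and method names here are only illustrative):

using System;
using System.Runtime.InteropServices;

static class TimerResolution
{
    // timeBeginPeriod/timeEndPeriod live in winmm.dll; a period of 1
    // requests ~1ms timer resolution system-wide until it is cleared.
    [DllImport("winmm.dll")]
    private static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    private static extern uint timeEndPeriod(uint uMilliseconds);

    // Raise the resolution only for the duration of a short method,
    // and always clear it again, even if the action throws.
    public static void RunWithMillisecondResolution(Action action)
    {
        timeBeginPeriod(1);
        try { action(); }
        finally { timeEndPeriod(1); }
    }
}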
You have a program running on your machine that is calling timeBeginPeriod() and timeEndPeriod(). Typically a media-related program that uses timeSetEvent() to set a one-millisecond timer. It affects the resolution of Sleep() as well.
You could P/Invoke these functions yourself to get consistent behavior. Not terribly reasonable for UI effects though; it is rather unfriendly to battery life on a laptop.
Sleeping for 20 msec and actually getting 2/64 seconds is otherwise logical; the CPU simply won't wake up soon enough to notice that 20 msec have passed. You only get multiples of 1/64 seconds. So a reasonable choice for a Timer that implements fading effects is 15 msec, giving you 64 fps animation in the worst case, assuming your effect draws fast enough. You'll be a bit off if timeBeginPeriod was called, but not by much. Calculating the animation stage from the clock works too, but is a bit overkill in my book.
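As a rough sketch of that suggestion (the FadeTimerExample class and FadeOut helper are hypothetical, not code from the question), a System.Windows.Forms.Timer with a 15 msec interval steps the opacity once per tick:

using System;
using System.Windows.Forms;

static class FadeTimerExample
{
    // Fades a form out with a fixed-interval UI timer: one opacity step
    // every ~15 ms, i.e. roughly one step per 1/64 s clock tick.
    public static void FadeOut(Form form, int durationMs)
    {
        var timer = new Timer { Interval = 15 };
        double step = 15.0 / durationMs;
        timer.Tick += (s, e) =>
        {
            form.Opacity -= step;        // Opacity is clamped to [0, 1]
            if (form.Opacity <= 0)
            {
                timer.Stop();
                timer.Dispose();
                form.Hide();
            }
        };
        timer.Start();
    }
}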
Your approach is wrong. You will never get sleep to be accurate, and the more busy your machine is, the more wrong your sleep loop will become.
What you should be doing is looking at how much time has passed and adapting accordingly. Although spinning around Sleep is a bad idea, "better" ways of doing it will be significantly more complex. I'll keep my suggested solution simple.
- Call DateTime.Now.Ticks and save it in a variable (startTick).
- In your loop, call Sleep as you are already doing.
- Call DateTime.Now.Ticks again and subtract startTick from it - this value will be how many 100 nanosecond units of time have passed since you started the fade (timeSinceStart).
- Using timeSinceStart calculate how much you should be faded with that much time elapsed.
- Repeat until you have completely faded it.
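Put together, a minimal sketch of those steps could look like this (ElapsedTimeFadeExample, FadeOut, and their parameters are illustrative names, not code from the question):

using System;
using System.Threading;
using System.Windows.Forms;

static class ElapsedTimeFadeExample
{
    // Computes the fade stage from elapsed wall-clock time, so an
    // oversleeping Thread.Sleep only costs smoothness, not total duration.
    public static void FadeOut(Form form, int durationMs)
    {
        long startTick = DateTime.Now.Ticks;                          // step 1
        while (true)
        {
            Thread.Sleep(10);                                         // step 2
            long timeSinceStart = DateTime.Now.Ticks - startTick;     // step 3, in 100 ns units
            double elapsedMs = timeSinceStart / 10000.0;              // 10,000 ticks per millisecond
            double progress = Math.Min(elapsedMs / durationMs, 1.0);  // step 4
            form.Opacity = 1.0 - progress;
            if (progress >= 1.0)                                      // step 5
                break;
        }
    }
}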