Why does increasing timer resolution via timeBeginPeriod impact power consumption?
I am currently writing an application in C# where I need to fire a timer approx. every 5 milliseconds. From some research it appears the best way to do this involves p/invoking timeBeginPeriod(...) to change the resolution of the system timer. It works well enough in my sample code.
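Roughly, my sample code boils down to something like the sketch below (simplified; the 5 ms System.Threading.Timer is just my test code, and the winmm.dll P/Invoke declarations are the relevant part):

    using System;
    using System.Runtime.InteropServices;
    using System.Threading;

    class HighResTimerSample
    {
        // winmm.dll exports: request / release a system timer resolution in milliseconds.
        [DllImport("winmm.dll", EntryPoint = "timeBeginPeriod")]
        static extern uint TimeBeginPeriod(uint uPeriod);

        [DllImport("winmm.dll", EntryPoint = "timeEndPeriod")]
        static extern uint TimeEndPeriod(uint uPeriod);

        static void Main()
        {
            TimeBeginPeriod(1);              // raise the system timer resolution to 1 ms
            try
            {
                using (var timer = new System.Threading.Timer(
                    _ => Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff")),
                    null, 0, 5))             // fire roughly every 5 ms
                {
                    Thread.Sleep(1000);      // let it run for a second
                }
            }
            finally
            {
                TimeEndPeriod(1);            // always restore the previous resolution
            }
        }
    }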
I found an interesting warning about using this function on Larry Osterman's MSDN Blog in this entry:
Adam: calling timeBeginPeriod increases the accuracy of GetTickCount as well.
using timeBeginPeriod is a hideously bad idea in general - we've been actively removing all of the uses of it in Windows because of the power consumption consequences associated with using it.
There are better ways of ensuring that your thread runs in a timely fashion.
Does anyone know exactly why this occurs, or what those "better ways" (which are unspecified in the thread) might be? How much extra power draw are we talking about?
Because it causes the CPU to wake up more often, so it spends less time in its low-power idle states and overall CPU usage goes up. A good explanation is in Timers, Timer Resolution, and Development of Efficient Code.
Changing the system timer resolution does impact power usage, mainly because a lot of developers do not understand Windows timers: you see plenty of code with sleep or timer values of less than 15 ms. It also changes the behaviour of system tasks, which can result in more power usage.
Once the system timer is changed to 1 ms, all of that code which was only waking up every 15 ms suddenly starts waking up much more often, and CPU usage goes up.
However, from the user's perspective, the programs that have misused the timers can become more responsive (even the OS itself, in the case of WinXP), so there is a trade-off.
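You can see this wake-up behaviour directly with a quick test. Here is a rough, unofficial C# sketch (the names are mine) that times Thread.Sleep(1) before and after calling timeBeginPeriod(1): at the default ~15.6 ms resolution each Sleep(1) takes around 15 ms, and after the call it drops to roughly 1-2 ms, which is exactly the extra wake-up traffic that costs power:

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;
    using System.Threading;

    class SleepResolutionDemo
    {
        [DllImport("winmm.dll")] static extern uint timeBeginPeriod(uint uPeriod);
        [DllImport("winmm.dll")] static extern uint timeEndPeriod(uint uPeriod);

        // Times 100 calls to Thread.Sleep(1) and reports the average.
        static void Measure(string label)
        {
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < 100; i++)
                Thread.Sleep(1);
            sw.Stop();
            Console.WriteLine("{0}: {1:F1} ms average per Sleep(1)",
                              label, sw.Elapsed.TotalMilliseconds / 100);
        }

        static void Main()
        {
            Measure("Default resolution");

            timeBeginPeriod(1);          // request 1 ms system timer resolution
            try
            {
                Measure("After timeBeginPeriod(1)");
            }
            finally
            {
                timeEndPeriod(1);        // restore the previous resolution
            }
        }
    }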
I also have a small program that changes the system timer resolution so you can experiment and test the power usage for yourself. There are also a number of links and some more background at http://www.lucashale.com/timer-resolution/