
Why is my Stopwatch.Frequency so low?

  Debug.WriteLine("Timer is high-resolution: {0}", Stopwatch.IsHighResolution);
  Debug.WriteLine("Timer frequency: {0}", Stopwatch.Frequency);

Result:

  Timer is high-resolution: True
  Timer frequency: 2597705

This article (from 2005!) mentions a Frequency of 3,579,545, roughly a million more than mine. This blog post mentions a Frequency of 3,325,040,000, which is insane.

Why is my Frequency comparatively so much lower? I'm on an i7 920 machine, so shouldn't it be faster?


3,579,545 is the magic number. That's the frequency in Hertz before it is divided by 3 and fed into the 8253 timer chip in the original IBM PC. The odd-looking number wasn't chosen by accident: it is the frequency of the color burst signal in the NTSC TV system used in the US and Japan. The IBM engineers were looking for a cheap crystal to implement the oscillator, and nothing was cheaper than the one used in every TV set.
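Just to make the arithmetic concrete, here is that division as a sketch, using only the numbers from this answer:

  using System;

  // NTSC color burst frequency, the "magic number" above
  const long ColorBurstHz = 3579545;
  // Divided by 3 before being fed to the timer chip
  Console.WriteLine("Timer chip input: {0} Hz", ColorBurstHz / 3); // ≈ 1,193,181 Hz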

Once IBM clones became widely available, it was still important for their designers to choose the same frequency. A lot of MS-DOS software relied on the timer ticking at that rate. Directly addressing the chip was a common crime.

That changed once Windows came around. A version of Windows 2 was the first to virtualize the timer chip: software was no longer allowed to address the timer chip directly. The processor was configured to run in protected mode, the attempt to use the I/O instruction was intercepted, and kernel code ran instead, allowing the return value of the instruction to be faked. It was now possible for multiple programs to use the timer without stepping on each other's toes, an important first step toward breaking the dependency on how the hardware is actually implemented.

The Win32 API (Windows NT 3.1 and Windows 95) formalized access to the timer with a pair of functions, QueryPerformanceCounter() and QueryPerformanceFrequency(). A kernel-level component, the Hardware Abstraction Layer, allows the BIOS to pass that frequency along. Now it was possible for hardware designers to really drop the dependency on the exact frequency. That took a long time, by the way; around 2000 the vast majority of machines still used the legacy rate.
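If you want to see the raw values Stopwatch builds on, here's a minimal sketch calling those two functions directly through P/Invoke (Windows only, error handling omitted):

  using System;
  using System.Runtime.InteropServices;

  class QpcDemo
  {
      // The same native calls that Stopwatch wraps on Windows.
      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceFrequency(out long frequency);

      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceCounter(out long count);

      static void Main()
      {
          long frequency, count;
          QueryPerformanceFrequency(out frequency);
          QueryPerformanceCounter(out count);
          Console.WriteLine("QueryPerformanceFrequency: {0}", frequency);
          Console.WriteLine("QueryPerformanceCounter:   {0}", count);
      }
  }

The first number should match Stopwatch.Frequency, since Stopwatch wraps these same calls on Windows.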

But the never-ending quest to cut costs in PC design put an end to that. Nowadays the hardware designer just picks whatever frequency happens to be readily available in the chipset. 3,325,040,000 would be such a number; it is most probably the CPU clock rate. High frequencies like that are common in cheap designs, especially ones with an AMD core. Your number is pretty unusual; good odds that your machine wasn't cheap, and that the timer is a lot more accurate than one derived from the CPU clock, since CPU clocks have typical electronic-component tolerances.


The frequency depends on the HAL (Hardware Abstraction Layer). Back in the Pentium days, it was common to use the CPU's timestamp counter (which ticks at the CPU clock rate), so you ended up with really high-frequency timers.

With multi-processor and multi-core machines, and especially with variable-rate CPUs (the CPU clock slows down in low-power states), using the CPU tick as the timer becomes difficult and error-prone, so the writers of the HAL seem to have chosen a slower but more reliable hardware clock, like the real-time clock.
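A quick way to see which path the HAL gave you (a sketch; per the Stopwatch documentation, when IsHighResolution is false the class falls back to the system timer, i.e. DateTime ticks at a nominal 10,000,000 per second):

  using System;
  using System.Diagnostics;

  if (Stopwatch.IsHighResolution)
      Console.WriteLine("High-resolution counter: {0} ticks/s", Stopwatch.Frequency);
  else
      Console.WriteLine("System timer fallback: {0} ticks/s", TimeSpan.TicksPerSecond);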


The Stopwatch.Frequency value is in ticks per second, so your frequency of 2,597,705 means you get more than 2.5 million ticks per second. Exactly how much precision do you need?
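For perspective, inverting the frequency gives the time each tick represents; with your 2,597,705 ticks per second this comes out to roughly 385 nanoseconds:

  using System;
  using System.Diagnostics;

  // Frequency is ticks per second, so its inverse is seconds per tick.
  double nanosecondsPerTick = 1e9 / Stopwatch.Frequency;
  Console.WriteLine("One tick = {0:F0} ns", nanosecondsPerTick);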

As for the variations in frequency: that is hardware-dependent. Some of the most common hardware differences are the number of cores, the frequency of each core, the current power state of your CPU (or cores), and whether you have allowed the OS to adjust the CPU frequency dynamically. Your frequency will not always be the same, and depending on what state your CPU is in when you check, it may be lower or higher, but generally around the same value (for you, probably around 2.5 million).
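Whatever the exact rate, code that uses Stopwatch.Elapsed doesn't need to care: Elapsed already divides the raw tick count by Frequency for you, so timing code is portable across machines with different rates. A minimal sketch (Thread.Sleep stands in for the work being timed):

  using System;
  using System.Diagnostics;
  using System.Threading;

  var sw = Stopwatch.StartNew();
  Thread.Sleep(100);   // stand-in for the work being timed
  sw.Stop();
  Console.WriteLine("Elapsed: {0:F1} ms ({1} raw ticks)",
      sw.Elapsed.TotalMilliseconds, sw.ElapsedTicks);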


I think 2,597,705 is your processor frequency. Mine is 2,737,822 (i7 930).

