
What is the best way to measure the time cycles for a C# function?

I'm looking for a way to accurately measure the time a given C# function takes under the Windows operating system. I tried both of these approaches, but neither gives me an accurate measurement:

DateTime StartTime = DateTime.Now;
// code to be measured
TimeSpan ts = DateTime.Now.Subtract(StartTime);

Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();
// code to be measured
stopWatch.Stop();
TimeSpan elapsed = stopWatch.Elapsed;

Each time I run them, they give me a different time for the same function.

If anyone knows a better way to measure elapsed time accurately, please help me. Thanks a lot.


The Stopwatch is the recommended way to measure the time it takes for a function to execute. It will never be the same from run to run due to various software and hardware factors, which is why performance analysis is usually done on a large number of runs and the time is averaged out.


"they give me different time for the same function" - that's expected. Things fluctuate because you are not the only process running on a system.

Run the code you want to time in a large loop to average out any fluctuations (divide total time by the number of loops).

Stopwatch is an accurate timer, and is more than adequate for most situations.

const int numLoops = 1000000;   // ...or whatever number is appropriate

Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();

for (int i = 0; i < numLoops; i++)
{
    // code to be timed...
}

stopWatch.Stop();
TimeSpan elapsedTotal = stopWatch.Elapsed;
double timeMs = elapsedTotal.TotalMilliseconds / numLoops;   // average time per iteration


The gold standard is to use Stopwatch. It is a high-resolution timer and it works very well.

I'd suggest you check the elapsed time using .Elapsed.TotalMilliseconds, which gives you a double, rather than .Elapsed.Milliseconds, which gives you an int. That might be throwing your results off.

Also, you might find that garbage collections occur during your timing tests and these can significantly change the resulting time. It's useful to check the GC collection count before and after your timing test and discard the result if any garbage collections occurred.
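A minimal sketch of that check might look like the following (the helper name and the sample workload are illustrative, not a standard API):

```csharp
using System;
using System.Diagnostics;

class GcAwareTiming
{
    // Times an action and reports whether a gen-0 collection occurred mid-run,
    // so the caller can discard polluted measurements.
    public static bool TryTime(Action action, out TimeSpan elapsed)
    {
        int gen0Before = GC.CollectionCount(0);

        var sw = Stopwatch.StartNew();
        action();
        sw.Stop();

        elapsed = sw.Elapsed;
        // If the collection count changed, a GC ran during the measurement.
        return GC.CollectionCount(0) == gen0Before;
    }

    static void Main()
    {
        TimeSpan t;
        bool clean = TryTime(() => { for (int i = 0; i < 1000; i++) { } }, out t);
        Console.WriteLine(clean
            ? "clean run: " + t.TotalMilliseconds + " ms"
            : "discard: GC occurred during timing");
    }
}
```

You could check the count for every generation (0, 1, and 2) rather than just gen 0 if you want to be strict about it.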

Otherwise your results can vary simply because other threads and processes take over the CPU and other system resources during your tests. There's not much you can do here except run your tests multiple times and statistically analyze your results by calculating mean and standard deviation of the timings, etc.
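One possible sketch of that kind of statistical run, assuming a simple repeated-measurement loop (the class and method names are made up for illustration):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class TimingStats
{
    // Runs the action several times and reports the mean and population
    // standard deviation of the per-run times, in milliseconds.
    public static (double Mean, double StdDev) Measure(Action action, int runs)
    {
        var samples = new double[runs];
        for (int i = 0; i < runs; i++)
        {
            var sw = Stopwatch.StartNew();
            action();
            sw.Stop();
            samples[i] = sw.Elapsed.TotalMilliseconds;
        }

        double mean = samples.Average();
        double variance = samples.Sum(s => (s - mean) * (s - mean)) / runs;
        return (mean, Math.Sqrt(variance));
    }

    static void Main()
    {
        var (mean, stdDev) = Measure(() => { for (int i = 0; i < 100000; i++) { } }, 20);
        Console.WriteLine($"mean {mean:F4} ms, stddev {stdDev:F4} ms");
    }
}
```

A large standard deviation relative to the mean is a sign that the system was too noisy during the test and the run should be repeated.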

I hope this helps.


While it is possible for you to measure your code in clock cycles, it will still be just as prone to variability as measuring in seconds, and much less useful (seconds are at least as good a unit of measurement as clock cycles). The only way to get a measurement that is unaffected by other processes is to ensure none are running, and you can't do that on Windows -- the OS itself will always be running and doing things, because it's not a single-process OS.

The closest you can come to the measurement you want is to build and run your code as described here. You can then view the x86 assembly for the JIT'd code for the methods you want to time by setting a breakpoint at the start of your code, and then stepping through. You can cross-reference each x86 instruction with its cycle timing in the Intel architecture manuals, and add them up to get an accurate cycle count.

This is, of course, extremely painful and basically useless. It also might be invalidated by code changes that cause the JIT to take slightly different approaches to producing x86 from your IL.


You need a profiler to measure code execution (see What Are Some Good .NET Profilers? to start your search).

Looking at your comments, it is unclear what you are trying to optimize. Generally you only need to go down to measuring CPU clock cycles when your code is executed thousands of times and is purely CPU-bound; in other cases, per-function execution time is usually enough. But you say your code is too slow to run that many times to calculate an average time with Stopwatch.

You also need to figure out whether the CPU is the bottleneck for your application or whether something else makes it slow. Looking at the CPU % in Task Manager can give you that information - less than 100% CPU usage pretty much guarantees that something else (e.g. network or disk activity) is making the program slow.
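You can also check this programmatically by comparing CPU time consumed to wall-clock time. A rough sketch, assuming `Process.TotalProcessorTime` as the CPU-time source (the helper name is illustrative):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CpuBoundCheck
{
    // Compares CPU time consumed to wall-clock time for an action.
    // A ratio well below 1.0 suggests the code is waiting on something
    // other than the CPU (disk, network, locks).
    public static double CpuRatio(Action action)
    {
        var proc = Process.GetCurrentProcess();
        TimeSpan cpuBefore = proc.TotalProcessorTime;
        var sw = Stopwatch.StartNew();

        action();

        sw.Stop();
        proc.Refresh();  // re-read process counters after the run
        TimeSpan cpuUsed = proc.TotalProcessorTime - cpuBefore;
        return cpuUsed.TotalMilliseconds / sw.Elapsed.TotalMilliseconds;
    }

    static void Main()
    {
        // A busy loop should be close to fully CPU-bound...
        double busy = CpuRatio(() => { long s = 0; for (int i = 0; i < 50000000; i++) s += i; });
        // ...while a sleep consumes almost no CPU time.
        double idle = CpuRatio(() => Thread.Sleep(200));
        Console.WriteLine($"busy: {busy:F2}, idle: {idle:F2}");
    }
}
```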

Basically, providing more detail on the type of code you are trying to measure, and on your performance goals, will make it much easier to help you.


To echo the others: The Stopwatch class is the best way to do this.

To answer your question about measuring only clock cycles: the fact that you're running on a multi-tasking OS on a modern processor makes measuring clock cycles almost useless. A context switch has a good chance of evicting your code and data from the processor's cache, and the OS might decide to swap your working set out in the meantime.

The processor could decide to reorder your instructions based on cache waits or memory accesses, and execute what it can while it's waiting. Or it may not, if the data is already in the cache.

So, in short, performing multiple runs and averaging them is really the only way to go.

To get less jitter in the timing, you could elevate the priority of the thread/process, but this can result in a slew of other issues (bumping to real-time priority and getting stuck in a long loop will essentially stop all other processing; if a bug occurs and you get stuck in an infinite loop, your only choice is the reset button), and is not recommended at all, especially on a user's computer or in a production environment. And since you can't do that where it matters, any benchmarks you run on your machine with priority modifications are invalid.
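For completeness, the priority-elevation approach described above looks roughly like this; it only shows the API, not a recommendation, and restoring the defaults afterwards is essential:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class PrioritySketch
{
    static void Main()
    {
        var proc = Process.GetCurrentProcess();

        // High, not RealTime: RealTime can starve the rest of the system.
        proc.PriorityClass = ProcessPriorityClass.High;
        Thread.CurrentThread.Priority = ThreadPriority.Highest;

        Console.WriteLine(proc.PriorityClass);
        Console.WriteLine(Thread.CurrentThread.Priority);

        // ... timing run goes here ...

        // Restore defaults when the timing run is done.
        proc.PriorityClass = ProcessPriorityClass.Normal;
        Thread.CurrentThread.Priority = ThreadPriority.Normal;
    }
}
```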


How about using Environment.TickCount to capture the start and end times? Note that TickCount is measured in milliseconds, so you would convert the difference with TimeSpan.FromMilliseconds, not TimeSpan.FromTicks (which expects 100-nanosecond units).
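A minimal sketch of that approach (the Sleep call stands in for the code being timed):

```csharp
using System;
using System.Threading;

class TickCountTiming
{
    static void Main()
    {
        int start = Environment.TickCount;

        // ... code to be timed (placeholder) ...
        Thread.Sleep(50);

        int end = Environment.TickCount;

        // TickCount is in milliseconds; TimeSpan.FromTicks would
        // misinterpret the value as 100-nanosecond units.
        TimeSpan elapsed = TimeSpan.FromMilliseconds(end - start);
        Console.WriteLine(elapsed.TotalMilliseconds + " ms");
    }
}
```

Be aware that Environment.TickCount typically has a resolution of around 10-16 ms on Windows, so it is much coarser than Stopwatch and unsuitable for timing short functions.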

