C# Debug and Release .exe behaving differently because of Long? [closed]
After reading these questions:
Code is behaving differently in Release vs Debug Mode
C# - Inconsistent math operation result on 32-bit and 64-bit
Double precision problems on .NET
Why does this floating-point calculation give different results on different machines?
I suspect the reason my method for determining FPS works in Debug mode but no longer works in Release mode is that I'm using long to hold time values. Here's the relevant code:
public void ActualFPS()
{
    if (Stopwatch.GetTimestamp() >= lastTicks + Stopwatch.Frequency)
    {
        actualFPS = runsThisSecond;
        lastTicks = Stopwatch.GetTimestamp();
        runsThisSecond = 0;
    }
}
runsThisSecond is incremented by one every time the method I'm tracing is called. Granted, this isn't an overly accurate way to determine FPS, but it works for what I need.
lastTicks is a variable of type long, and I believe Stopwatch.GetTimestamp() returns a long as well(?). Is this my problem? If so: any suggestions as to how to work around it?
EDIT: Stopwatch is using the High Resolution timer.
EDIT2: The problem has resolved itself. Without any changes to any of my code. At all. None. I have no idea what caused it to break, or to fix itself. Perhaps my computer decided to spontaneously consider my feelings?
You have a very accurate interval measurement available (GetTimestamp() - lastTicks), but you are not using it at all to compute the frame rate. You assume the interval is exactly one second; it won't be. It will be longer, by a variable amount determined by how often you call ActualFPS(). In Release mode you call ActualFPS() more frequently, so the error is smaller.
Divide runsThisSecond by (gettimestamp - lastticks) converted to seconds.
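A minimal sketch of that fix, reusing the field names from the question's snippet (the FpsCounter class wrapper and the ComputeFps helper are illustrative additions, not part of the original code):

```csharp
using System;
using System.Diagnostics;

class FpsCounter
{
    public double ActualFps { get; private set; }

    long lastTicks = Stopwatch.GetTimestamp();
    int runsThisSecond;

    // FPS = frames counted / seconds actually elapsed.
    // elapsedTicks / frequency gives the elapsed time in seconds.
    public static double ComputeFps(int frames, long elapsedTicks, long frequency)
    {
        return frames * (double)frequency / elapsedTicks;
    }

    // Call once per frame.
    public void Update()
    {
        runsThisSecond++;
        long now = Stopwatch.GetTimestamp();
        long elapsed = now - lastTicks;
        if (elapsed >= Stopwatch.Frequency)
        {
            // Divide by the *measured* interval, not an assumed one second.
            ActualFps = ComputeFps(runsThisSecond, elapsed, Stopwatch.Frequency);
            lastTicks = now;
            runsThisSecond = 0;
        }
    }
}
```

Because the count is divided by the real elapsed interval, the result no longer depends on how promptly the check happens to run after the second boundary, so Debug and Release give consistent readings.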