Making a program timer-based rather than frame-rate dependent?
I have a game engine to work on as part of a class. Currently, its rendering is frame-rate dependent, and one requirement is to move to a timer-based dependency. I am not sure how to determine where it is relying on frame rate, or what to look for. I realize I'm going to need to incorporate a timer somehow (GetTickCount?) to accomplish this, but I'm not sure how frequently to update it, either.
I'm not looking to be handed code, just some helpful guidelines possibly?
Imagine you have a very simple game, where it's just a ball moving across the screen. Without time-based updates, it moves as fast as you update.
What you want to do is find out how much time has elapsed, as a fractional value (I usually measure in seconds, so physics equations match up better). When updating, instead of something like this:
ballPosition += ballVelocity
You'd have this:
ballPosition += ballVelocity * timeElapsed
What this means is that for higher frame rates, timeElapsed will be lower, which consequently moves the ball less each frame. Lower frame rates mean that timeElapsed will be greater, and the ball will move more per frame. In the end, the ball moves the same distance independent of frame rate. A 60 FPS update rate makes timeElapsed equal 0.01666667f, while a 30 FPS update rate would make it 0.03333333f. At 60 FPS the elapsed time per frame is half that of 30 FPS, but because updates happen twice as often, the ball covers the same distance overall.
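To make that concrete, here's a minimal C++ sketch of the ball example (the Ball type and its values are illustrative, not from any particular engine):

// Minimal time-based update; all names here are hypothetical.
struct Ball
{
    float position = 0.0f;   // 1D position, in world units
    float velocity = 100.0f; // world units per second

    // dt is the elapsed time in seconds since the last update
    void update(float dt)
    {
        position += velocity * dt;
    }
};

Whether you call update() 30 or 60 times per second, the ball travels 100 units per second either way.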
I usually pass timeElapsed as an argument to any functions that are time-dependent. A nice consequence of doing it this way is that you can slow down or speed up your game by multiplying the elapsed time by a value. You can also apply that to individual components. It also plays well if you switch to a frame-limiting model instead, because you're effectively just forcing timeElapsed to be a constant. Pseudo-code:
while (gameRunning)
{
    // elapsed() returns the number of seconds passed since it was last called
    const float timeElapsed = timer.elapsed();

    // GlobalTimeScale is 1 for normal time
    game.update(timeElapsed * GlobalTimeScale);
    game.draw();
}
To get the time, GetTickCount should work. You might also take a look at QueryPerformanceCounter for higher precision, though it can have issues with multiple cores.
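If you're not tied to the Win32 API, a portable way to implement the timer.elapsed() call from the loop above is std::chrono::steady_clock, which is monotonic and sidesteps the multi-core issues mentioned. A minimal sketch, assuming C++11 or later (the Timer class itself is illustrative, not a standard type):

#include <chrono>

// Measures seconds between successive calls to elapsed().
class Timer
{
public:
    Timer() : last(std::chrono::steady_clock::now()) {}

    // Returns the number of seconds passed since the last call
    float elapsed()
    {
        const auto now = std::chrono::steady_clock::now();
        const std::chrono::duration<float> delta = now - last;
        last = now;
        return delta.count();
    }

private:
    std::chrono::steady_clock::time_point last;
};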
I used "Fix Your Timestep" with some success. The issue then becomes keeping track of any lag, and slowing the thing down if the computer can't keep up.
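The core of that approach is an accumulator: real elapsed time is consumed in fixed-size simulation steps, and the per-frame time is clamped so a slow machine degrades gracefully instead of falling further and further behind. A rough sketch of the pattern, reusing the hypothetical timer and game objects from the earlier pseudocode (the step size and clamp value are assumptions you'd tune):

const float dt = 1.0f / 60.0f;    // fixed simulation step, in seconds
float accumulator = 0.0f;

while (gameRunning)
{
    float frameTime = timer.elapsed();
    if (frameTime > 0.25f)        // clamp to avoid a spiral of death
        frameTime = 0.25f;
    accumulator += frameTime;

    // Run as many fixed steps as the elapsed time allows
    while (accumulator >= dt)
    {
        game.update(dt);          // physics always sees the same step size
        accumulator -= dt;
    }

    game.draw();
}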
A possible pseudocode game loop:
constant TIMETHRESHOLD = xxxx   // nanoseconds to pass between executions of the game loop

while true loop
    if getTime() - previousTime > TIMETHRESHOLD
        previousTime = getTime()
        // Execute game logic here
    end if
end loop
The thing is that you can have a different TIMETHRESHOLD for different parts of the game logic. For example, you might want the frame rate to be 60 FPS while the physics engine only needs to run at 30 FPS, and so on; a sketch of this follows below. On fast hardware it will work as intended, and on slower hardware (hardware that cannot meet the time requirements) it will simply run as fast as it can. Of course, this is a simple, single-threaded example.
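A sketch of the multi-threshold idea in C++, with rendering capped at 60 FPS and physics at 30 FPS (getTime(), updatePhysics(), render(), and gameRunning are assumed helpers in the style of the pseudocode above, with getTime() returning nanoseconds):

const long long RENDER_THRESHOLD  = 1000000000LL / 60; // ns between frames
const long long PHYSICS_THRESHOLD = 1000000000LL / 30; // ns between physics steps

long long lastRender  = getTime();
long long lastPhysics = getTime();

while (gameRunning)
{
    const long long now = getTime();

    if (now - lastPhysics > PHYSICS_THRESHOLD)
    {
        lastPhysics = now;
        updatePhysics();           // runs at most ~30 times per second
    }

    if (now - lastRender > RENDER_THRESHOLD)
    {
        lastRender = now;
        render();                  // runs at most ~60 times per second
    }
}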
If you are not sure where the frame rate is being used, it's almost certainly implicit in your main loop. If you have a call to a rendering function in your main loop, and rendering takes a long time, then the "loop rate" of your main loop is the same as your frame rate, right? Single-threaded execution means everything that is processed in the loop contributes to your frame rate.