Why are animations sometimes done using steps based on the amount of time that's passed?
I've noticed that some programmers animate objects based on the time elapsed between frames rather than by a fixed amount per frame. I am not sure why, or even whether this is logical. Does anyone know the significance?
Below is a snippet of code that explains what I mean:
// milliseconds elapsed since the previous frame
var timePassed:int = getTimer() - lastTime;
lastTime += timePassed;
// move by (speed in pixels per millisecond) * (milliseconds elapsed)
var newBallX:Number = ball.x + ballDX*timePassed;
var newBallY:Number = ball.y + ballDY*timePassed;
When you animate based on time, you make yourself independent of the framerate. No matter how many frames have passed, your ball moves the same distance in a given amount of time. Compare that to stepping a fixed amount per frame, where the apparent speed depends on many variables, such as how much processing power is available to run the animation.
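To make the contrast concrete, here is a minimal ActionScript 3 sketch (the names ball, speedX and the two handlers are placeholders, not anything from the question): the first handler moves a fixed amount per frame, the second moves by elapsed time, so it covers the same distance per second at any framerate.

import flash.display.Sprite;
import flash.events.Event;
import flash.utils.getTimer;

var ball:Sprite = new Sprite();         // stand-in for whatever display object is being moved
var speedX:Number = 0.1;                // pixels per millisecond

// Frame-based: apparent speed changes with the framerate.
function onFrameFixed(e:Event):void {
    ball.x += 5;                        // 5 pixels every frame, however long a frame takes
}

// Time-based: the same distance is covered per second at any framerate.
var lastTime:int = getTimer();
function onFrameTimed(e:Event):void {
    var timePassed:int = getTimer() - lastTime;
    lastTime += timePassed;
    ball.x += speedX * timePassed;      // pixels per millisecond * milliseconds elapsed
}

// Register whichever handler you want to compare:
addEventListener(Event.ENTER_FRAME, onFrameTimed);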
This is a common game-physics issue -- check out Glenn Fiedler's excellent "Fix Your Timestep!" article for a more detailed take on this. (Doing it right is slightly more complicated than just multiplying your direction vectors by the timestep.)
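As a rough sketch of the fixed-timestep idea from that article (not the article's actual code; updatePhysics and STEP_MS are hypothetical names), an accumulator consumes the elapsed time in constant-sized steps:

import flash.events.Event;
import flash.utils.getTimer;

const STEP_MS:int = 16;                 // fixed physics step, roughly 60 updates per second
var accumulator:int = 0;
var lastTime:int = getTimer();

function onEnterFrame(e:Event):void {
    var now:int = getTimer();
    accumulator += now - lastTime;
    lastTime = now;
    // Consume the elapsed time in constant-sized steps so the simulation stays deterministic.
    while (accumulator >= STEP_MS) {
        updatePhysics(STEP_MS);         // hypothetical function that advances the simulation one step
        accumulator -= STEP_MS;
    }
}

addEventListener(Event.ENTER_FRAME, onEnterFrame);

Rendering still happens once per frame; only the simulation advances in fixed increments, which is what keeps the physics behaving the same regardless of framerate.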
The logic is simple.
ballDX => ball delta X => the distance the ball can move along the x coordinate in one second
timePassed => the amount of time that has passed
if oldBallX = 0
if ballDX = 10
if timePassed = 1 sec
then newBallX = oldBallX + (ballDX * timePassed)
which means
newBallX = 0 + (10 * 1) = 10 pixels
If instead
timePassed = 0.5 sec (half a second)
then
newBallX = 0 + (10 * 0.5) = 5 pixels
Logical?
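The same arithmetic in ActionScript 3, purely for illustration (the variable names just mirror the example above):

var ballDX:Number = 10;                          // pixels per second along x
var oldBallX:Number = 0;

var newBallX:Number = oldBallX + ballDX * 1.0;   // 1 second passed: 0 + 10*1
trace(newBallX);                                 // 10

newBallX = oldBallX + ballDX * 0.5;              // half a second passed: 0 + 10*0.5
trace(newBallX);                                 // 5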
Why NOT do it that way? As opposed to doing what? It is simple linear motion, right? Here is a thought: this allows the ball to catch up with its intended position in case other programs are slowing the computer down.
A modern computer operating system runs many tasks at once, and you don't always get your time slices at regular intervals. By using the difference in the real-time clock, you smooth out the motion compared to moving the same amount every time through the loop, which could look jerky if the OS gave a few more milliseconds to another process before it got back to yours.
The most important aspect of being independent of the framerate is that you don't have to chain down the framerate. It used to be, back in the dark ages, that games would be written to use the CPU as much as possible, and the frame rate was determined by the CPU speed. I remember playing games on my 16MHz machine that would have things fly by so fast you couldn't react, because they were written for 1MHz machines. Programmers wised up to this and started writing games that capped the framerate, usually at 30fps in the early years, later 60fps (usually locked to the VSYNC of the monitor). This solved the problem, but was really annoying for those of us with awesome computers that wanted more fluid motion. Eventually they started writing games completely independent of the framerate, which lets you play a game at 700fps and get the same experience as at 20fps, except with smoother graphics. It can also cope with the load changing during play, as others have said, which can be very important with today's multitasking OSes.
If you define the animation as a function of time, you become somewhat independent of the frame rate, meaning that if you design your animation for 24 fps, you can easily adjust it to fit a 30 fps scenario, as long as it's dynamic (i.e. defined by a function, as opposed to frame-by-frame drawings, where planning is everything).
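As a small sketch of "animation as a function of time" (illustrative only; ball, duration and the handler name are assumptions, and this is not Penner's code), the position is computed from elapsed time alone, so the same motion plays out at any frame rate:

import flash.display.Sprite;
import flash.events.Event;
import flash.utils.getTimer;

var ball:Sprite = new Sprite();         // stand-in for the animated object
var startX:Number = 0;                  // where the motion starts
var endX:Number = 300;                  // where it ends
var duration:Number = 2000;             // total time in milliseconds
var startTime:int = getTimer();

function onTween(e:Event):void {
    // Progress depends only on elapsed time, never on how many frames have rendered.
    var t:Number = Math.min((getTimer() - startTime) / duration, 1.0);
    ball.x = startX + (endX - startX) * t;   // linear motion; an easing curve could reshape t
}

addEventListener(Event.ENTER_FRAME, onTween);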
This is the short story; for the full explanation, have a look at Robert Penner's good ol' chapter on Motion, Tweening and Easing.