Simulated time in a game loop using C++

I am building a 3D game from scratch in C++ using OpenGL and SDL on Linux, as a hobby and to learn more about this area of programming.

I am wondering about the best way to simulate time while the game is running. Obviously I have a loop that looks something like this:

void main_loop()
{
    while(!quit)
    {
         handle_events();
         DrawScene();
         ...
         SDL_Delay(time_left());
    }
}

I am using SDL_Delay() and time_left() to maintain a framerate of about 33 fps.
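
A typical time_left() for this pattern, adapted from the framerate-limiting example in the SDL documentation, looks something like this (TICK_INTERVAL and the deadline variable are illustrative):

const Uint32 TICK_INTERVAL = 30; /* ~33 fps */

static Uint32 next_time; /* deadline for the next frame */

Uint32 time_left(void)
{
    Uint32 now = SDL_GetTicks();
    if (next_time <= now)
        return 0; /* already past the deadline, don't delay */
    return next_time - now;
}

After SDL_Delay(time_left()) in the loop, the deadline advances with next_time += TICK_INTERVAL (initialized once with next_time = SDL_GetTicks() + TICK_INTERVAL before entering the loop).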

I had thought that I would just need a few global variables, like:

int current_hour = 0;
int current_mins = 0;
int num_days = 0;
Uint32 prev_ticks = 0;

Then a function like:

void handle_time()
{
    Uint32 current_ticks;
    Uint32 dticks;
    current_ticks = SDL_GetTicks();
    dticks = current_ticks - prev_ticks; // get difference since last time

    // if difference is greater than 30000 (half minute) increment game mins
    if(dticks >= 30000) {
         prev_ticks = current_ticks;
         current_mins++;
         if(current_mins >= 60) {
            current_mins = 0;
            current_hour++;
         }
         if(current_hour > 23) {
            current_hour = 0;
            num_days++;
         }
    }
}

and then call the handle_time() function in the main loop.

It compiles and runs (using printf to write the time to the console at the moment), but I am wondering if this is the best way to do it. Are there easier or more efficient ways?


I've mentioned this before in other game-related threads. As always, follow the suggestions by Glenn Fiedler in his Game Physics series.

What you want to do is use a constant timestep, which you get by accumulating time deltas. If you want 33 updates per second, then your constant timestep should be 1/33 of a second (about 30 ms); 33 Hz is the update frequency. You should also decouple the game logic from the rendering, as they don't belong together: you want to be able to use a low update frequency while rendering as fast as the machine allows. Here is some sample code:

bool running = true;
unsigned int t = 0;                    /* total simulated time, in ms */
const unsigned int timestep = 1000/33; /* ~30 ms per update (33 updates/sec) */
unsigned int t_accum = 0;
unsigned int lt = SDL_GetTicks(), ct = 0;
SDL_Event event;
while(running){
    while(SDL_PollEvent(&event)){
        switch(event.type){
            ...
        }
    }
    ct = SDL_GetTicks();
    t_accum += ct - lt; /* accumulate the real time that has passed */
    lt = ct;
    while(t_accum >= timestep){
        t += timestep; /* this is our actual time, in milliseconds. */
        t_accum -= timestep;
        for(std::vector<Entity>::iterator en = entities.begin(); en != entities.end(); ++en){
            integrate(*en, (float)t * 0.001f, timestep);
        }
    }
    /* This should really be in a separate thread, synchronized with a mutex. */
    float alpha = (float)t_accum / (float)timestep; /* leftover fraction of a step */
    std::vector<Entity> tmpEntities(entities.size());
    for(std::size_t i = 0; i < entities.size(); ++i){
        /* blend toward currentState as alpha approaches 1 */
        tmpEntities[i] = interpolateState(entities[i].lastState, 1.0f - alpha,
                                          entities[i].currentState, alpha);
    }
    Render(tmpEntities);
}

This handles undersampling as well as oversampling. If you use integer arithmetic as done here, your game physics should be close to 100% deterministic, no matter how slow or fast the machine is. This is the advantage of advancing time in fixed intervals. The state used for rendering is calculated by interpolating between the previous and current states, where the leftover value inside the time accumulator is used as the interpolation factor. This ensures that the rendering is smooth, no matter how large the timestep is.
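
For reference, a minimal sketch of what Entity and interpolateState could look like; the State fields and the weighted-blend signature are assumptions for illustration, not part of the sample above:

struct State {
    float x, y, z; /* hypothetical: position only, for brevity */
};

struct Entity {
    State lastState;    /* state at the previous fixed update */
    State currentState; /* state at the most recent fixed update */
};

/* Weighted blend of two states: returns a*wa + b*wb, with wa + wb == 1. */
State interpolateState(const State &a, float wa, const State &b, float wb)
{
    State out;
    out.x = a.x * wa + b.x * wb;
    out.y = a.y * wa + b.y * wb;
    out.z = a.z * wa + b.z * wb;
    return out;
}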


Other than the issues already pointed out (you should use a structure for the times and pass it to handle_time(), and your minute will get incremented every half minute of real time), your solution is fine for keeping track of time running in the game.
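
A minimal sketch of that refactor (the GameTime name and layout are illustrative):

struct GameTime {
    int hour;
    int mins;
    int num_days;
    Uint32 prev_ticks;
};

void handle_time(GameTime &gt)
{
    Uint32 current_ticks = SDL_GetTicks();
    /* 30000 ms of real time == one game minute, as in the question */
    if (current_ticks - gt.prev_ticks >= 30000) {
        gt.prev_ticks = current_ticks;
        if (++gt.mins >= 60) { gt.mins = 0; ++gt.hour; }
        if (gt.hour > 23)    { gt.hour = 0; ++gt.num_days; }
    }
}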

However, for most game events that need to happen every so often, you should probably base them on the main game loop instead of actual time, so they will happen in the same proportions regardless of the fps.
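
For example, counting fixed updates instead of milliseconds (a sketch; the 33-updates-per-second rate and the helper name are assumptions):

/* With 33 updates per second, 33 * 30 ticks == 30 s of real time. */
const unsigned int TICKS_PER_GAME_MINUTE = 33 * 30;
unsigned int tick_count = 0;

void update()
{
    ++tick_count;
    if (tick_count % TICKS_PER_GAME_MINUTE == 0)
        advance_game_minute(); /* hypothetical helper that bumps mins/hours/days */
}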


One of Glenn's posts you will really want to read is Fix Your Timestep!. After looking up this link, I noticed that Mads directed you to the same general place in his answer.


I am not a Linux developer, but you might want to have a look at using timers instead of polling for the ticks.

http://linux.die.net/man/2/timer_create

EDIT:
SDL seems to support timers: SDL_SetTimer
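
A minimal sketch with SDL 1.2's SDL_AddTimer (the more flexible sibling of SDL_SetTimer). Timer callbacks run in a separate thread, so the usual pattern is to push a user event back to the main loop rather than touch game state directly; the event code value here is arbitrary:

/* Fires every 30000 ms. Requires SDL_Init(SDL_INIT_TIMER). */
Uint32 minute_tick(Uint32 interval, void *param)
{
    SDL_Event ev;
    ev.type = SDL_USEREVENT;
    ev.user.code = 1; /* arbitrary "game minute elapsed" code */
    ev.user.data1 = NULL;
    ev.user.data2 = NULL;
    SDL_PushEvent(&ev);
    return interval; /* keep firing at the same interval */
}

/* setup: */
SDL_TimerID id = SDL_AddTimer(30000, minute_tick, NULL);

The main loop then handles SDL_USEREVENT in its event switch like any other event.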
