
Proper animation in Direct3D 10

I'm a beginner in D3D development and I'm stuck on my first issue. I've created a little program that draws a terrain with random heights at each vertex and a light that moves from left to right (plus basic keyboard handling). The problem is that the animated light is very skippy, input recognition fails pretty often, and CPU usage is a constant 50% (I have a dual-core CPU). I thought this was caused by incorrect throttling, but even after fixing the code I still have problems. My main loop is:

int APIENTRY _tWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPTSTR lpCmdLine, int nCmdShow)
{
    HWND wndHandle;
    int width = 640;
    int height = 480;

    wndHandle = InitWindow(hInstance, width, height);
    InitDirect3D(wndHandle, width, height);
    DInputInit(hInstance, wndHandle);
    SceneSetUp();
    MSG msg = {0};
    QueryPerformanceFrequency((LARGE_INTEGER *) &perf_cnt);
    time_count = perf_cnt / fixedtic;
    QueryPerformanceCounter((LARGE_INTEGER *) &next_time);
    while (WM_QUIT != msg.message)
    {
        if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
            continue;
        }
        HandleInput();
        QueryPerformanceCounter((LARGE_INTEGER *) &cur_time);
        if (cur_time>next_time) {
            Render();
            next_time += time_count;
            if (next_time < cur_time)
                next_time = cur_time + time_count;
        }
    }
    ShutdownDirect3D();
    return (int)msg.wParam;
}

while the Render function is

void Render()
{
    QueryPerformanceCounter((LARGE_INTEGER *) &curtime);
    timespan = (curtime - last_time) * time_factor;
    last_time = curtime;
    model.pD3DDevice->ClearRenderTargetView(model.pRenderTargetView, D3DXCOLOR(0.0f, 0.0f, 0.0f, 0.0f));
    lightpos.x = (float)cos(ang) * 256.0f;
    ang += timespan * 0.5;
    lpvar->SetFloatVector((float *) lightpos);
    D3DXMatrixLookAtLH(&V, &model.campos, new D3DXVECTOR3(0.0f, 0.0f, 0.0f), new D3DXVECTOR3(0.0f, 1.0f, 0.0f));
    D3DXMATRIX VP = V*P;
    camvar->SetFloatVector((float *)model.campos);
    ViewProjection->SetMatrix((float *) &VP);
    for(UINT p=0; p < techniqueDescription.Passes; ++p)
    {
        pTechnique->GetPassByIndex(p)->Apply(0);
        model.pD3DDevice->DrawIndexed(numIndices, 0, 0);
    }
    model.pSwapChain->Present(0, 0);
}

Any help is appreciated


Remove the continue from after the DispatchMessage call. As it stands, if you have a load of messages queued up (say you move the mouse over the window), you will end up wasting time processing those messages instead of rendering.

It would also be usual to do the following, which processes all the pending messages in one go:

    while(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

It would also be interesting to see what sort of input you are handling, as you appear to be handling input more often than you are rendering.

In general, though, it's better to render constantly and simply multiply any movement by the time since the last frame to smooth things out. This way you work out the movement for a given second and scale it to however much time the current frame actually took. So if you want to press a key and rotate at 1 rotation per second, you would set the per-frame rotation increment to (2 * pi) * secondsSinceLastFrame.
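A minimal sketch of what I mean (the function and parameter names here are placeholders, not anything from your code; deltaSeconds would come from the QueryPerformanceCounter values you already track):

    // Frame-rate-independent rotation: one full turn is 2 * pi radians,
    // scaled by how much time actually passed since the previous frame.
    void UpdateRotation(float &angle, float deltaSeconds)
    {
        const float ROTATIONS_PER_SECOND = 1.0f;
        angle += 2.0f * 3.14159265f * ROTATIONS_PER_SECOND * deltaSeconds;
    }

Called once per frame, this gives the same rotation speed whether the frame took 4 ms or 40 ms.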

The reason your CPU usage sits at 50% is that you are, effectively, sitting in a busy loop, which pegs one of your two cores. You never give up your timeslice at any point; you simply spin around the loop handling input, probably many more times per second than rendering occurs.
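One way to give the timeslice back is to sleep until either the next frame is due or a message arrives; a rough sketch, assuming the same cur_time / next_time / perf_cnt variables (__int64 counts from QueryPerformanceCounter) used in your loop:

    #include <windows.h>

    // Sleep until the next frame is due or a window/input message is posted,
    // instead of spinning on QueryPerformanceCounter at full speed.
    void WaitForNextFrame(__int64 next_time, __int64 perf_cnt)
    {
        __int64 cur_time;
        QueryPerformanceCounter((LARGE_INTEGER *) &cur_time);
        if (cur_time >= next_time)
            return; // already late, render immediately

        DWORD ms = (DWORD)((next_time - cur_time) * 1000 / perf_cnt);
        // Wakes early if any message arrives, so input stays responsive.
        MsgWaitForMultipleObjects(0, NULL, FALSE, ms, QS_ALLINPUT);
    }

This is only a sketch; a fixed Sleep(1) in the idle case would also stop the core from being pegged, at the cost of coarser timing.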


OK, I nailed it, and I think I deserve a punch in the face: I was creating a reference device instead of a hardware device.
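For anyone who hits the same thing, the fix is in the device-creation call; a rough sketch (swapChainDesc and the model struct members are assumed to be set up as in my InitDirect3D, not copied from it):

    // Ask for a hardware device instead of the reference rasterizer. The
    // reference device runs entirely in software and is far too slow for
    // real-time rendering, which explains the stutter and the pegged core.
    HRESULT hr = D3D10CreateDeviceAndSwapChain(
        NULL,                           // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,     // was D3D10_DRIVER_TYPE_REFERENCE
        NULL,                           // no software rasterizer DLL
        0,                              // creation flags
        D3D10_SDK_VERSION,
        &swapChainDesc,
        &model.pSwapChain,
        &model.pD3DDevice);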

