Why does CPU usage increase when my application is minimized?
I'm programming a calculator. When the window is maximized, the CPU usage is about 12%, but when it is minimized, the CPU usage rises to about 50%. Why is this happening and how can I prevent this? Here is the piece of code that I think is causing the problem.
LRESULT CALLBACK WndProc(HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
switch(uMsg)
{
case WM_ACTIVATE:
if(!HIWORD(wParam))
active = true;
else
active = false;
return 0;
case WM_SYSCOMMAND:
switch(wParam)
{
case SC_SCREENSAVE:
case SC_MONITORPOWER:
return 0;
}
break;
case WM_CLOSE:
PostQuitMessage(0);
return 0;
case WM_KEYDOWN:
if( (wParam >= VK_LEFT && wParam <= VK_DOWN) || wParam == VK_CONTROL)
myCalc.handleInput(wParam, true);
return 0;
case WM_CHAR:
myCalc.handleInput(wParam);
return 0;
case WM_SIZE:
ReSizeGLScene(LOWORD(lParam), HIWORD(lParam)); //LOWORD = Width; HIWORD = Height
return 0;
}
return DefWindowProc(hWnd, uMsg, wParam, lParam);
}
int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nShowCmd)
{
MSG msg;
if(!CreateGLWindow(WINDOW_CAPTION, WINDOW_WIDTH, WINDOW_HEIGHT, WINDOW_BPP))
{
return 0;
}
while(!done) //Main loop
{
if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
{
if(msg.message == WM_QUIT)
done = true;
else
{
TranslateMessage(&msg); //Translate the message
DispatchMessage(&msg); //Dispatch the message
}
}
else
{
//Start the time handler
myTimeHandler.Start();
//Draw the GL Scene
if(active)
{
DrawGLScene(); //Draw the scene
SwapBuffers(hDC); //Swap buffer (double buffering)
}
//Regulate the fps
myTimeHandler.RegulateFps();
}
}
//Shutdown
KillGLWindow();
return(msg.wParam);
}
My guess is that your main loop runs without any delays when active is false. The thread infinitely spins through that loop and keeps one of your two processor cores busy (that's why you see 50% CPU load).
When active is true, the swap operation waits for the next vsync and delays your loop until the next screen refresh happens, resulting in a lower CPU load. (The time a thread spends blocked inside a Windows function waiting for an event does not count toward its CPU load.)
To solve that problem, you could switch to a GetMessage-based message loop for the periods when you do not want to render anything.
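For example, a hybrid loop might look like this. This is only a sketch; it reuses the done, active, and msg variables from your code and assumes your window procedure keeps active up to date:
while(!done)
{
    if(!active)
    {
        // GetMessage blocks until a message arrives, so the thread
        // consumes no CPU while the window is minimized
        if(GetMessage(&msg, NULL, 0, 0) <= 0)
            done = true;
        else
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
    else if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
    {
        if(msg.message == WM_QUIT)
            done = true;
        else
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
    else
    {
        DrawGLScene();
        SwapBuffers(hDC);
    }
}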
The less area your OpenGL window covers, the quicker the scene is drawn (the key term is fill rate), and thus the event loop iterates at a higher frequency. I see you have some function RegulateFps; to me this sounds like something that busy-loops until a certain amount of time per frame has been consumed, i.e. you're literally wasting CPU time. Why do you want to keep the framerate low in the first place? Get rid of that.
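If you do want a frame-rate cap, sleep instead of spinning. A minimal sketch of what RegulateFps could do; the 16 ms target and the GetTickCount bookkeeping are my assumptions, not code from the question:
void RegulateFps()
{
    static DWORD lastFrameTime = GetTickCount();
    const DWORD targetFrameMs = 16; // assumed target of roughly 60 fps
    DWORD elapsed = GetTickCount() - lastFrameTime;
    if(elapsed < targetFrameMs)
        Sleep(targetFrameMs - elapsed); // yields the CPU instead of busy-waiting
    lastFrameTime = GetTickCount();
}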
And of course, when you minimize the window you set active = false, so no GL work is done at all, but time is still wasted in the busy loop.
Try switching on V-sync in the driver options and use double buffering; then SwapBuffers will block until the vertical blank. And if active == false, don't PeekMessage, but GetMessage.
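You can also request vsync from code through the WGL_EXT_swap_control extension instead of relying on the driver control panel. A minimal sketch, assuming the extension is available (not every driver exposes it):
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void EnableVsync()
{
    // wglGetProcAddress only resolves the pointer while a GL context is current
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if(wglSwapIntervalEXT)
        wglSwapIntervalEXT(1); // 1 = wait for one vertical blank per swap
}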
OK, I see one flaw in your logic flow.
while(!done) //Main loop
{
if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
{
......
}
else
{
.... rendering.
}
}
When the window is minimized, PeekMessage will almost always fail (the queue is empty), so you fall through to the rendering branch.
That pins a single core at 100%, because the loop never sleeps or waits; it just draws again and again.
You may want to enforce a minimum interval between rendered frames.
My suggestion is:
// Set a timer so GetMessage wakes the process up periodically even when idle
// (the timer id 1 and the GetTickCount bookkeeping are illustrative)
SetTimer(hWnd, 1, MINIMUM_RENDER_INTERVAL, NULL);
while (::GetMessage(&msg, NULL, 0, 0))
{
    // handle the messages
    TranslateMessage(&msg);
    DispatchMessage(&msg);
    // check whether enough time has passed to render another frame
    DWORD currentTime = GetTickCount();
    if (currentTime - lastDrawTime > MINIMUM_RENDER_INTERVAL)
    {
        DrawGLScene(); // rendering
        SwapBuffers(hDC);
        lastDrawTime = currentTime;
    }
}
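The WM_TIMER messages posted by SetTimer keep GetMessage returning at a steady rate, so the loop still renders periodically without ever busy-waiting. Call KillTimer before the window is destroyed.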