OpenGL "out of memory" on glReadPixels()
I am running into an "out of memory" error from OpenGL on glReadPixels() under low-memory conditions. I am writing a plug-in for a program that has a robust heap mechanism for such situations, but I have no idea whether or how OpenGL could be made to use it for application memory management. The notion that this is even possible came to my attention through this [albeit dated] thread about a similar issue under Mac OS [not X]: http://lists.apple.com/archives/Mac-opengl/2001/Sep/msg00042.html
I am using Windows XP and have seen the error on multiple NVIDIA cards. I am also interested in any workarounds I might be able to relay to users (the thread above mentions "increasing virtual memory").
Thanks, Sean
I'm quite sure that the out-of-memory error is not raised by glReadPixels itself (in fact, glReadPixels does not allocate memory on its own).
The error is probably raised by other routines that allocate buffer objects or textures. Once you detect the out-of-memory error, you should release all non-essential objects (textures, texture mipmaps, rarely used buffer objects) in order to free memory for the buffer that will hold the data returned by glReadPixels.
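A minimal sketch of that recovery path, assuming the plug-in tracks its own cached GL objects (g_cachedTextures and ReadPixelsWithRetry are illustrative names, not part of any real API):

    // A minimal sketch (not the poster's code): after glReadPixels, check
    // glGetError() for GL_OUT_OF_MEMORY, release non-essential GL objects
    // the plug-in owns, and retry once. g_cachedTextures is a hypothetical
    // app-side cache; a real plug-in would track its own object IDs.
    #include <windows.h>
    #include <GL/gl.h>
    #include <vector>

    static std::vector<GLuint> g_cachedTextures; // hypothetical texture cache

    static void ReleaseNonEssentialResources()
    {
        if (!g_cachedTextures.empty()) {
            glDeleteTextures((GLsizei)g_cachedTextures.size(),
                             &g_cachedTextures[0]);
            g_cachedTextures.clear();
        }
        // Rarely used buffer objects would be freed the same way with
        // glDeleteBuffers (a GL 1.5 entry point loaded via wglGetProcAddress).
        glFinish(); // let the driver finish pending work and reclaim memory
    }

    bool ReadPixelsWithRetry(GLint x, GLint y, GLsizei w, GLsizei h, void* dst)
    {
        while (glGetError() != GL_NO_ERROR) {} // clear any stale error flags

        glReadPixels(x, y, w, h, GL_RGBA, GL_UNSIGNED_BYTE, dst);
        if (glGetError() != GL_OUT_OF_MEMORY)
            return true; // succeeded, or failed for an unrelated reason

        ReleaseNonEssentialResources();

        glReadPixels(x, y, w, h, GL_RGBA, GL_UNSIGNED_BYTE, dst);
        return glGetError() == GL_NO_ERROR;
    }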
Without more specifics, it's hard to say. Ultimately, OpenGL talks to the native OS when it needs to allocate memory. So if nothing else, you can always replace (or hook) the default CRT/heap allocator for your process and have it fetch blocks from the "more robust" heap manager in the plug-in host.
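For the hooking approach, here is a minimal sketch assuming the host exposes allocation entry points (HostAlloc and HostFree are hypothetical names, not a real API). Replacing the global operator new/delete routes the plug-in's C++ allocations through the host's heap; note that this only covers allocations going through these operators, so malloc/free would need separate CRT hooking, and the GL driver's internal allocations (HeapAlloc, VirtualAlloc, video memory) are not affected.

    // A minimal sketch, assuming the plug-in host exposes allocation hooks.
    // HostAlloc/HostFree are hypothetical names, not a real API.
    #include <cstddef>
    #include <new>

    extern "C" void* HostAlloc(std::size_t size); // hypothetical host API
    extern "C" void  HostFree(void* ptr);         // hypothetical host API

    void* operator new(std::size_t size)
    {
        void* p = HostAlloc(size);
        if (!p)
            throw std::bad_alloc(); // skipping the new-handler loop for brevity
        return p;
    }

    void operator delete(void* ptr) throw()
    {
        if (ptr)
            HostFree(ptr);
    }

    // The array forms, operator new[] and operator delete[], would be
    // replaced the same way.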