
OpenGL textures and endianness

I'm using glTexSubImage2D with GL_LUMINANCE and GL_UNSIGNED_BYTE to display raw greyscale data from a camera directly, rather than having to repack it into an RGB bitmap manually.

I would like to run the camera in a higher-resolution mode, with 12 or 14 bits/pixel.

I can do this simply by setting GL_SHORT, but the camera returns data in big-endian order and my OpenGL implementation seems to be drawing it the wrong way around (on x86).

Is there a simple way of telling OpenGL that the textures are the 'wrong' way round? I would like to avoid manually byte-swapping the data just for display, because all the other functions expect big-endian data.


Check out the glPixelStore* group of functions.

You might need to play with GL_UNPACK_SWAP_BYTES or GL_UNPACK_LSB_FIRST, but double-check that you're using the correct GL_UNPACK_ALIGNMENT. By default the unpack alignment is 4, but if you're using one byte per pixel (GL_LUMINANCE / GL_UNSIGNED_BYTE) you'll want to set it to 1. I ran into this problem just recently, and it took me longer to figure out than I'd care to admit :)
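For the 16-bit case in the question, the unpack state might look like this. A sketch only, assuming a current GL context, a bound GL_TEXTURE_2D already allocated at the right size, and placeholder `width`, `height`, and `pixels` (big-endian 16-bit greyscale samples from the camera):

```c
/* Let GL swap the bytes of each 16-bit element during unpacking,
   so the CPU-side buffer can stay big-endian for the other code. */
glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_TRUE);
/* Rows of 16-bit samples start on 2-byte boundaries. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 2);

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);
```

Note that GL_UNPACK_SWAP_BYTES operates per component element, so for GL_UNSIGNED_SHORT it performs exactly the 16-bit swap wanted here; GL_UNPACK_LSB_FIRST, by contrast, only affects 1-bit bitmap data and won't help with this problem.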
