What format for 'greyscale' render targets in DirectX?
I have a DirectX 9 game engine that creates its normal adapter with the format D3DFMT_X8R8G8B8. I have a system where I render some objects to an offscreen render target, as lightmaps. I then composite that lightmap data back to the back buffer, where it acts as a full-screen 'mask' and gives me the effect of torches or other light sources on a dark scene. Everything works just great.

The problem is, I'm aware that my big offscreen lightmap render targets are 16MB each at a large resolution, and I only really need 8 bits of data (greyscale) from them, so 75% of the 32-bit render target memory is a waste. (I'm targeting low-spec cards.)

I tried creating the render targets as D3DFMT_A8, but DirectX silently fails on that and creates 32-bit targets anyway (if I call CheckDeviceFormat() I can see it happening). I use the D3DXCreateTexture function.

My question is: what format is best for creating these offscreen buffers?
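For reference, a simplified sketch of the creation call (the 2048×2048 size is just an example consistent with my 16MB figure at 32 bits per pixel; in practice I spotted the substitution via CheckDeviceFormat):

```cpp
LPDIRECT3DTEXTURE9 lightmap = NULL;
D3DXCreateTexture(device, 2048, 2048, 1,
                  D3DUSAGE_RENDERTARGET, D3DFMT_A8,   // asking for 8-bit
                  D3DPOOL_DEFAULT, &lightmap);

// D3DXCreateTexture quietly substitutes a supported format, so the
// texture I actually get back is still 32-bit:
D3DSURFACE_DESC desc;
lightmap->GetLevelDesc(0, &desc);
// desc.Format is not D3DFMT_A8 here
```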
Thank you for your help, I'm not good at render target related stuff :)
D3DFMT_L8 is 8-bit luminance. I believe it's supported on the GeForce 3 (i.e. the first consumer card with shader 1.1!), so it should be available everywhere. I think the colour is read as (L, L, L, 1), i.e. rgb = luminance value, alpha = 1.
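If it helps, a minimal sketch of checking L8 support up front and falling back (assuming the default HAL adapter, your X8R8G8B8 adapter format, and that `d3d`, `device`, `width`, and `height` come from your existing setup):

```cpp
// Ask whether D3DFMT_L8 works as a render-target texture on this adapter,
// and fall back to the current 32-bit format if it doesn't.
D3DFORMAT rtFormat = D3DFMT_L8;
if (FAILED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                  D3DFMT_X8R8G8B8,        // adapter format
                                  D3DUSAGE_RENDERTARGET,
                                  D3DRTYPE_TEXTURE,
                                  D3DFMT_L8)))
{
    rtFormat = D3DFMT_X8R8G8B8;   // what you're using today
}

LPDIRECT3DTEXTURE9 lightmap = NULL;
D3DXCreateTexture(device, width, height, 1,
                  D3DUSAGE_RENDERTARGET, rtFormat,
                  D3DPOOL_DEFAULT, &lightmap);
```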
Edit: this tool is useful for finding caps:
http://zp.lo3.wroc.pl/cdragan/wizard.php
On topic: if you are targeting lower-spec cards, you are very likely to be running on systems where 8-bit single-channel render targets are not supported at all.
If you are using shaders to do the rendering and compositing, it should be possible to pack your information by using the rgba channels for four alternating pixels of your lightmap. Perhaps you can tell us a little bit more about your current rendering setup?
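For what it's worth, one concrete mechanism such packing could build on is the colour write mask: render in four passes and let each pass write only one 8-bit channel of a shared 32-bit target. This is only a rough sketch under that assumption; `packedTexture` and `RenderLightmapPass` are placeholders, the pass layout is up to your engine, and the compositing shader would then need to select the right channel when reading back:

```cpp
// Hypothetical channel packing: steer each of four passes into one channel
// of a single 32-bit target via the colour write mask.
// Note: D3DRS_COLORWRITEENABLE itself requires D3DPMISCCAPS_COLORWRITEENABLE.
const DWORD channels[4] = {
    D3DCOLORWRITEENABLE_RED,
    D3DCOLORWRITEENABLE_GREEN,
    D3DCOLORWRITEENABLE_BLUE,
    D3DCOLORWRITEENABLE_ALPHA,
};

LPDIRECT3DSURFACE9 packedSurface = NULL;
packedTexture->GetSurfaceLevel(0, &packedSurface);
device->SetRenderTarget(0, packedSurface);

for (int i = 0; i < 4; ++i)
{
    device->SetRenderState(D3DRS_COLORWRITEENABLE, channels[i]);
    RenderLightmapPass(i);   // placeholder: draws slice i of the lightmap
}

// Restore the default mask (all channels writable).
device->SetRenderState(D3DRS_COLORWRITEENABLE,
    D3DCOLORWRITEENABLE_RED | D3DCOLORWRITEENABLE_GREEN |
    D3DCOLORWRITEENABLE_BLUE | D3DCOLORWRITEENABLE_ALPHA);
packedSurface->Release();
```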
Off topic: AWESOME to have you here on Stack Overflow, big fan of your work!