
Render to texture array with ATI hardware

I have an OpenGL implementation that should, in principle, render to a texture array by selecting the target layer in the geometry shader (see the sketch after the list below). The problem is that this does not work due to an ATI driver bug. I would really like to get this working, and I have come up with a couple of alternatives for how to proceed:

  1. Remake the implementation in Direct3D; are ATI drivers better at D3D?
  2. Come up with a workaround (though I can't think of any).
  3. Buy an nVidia card.

What should I do? Any other alternatives?
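
For reference, here is a minimal sketch of the layered-rendering setup described above (not from the post; numLayers is a placeholder): attach the whole texture array to an FBO with glFramebufferTexture, and let the geometry shader route each primitive by writing gl_Layer.

// Sketch: layered rendering to a texture array (GL 3.2+).
GLuint arrayTex, fbo;
glGenTextures(1, &arrayTex);
glBindTexture(GL_TEXTURE_2D_ARRAY, arrayTex);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8, 512, 512, numLayers,
             0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// Attaching the whole array (no layer index) makes the FBO "layered";
// the geometry shader then selects the slice with "gl_Layer = ...;".
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, arrayTex, 0);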


This is an outdated approach, but it should work on ATI cards. The example renders the depth map to a texture, but you can render pretty much anything this way.

// generate the depth texture (in the init method)
GLuint depthTexture;
glGenTextures(1, &depthTexture);
glBindTexture(GL_TEXTURE_2D, depthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// allocate a 512x512 depth texture with no initial data
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 512, 512, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, 0);

// draw the mesh to fill the depth buffer
glDisable(GL_LIGHTING);
draw_mesh();
// copy depth buffer to texture
glBindTexture(GL_TEXTURE_2D, depthTexture);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 512, 512);
// clear screen
cls();
// enable texturing and bind the depth texture
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, depthTexture);
// draw a full-screen quad textured with the depth map
drawFullsizeQuad();
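
If the target really needs to be a texture array, the same copy trick can be extended to one: render each layer in its own pass and copy the framebuffer into the matching slice, sidestepping the geometry shader entirely. A sketch, not from the answer above; arrayTexture, numLayers and render_layer() are hypothetical:

// Fill each slice of a texture array with a separate render pass.
glBindTexture(GL_TEXTURE_2D_ARRAY, arrayTexture);
for (int layer = 0; layer < numLayers; ++layer) {
    render_layer(layer);
    // zoffset selects the destination slice in the array
    glCopyTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, layer, 0, 0, 512, 512);
}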


Consider filing a bug with ATI: your test case (layered rendering) should be small and 100% reproducible.
