I have read a lot of information about getting depth in a fragment shader, such as http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=234519
I'm writing a 2D game in OpenGL ES 1.0 (with casts to 1.1 extensions where applicable). I'm trying to keep this as generic as possible in case I've missed something obvious.
I'm on Android OpenGL ES 2.0 and, given all the limitations that come with it, I can't figure out how to map 2D screen touches to the 3D points I have. I can't get the right results.
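A common approach is to invert the combined view-projection matrix and map the touch through it at the near and far planes, giving a pick ray. A minimal sketch, assuming you already have the inverse matrix in OpenGL's column-major layout (the function and parameter names are my own):

```c
#include <math.h>

/* Multiply a column-major 4x4 matrix by a 4-vector. */
static void mul_mat4_vec4(const float m[16], const float v[4], float out[4]) {
    for (int r = 0; r < 4; ++r)
        out[r] = m[r] * v[0] + m[4 + r] * v[1] + m[8 + r] * v[2] + m[12 + r] * v[3];
}

/* Map a touch (pixels, origin top-left) at NDC depth z (-1 near, +1 far)
   through the inverse view-projection matrix back into world space. */
void unproject(const float invViewProj[16],
               float touchX, float touchY, float z,
               float viewW, float viewH, float world[3]) {
    float ndc[4] = {
        2.0f * touchX / viewW - 1.0f,
        1.0f - 2.0f * touchY / viewH,   /* flip Y: screen Y grows downward */
        z,
        1.0f
    };
    float tmp[4];
    mul_mat4_vec4(invViewProj, ndc, tmp);
    world[0] = tmp[0] / tmp[3];         /* perspective divide */
    world[1] = tmp[1] / tmp[3];
    world[2] = tmp[2] / tmp[3];
}
```

Calling this with z = -1 and z = +1 gives two world-space points; the ray between them is what you intersect against your 3D geometry.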
I'd like to retrieve the depth buffer from my camera view for a 3D filtering application. Currently I'm using glReadPixels to get the depth component, but instead of the [0,1] values, I need the true values.
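The [0,1] values from the depth buffer are nonlinear; for a standard perspective projection they can be mapped back to eye-space depth using the near and far plane distances. A sketch of that conversion (assuming the usual perspective projection matrix):

```c
/* Recover eye-space depth from the nonlinear [0,1] value that
   glReadPixels(..., GL_DEPTH_COMPONENT, ...) returns, given the
   near/far planes used for the projection. */
float linearize_depth(float d, float zNear, float zFar) {
    float zNdc = 2.0f * d - 1.0f;              /* [0,1] -> NDC [-1,1] */
    return (2.0f * zNear * zFar) /
           (zFar + zNear - zNdc * (zFar - zNear));
}
```

As a sanity check, d = 0 maps to zNear and d = 1 maps to zFar.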
I can't seem to read the depth buffer values in OpenGL ES 2 on iOS 4.3:
afDepthPixels = (float*)malloc(sizeof(float) * iScreenWidth * iScreenHeight);
I am doing ray-casting in a 3D texture until I hit a correct value. I am doing the ray-casting in a cube, and the cube corners are already in world coordinates, so I don't have to multiply the vertices.
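With the cube already in world coordinates, the ray can be clipped against it directly with the standard slab method; the entry and exit distances then bound the segment you march through the 3D texture. A sketch (names are my own):

```c
#include <math.h>
#include <stdbool.h>

/* Slab-method intersection of a ray with an axis-aligned box.
   On a hit, *tEnter/*tExit are the distances along dir where the
   ray enters and leaves the box. */
bool ray_box(const float orig[3], const float dir[3],
             const float boxMin[3], const float boxMax[3],
             float *tEnter, float *tExit) {
    float t0 = -INFINITY, t1 = INFINITY;
    for (int i = 0; i < 3; ++i) {
        float inv = 1.0f / dir[i];             /* IEEE inf handles dir[i] == 0 */
        float tNear = (boxMin[i] - orig[i]) * inv;
        float tFar  = (boxMax[i] - orig[i]) * inv;
        if (tNear > tFar) { float tmp = tNear; tNear = tFar; tFar = tmp; }
        if (tNear > t0) t0 = tNear;            /* latest entry */
        if (tFar  < t1) t1 = tFar;             /* earliest exit */
    }
    if (t0 > t1 || t1 < 0.0f) return false;    /* miss, or box behind ray */
    *tEnter = t0;
    *tExit  = t1;
    return true;
}
```

Sampling the 3D texture at fixed steps between tEnter and tExit gives the ray march.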
I saw many examples on the web (for example) which do the following: create and bind an FBO, then create and bind its buffers (texture, render, depth, stencil).
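The sequence those examples follow can be sketched as below, for OpenGL ES 2.0, with a color texture and a depth renderbuffer; `width`/`height` are placeholders, and this is a state-setup fragment rather than a complete program:

```c
/* One FBO with a color texture attachment and a depth renderbuffer. */
GLuint fbo, colorTex, depthRb;

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Color attachment: the texture you will later sample from. */
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

/* Depth attachment: a renderbuffer, since the depth is not sampled. */
glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
```

The completeness check at the end is worth keeping: a mismatched size or unsupported format fails silently otherwise.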
I'm rendering PNGs on simple squares in OpenGL ES 2.0, but when I try to draw something behind a square I have already drawn, the transparent area in my top square is rendered the
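This is the classic transparency-ordering problem: the depth test rejects fragments behind the transparent quad's already-written depth. The usual fix is to enable blending and draw transparent quads back to front. A sketch of the sorting step, with a hypothetical Sprite struct standing in for whatever your squares are:

```c
#include <stdlib.h>

/* Hypothetical sprite record: only the distance from the camera matters. */
typedef struct { float z; int id; } Sprite;

/* Back-to-front: larger distance first, so nearer transparent squares
   blend over what is already drawn. */
static int cmp_back_to_front(const void *a, const void *b) {
    float za = ((const Sprite *)a)->z;
    float zb = ((const Sprite *)b)->z;
    return (za < zb) - (za > zb);   /* sort descending by distance */
}

void sort_sprites(Sprite *sprites, size_t n) {
    qsort(sprites, n, sizeof(Sprite), cmp_back_to_front);
}
```

Then enable blending with glEnable(GL_BLEND) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), and draw in the sorted order.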
We are trying to simulate simple Kinect output. I have rendered a triangle mesh in Matlab and now I want to get at the depth buffer of the figure/axes where the shape has been rendered.
I'm rendering my scene to a texture. This works fine except that depth testing does not work. How do I enable depth testing when rendering to an offscreen texture? I'm using the FrameBuffer class http
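Depth testing in an offscreen pass only works if the framebuffer object actually has a depth attachment; enabling GL_DEPTH_TEST alone does nothing when there is no depth buffer to test against. A sketch of the missing piece in plain OpenGL ES 2.0 calls, assuming the FBO is already bound and `texWidth`/`texHeight` match the color texture (this is a state-setup fragment, not a complete program):

```c
/* Give the offscreen framebuffer its own depth buffer; without one,
   glEnable(GL_DEPTH_TEST) has nothing to test against. */
GLuint depthRb;
glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                      texWidth, texHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);
glEnable(GL_DEPTH_TEST);
```

Whichever FrameBuffer wrapper class is in use may expose this as a constructor flag instead; the underlying calls are the same.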