How to apply a normal map in OpenGL?
I'm learning to use normal maps (per pixel lighting?) in 2D graphics with OpenGL.
New to normal mapping, I managed to wrap my head around the Sobel operator and the generation of normal maps (mostly thanks to this), that is, creating a (2D) array of normals from a (2D) array of pixel data.
(Most of the tutorials and forum threads that I have found were specific to 3D uses and modelling software. I aim to implement this functionality myself, in C++.)
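For context, the height-map-to-normal-map step described above can be sketched roughly like this (an illustrative version, not my exact code; `heightToNormals` and the `strength` parameter are made-up names, and heights are assumed to be floats in [0,1]):

```cpp
#include <cmath>
#include <vector>

struct Normal { float x, y, z; };

// Build a per-texel normal from Sobel X/Y gradient estimates of a
// w x ht height map. Edge texels are clamped. "strength" scales how
// pronounced the bumps look (a made-up tuning knob).
std::vector<Normal> heightToNormals(const std::vector<float>& h, int w, int ht,
                                    float strength = 2.0f) {
    auto at = [&](int x, int y) {
        x = x < 0 ? 0 : (x >= w  ? w  - 1 : x);
        y = y < 0 ? 0 : (y >= ht ? ht - 1 : y);
        return h[y * w + x];
    };
    std::vector<Normal> out(w * ht);
    for (int y = 0; y < ht; ++y)
        for (int x = 0; x < w; ++x) {
            // Sobel responses: horizontal and vertical height gradients.
            float gx = at(x+1,y-1) + 2*at(x+1,y) + at(x+1,y+1)
                     - at(x-1,y-1) - 2*at(x-1,y) - at(x-1,y+1);
            float gy = at(x-1,y+1) + 2*at(x,y+1) + at(x+1,y+1)
                     - at(x-1,y-1) - 2*at(x,y-1) - at(x+1,y-1);
            // The normal points against the gradient, with a z component
            // controlling bump strength; normalize to unit length.
            float nx = -gx, ny = -gy, nz = 1.0f / strength;
            float len = std::sqrt(nx*nx + ny*ny + nz*nz);
            out[y * w + x] = { nx/len, ny/len, nz/len };
        }
    return out;
}
```

A flat height map produces (0, 0, 1) normals everywhere, which is a quick sanity check.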
- What do I do once I've got the normal map?
- Do I need to register it with OpenGL?
- Does it need to be associated with the texture, if yes, how is it done?
- How is it mapped to a 2D textured quad?
- (Is this something that I can do without shaders / GLSL?)
I recommend you look at:
This NVIDIA presentation on bump mapping.
I haven't looked at this for a while, but I remember it covering most of the details of implementing a bump-map shader; it should get a few ideas running.
This other NVIDIA tutorial implements bump mapping in the Cg shader language.
This bump mapping tutorial might also be helpful.
I know all these are not for full normal mapping but they're a good start.
Also, while there are differences between shader languages, it shouldn't be too hard to convert the formulas between them if you want to use GLSL.
As ybungalobill said, you can do it without shaders, but unless you are working on an educational project (for your own education) or a particular embedded device, I can't see why you would want to. If you do need to, this is where you want to look: it was written before shaders and later updated to reference them.
- What do I do once I've got the normal map?
- Do I need to register it with OpenGL?
Yes, you need to load it as a texture.
- Does it need to be associated with the texture, if yes, how is it done?
If you mean associated with the color texture, then no. You need to create a separate texture that holds the normal map in order to use it later with OpenGL.
- How is it mapped to a 2D textured quad?
Your normal map is just another texture; you bind it and map it like any other texture.
A normal map stores its normals in tangent-space coordinates, so to calculate the lighting per pixel you need the position of the light source relative to the tangent-space coordinate system. This is done by setting additional per-vertex attributes (normal, tangent, binormal), calculating the light-source position in tangent-space coordinates, and interpolating this position across the triangles. In the fragment shader you look up the normal in the normal map and perform the desired lighting calculation based on the interpolated parameters.
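The tangent-space transform mentioned above amounts to projecting a direction onto the TBN basis. A minimal sketch, assuming an orthonormal tangent/bitangent/normal per vertex (`Vec3` and `toTangentSpace` are illustrative names; a real project would typically use GLM):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Transform a direction into tangent space using the per-vertex TBN
// basis. Because TBN is orthonormal, the inverse of the TBN matrix is
// its transpose, so transforming is just projecting onto each axis.
Vec3 toTangentSpace(Vec3 v, Vec3 tangent, Vec3 bitangent, Vec3 normal) {
    return { dot(v, tangent), dot(v, bitangent), dot(v, normal) };
}
```

For an axis-aligned quad with tangent (1,0,0), bitangent (0,1,0), and normal (0,0,1), the transform is the identity, which makes a convenient sanity check.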
- (Is this something that I can do without shaders / GLSL?)
Yes, you can use some legacy extensions (e.g. GL_ARB_texture_env_dot3 with texture-environment combiners) to program the multi-texture environment combination functions. I've never done it myself, but it looks like hell.
The topic is quite old, but I'll answer it for any beginner who might look for help here:
1) & 2)
Your normal map is like a regular diffuse texture: you need to load it from disk, create your 2D texture as you would for a diffuse texture, and bind it along with the other textures to send it to the shaders. To sum it up, normal maps, height maps, etc. are no different when you deal with them CPU-side: it's always loading them, creating the texture, and binding them.
3)
Normal maps are just a way to store the normal of every fragment in a texture. It's usually best to use them with the corresponding diffuse texture, because that's how you fake the small details. Nevertheless, it's not always necessary: for example, the last time I was working on rendering water, I used a du/dv map and a normal map but no corresponding diffuse map, since I rendered into textures to achieve the reflection.
4)
Normal maps are mapped with the same texture coordinates as the diffuse texture. It means that if you have a quad whose coordinates are (0,0), (0,1), (1,0), (1,1), these are the coordinates you use to map both textures.
5)
You could do it without shaders, I guess... you can even do it without OpenGL if you want. But shaders are designed for this kind of operation, and it will be much easier to do it with them.
So, normal maps are textures that encode normal vectors as colors: a normal vector is a 3D vector, and a color is a 3D vector too. When you use lighting, you usually send your normals from the vertex shader to the fragment shader; the fragment shader interpolates the vertex normals to approximate each pixel's own normal vector. This is done by default.
With normal maps, you don't use the interpolated vertex normal, but the normal vector encoded in the texture.
First, you'll have to convert the lighting calculation from world space into tangent space; that's done in the vertex shader.
Then you sample your normal map in the fragment shader like a usual texture. You will get the RGB color of your pixel; you have to convert that value into a normal vector, i.e. multiply it by 2 and subtract 1 (rgb * 2 - 1), then use it as your normal vector in the lighting calculation.
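The decode-and-light step above can be sketched in plain C++ (the shader version is analogous; `decodeNormal` and `lambert` are illustrative names, and the light direction is assumed to already be in tangent space):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Map a normal-map texel with components in [0,1] to a vector in
// [-1,1]: exactly the rgb * 2 - 1 step described above.
Vec3 decodeNormal(float r, float g, float b) {
    return { r * 2.0f - 1.0f, g * 2.0f - 1.0f, b * 2.0f - 1.0f };
}

// Lambert diffuse factor: max(dot(N, L), 0), same as
// max(dot(N, L), 0.0) in GLSL. Both vectors must be in the same space.
float lambert(Vec3 n, Vec3 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}
```

The "flat" texel (128, 128, 255), i.e. (0.5, 0.5, 1.0) as floats, decodes to the unperturbed normal (0, 0, 1), which is why untouched areas of a normal map look light blue.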