Problems trying to apply shader to vertex array in OpenGL using C++
I have 4-dimensional vertices (X, Y, A, B) that I'd like to draw as 6 separate 2D plots (XxY, XxA, XxB, YxA, ...).
My vertices are defined as follows:
GLint data[MAX_N_POINT][4];
I can draw the first (X,Y) of the 2D plots just fine with:
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(4, GL_INT, 4*sizeof(GLint), &data[0][0]);
glDrawArrays(GL_POINTS, 0, MAX_N_POINT);
glDisableClientState(GL_VERTEX_ARRAY);
To draw the other 2D plots (XxA, AxB, etc.) I've decided to use a shader that swizzles the X, Y, Z, W dimensions of the vertices depending on which dimensions I want to draw:
uniform int axes;
void main()
{
vec2 point;
if (axes == 0) {
point = gl_Vertex.xy;
} else if (axes == 1) {
point = gl_Vertex.xz;
} else if (axes == 2) {
point = gl_Vertex.xw;
} else if (axes == 3) {
point = gl_Vertex.yz;
} else if (axes == 4) {
point = gl_Vertex.yw;
} else if (axes == 5) {
point = gl_Vertex.zw;
}
gl_Position = gl_ModelViewProjectionMatrix * vec4(point.xy, 0.0, 1.0);
gl_FrontColor = gl_Color;
gl_BackColor = gl_Color;
}
I've successfully loaded and compiled the shader, attached it to a program, and linked the program.
Now that I have the shader program, it's not clear how to use it so that it affects the way my vertex arrays are drawn. I've tried the following, but it appears to have no effect: it draws things exactly as they were drawn without the shader:
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(4, GL_INT, 4*sizeof(GLint), &data1[0][0]);
//New shader code here
glUseProgram(shaderProg);
int plotAxes = rand()%6; // pick one of the 6 axis pairs the shader handles
GLint axes = glGetUniformLocation(shaderProg, "axes");
glUniform1i(axes, plotAxes);
glDrawArrays(GL_POINTS, 0, MAX_N_POINT);
glDisableClientState(GL_VERTEX_ARRAY);
glUseProgram(0);
Is there something fundamental that I am missing or don't understand properly?
Edit 2: I've updated the shader code per Christian's suggestions. I've also verified that the shader loads without errors; however, if I check for OpenGL errors after I call glUseProgram, I get an OpenGL "invalid operation" error.
Edit 3: Here is the final shader that worked:
uniform int axes;
void main()
{
vec4 point;
if (axes == 0) {
point = gl_Vertex;
} else if (axes == 1) {
point = gl_Vertex.xzyw;
} else if (axes == 2) {
point = gl_Vertex.xwzy;
} else if (axes == 3) {
point = gl_Vertex.yzxw;
} else if (axes == 4) {
point = gl_Vertex.ywxz;
} else if (axes == 5) {
point = gl_Vertex.zwxy;
}
point.z = 0.0; // flatten the chosen pair onto the z = 0 plane
point.w = 1.0; // reset w for the projection
gl_Position = gl_ModelViewProjectionMatrix * point;
gl_FrontColor = gl_Color;
gl_BackColor = gl_Color;
}
You have to set the axes uniform (glUniform1i) after you enable/use the program (glUseProgram), otherwise it doesn't have any effect (and your axes uniform keeps its default value, which seems to be 0). But I hope your program also contains a valid fragment shader that uses your passcolor varying, as the fixed-function fragment processor doesn't know what to do with this varying.
And by the way, although it won't cost you that much if all vertices take the same path anyway, using 6 different shaders would be a better idea performance-wise. You can still use preprocessor macros to reduce the writing overhead.
EDIT: From your update it seems you don't use a special fragment shader, so you just use the builtin fragment pipeline together with your vertex shader. While this is possible, you still have to make both stages match up. Your passcolor varying is completely useless, as the fixed-function pipeline doesn't know what to do with it. Just use the builtin color varyings and replace
passcolor = gl_Color;
with
gl_FrontColor = gl_Color;
gl_BackColor = gl_Color; //just to be sure
You should also check whether your shaders compile and link successfully, and inspect the info log in case they don't, as I suppose your program won't link because of the above varying inconsistencies. So you end up not using the shader (and with a bunch of GL_INVALID_OPERATION errors) without noticing it.