VBO Differences between OS X and Win7?

Good afternoon,

The problem is that code written on Win7 (C#, VS2010) that displays an OpenGL tile grid renders differently on Mac OS X (C#, MonoDevelop). Each tile is currently rendered separately, and the x/y offsets are baked into the vertex data. A tile is built like so:

private void BuildTile()
{
    Vector3[] vertices = new Vector3[4];
    Vector2[] uvs = new Vector2[] { new Vector2(0, 1), new Vector2(0, 0), new Vector2(1, 1), new Vector2(1, 0) };
    int[] indices = new int[] { 1, 0, 2, 1, 2, 3 };

    // build vertex list based on position
    vertices[0] = new Vector3(Location.X, Location.Y + 1, 0);
    vertices[1] = new Vector3(Location.X, Location.Y, 0);
    vertices[2] = new Vector3(Location.X + 1, Location.Y + 1, 0);
    vertices[3] = new Vector3(Location.X + 1, Location.Y, 0);

    VBO<Vector3> vertex = new VBO<Vector3>(vertices, BufferTarget.ArrayBuffer, BufferUsageHint.StaticRead);
    VBO<Vector2> uv = new VBO<Vector2>(uvs, BufferTarget.ArrayBuffer, BufferUsageHint.StaticRead);
    VBO<int> element = new VBO<int>(indices, BufferTarget.ElementArrayBuffer, BufferUsageHint.StaticRead);

    VAO = new VAO(ShaderProgram, vertex, uv, element);
}
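
For a concrete picture of what BuildTile produces: each tile is a unit quad whose world position is baked into the vertices, so a tile at (3, 5) spans x:[3,4], y:[5,6]. A minimal standalone sketch of the same vertex and index layout (plain tuples instead of the Vector3 type, no GL calls):

```csharp
using System;

// Same unit quad as BuildTile: the tile offset is added on the CPU,
// so every tile's VBO contains its own world-space coordinates.
static (float X, float Y, float Z)[] BuildTileVertices(int x, int y) => new[]
{
    ((float)x,       (float)(y + 1), 0f), // 0: top-left
    ((float)x,       (float)y,       0f), // 1: bottom-left
    ((float)(x + 1), (float)(y + 1), 0f), // 2: top-right
    ((float)(x + 1), (float)y,       0f), // 3: bottom-right
};

// Index order matching BuildTile: triangles (1,0,2) and (1,2,3).
int[] indices = { 1, 0, 2, 1, 2, 3 };

var v = BuildTileVertices(3, 5);
Console.WriteLine($"tile (3,5) spans x:[{v[0].X},{v[2].X}] y:[{v[1].Y},{v[0].Y}]");
// prints: tile (3,5) spans x:[3,4] y:[5,6]
```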

Since Mac OS X does not support OpenGL 3, the VAO object binds the attributes every time the draw call occurs.

    public void BindAttributes()
    {
        if (vertex == null) throw new Exception("Error binding attributes.  No vertices were supplied.");
        if (element == null) throw new Exception("Error binding attributes.  No element array was supplied.");
        uint array = 0;

        Gl.EnableVertexAttribArray(array);
        Gl.BindAttribLocation(Program.ProgramID, array, "in_position");
        Gl.BindBuffer(vertex.BufferTarget, vertex.vboID);
        Gl.VertexAttribPointer(array++, vertex.Size, vertex.PointerType, true, 12, new IntPtr(0));

        Gl.EnableVertexAttribArray(array);
        Gl.BindAttribLocation(Program.ProgramID, array, "in_uv");
        Gl.BindBuffer(uv.BufferTarget, uv.vboID);
        Gl.VertexAttribPointer(array++, uv.Size, uv.PointerType, true, 8, new IntPtr(0));

        Gl.BindBuffer(BufferTarget.ElementArrayBuffer, element.vboID);
    }

The shader is pretty straightforward. Vertex shader:

uniform mat4 projection_matrix;
uniform mat4 modelview_matrix;

attribute vec3 in_position;
attribute vec2 in_uv;

varying vec2 uv;

void main(void)
{
  uv = in_uv;

  gl_Position = projection_matrix * modelview_matrix * vec4(in_position, 1);
}

Fragment shader:

uniform sampler2D active_texture;

varying vec2 uv;

void main(void)
{
  gl_FragColor = texture2D(active_texture, uv);
}

The issue is that the Mac OS X version is overlaying all of the tiles. All 100 tiles will be stacked on top of each other in the position of the first tile. However, on Win7 the tiles will be distributed in a 10x10 grid as expected. Does anyone have any idea why this might be happening? I looked at the vertex data and it is storing the offsets on both Mac OS X and Win7, and the VBO IDs are unique, and the right VBOs are being bound. I'm guessing there must be an issue with my method for binding the attributes, but I cannot see a problem. Is there some funny difference between how OpenGL 2 and 3 handle vertex attributes?

Thanks, and let me know if you need any more of my code to help track down the issue.

Note: I can store the vertex offset in the shader (as a uniform) and update it per tile. This works! So I'm led to believe that only the first VBO is ever bound, and that it is simply being rendered 100 times.
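
The uniform workaround and the baked-in offsets are arithmetically identical, which is what points the finger at attribute binding rather than the vertex data; a trivial standalone check of that equivalence (made-up names, no GL calls):

```csharp
using System;

// Local-space x coordinates of the unit quad (vertices 0..3, as in BuildTile).
float[] localX = { 0f, 0f, 1f, 1f };

// Baked approach: offset added on the CPU when the VBO is built.
static float[] Baked(float[] local, float offset)
{
    var result = new float[local.Length];
    for (int i = 0; i < local.Length; i++) result[i] = local[i] + offset;
    return result;
}

// Uniform approach: the same addition, done per vertex in the shader
// (e.g. gl_Position = ... vec4(in_position + offset, 1)).
static float ShaderSide(float local, float uniformOffset) => local + uniformOffset;

// For tile column 7, both paths yield identical world-space positions.
var baked = Baked(localX, 7f);
for (int i = 0; i < localX.Length; i++)
    Console.WriteLine(baked[i] == ShaderSide(localX[i], 7f)); // prints True four times
```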


It turns out that binding an attribute location only takes effect when the program is linked, so calling glBindAttribLocation after the program has already been linked does nothing. For whatever reason, my Win7 PC (with Radeon drivers) decided to honor it anyway. Here's how you should actually bind your attributes: query the locations the linker assigned with glGetAttribLocation instead.

public void BindAttributes()
{
    if (vertex == null) throw new Exception("Error binding attributes.  No vertices were supplied.");
    if (element == null) throw new Exception("Error binding attributes.  No element array was supplied.");

    uint loc = (uint)Gl.GetAttribLocation(Program.ProgramID, "in_position");
    Gl.EnableVertexAttribArray(loc);
    Gl.BindBuffer(vertex.BufferTarget, vertex.vboID);
    Gl.VertexAttribPointer(loc, vertex.Size, vertex.PointerType, true, 12, new IntPtr(0));


    loc = (uint)Gl.GetAttribLocation(Program.ProgramID, "in_uv");
    Gl.EnableVertexAttribArray(loc);
    Gl.BindBuffer(uv.BufferTarget, uv.vboID);
    Gl.VertexAttribPointer(loc, uv.Size, uv.PointerType, true, 8, new IntPtr(0));

    Gl.BindBuffer(BufferTarget.ElementArrayBuffer, element.vboID);
}