Problems using VBOs to render vertices - OpenGL

I am converting my vertex array functions over to VBOs to increase the speed of my application.

Here was my original working vertex array rendering function:

void BSP::render()
{
    glFrontFace(GL_CCW);

    // Set up rendering states
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), &vertices[0].x);

    glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), &vertices[0].u);

    // Draw
    glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_SHORT, indices);

    // End of rendering - disable states
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

Worked great!

Now I am moving them into VBOs, and my program actually caused my graphics card to stop responding. The setup of my vertices and indices is exactly the same.

New setup:

vboId is setup in the bsp.h like so: GLuint vboId[2];

I get no error when I just run the createVBO() function!

void BSP::createVBO()
{

    // Generate buffers
    glGenBuffers(2, vboId);

    // Bind the first buffer (vertices)
    glBindBuffer(GL_ARRAY_BUFFER, vboId[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

    // Now save indices data in buffer
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboId[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

}

And here is the rendering code for the VBOs. I am pretty sure the problem is in here. I just want to render what's in the VBO like I did with the vertex array.

Render:

void BSP::renderVBO()
{
    glBindBuffer(GL_ARRAY_BUFFER, vboId[0]);         // for vertex coordinates
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboId[1]); // for indices

    // do same as vertex array except pointer
    glEnableClientState(GL_VERTEX_ARRAY);             // activate vertex coords array
    glVertexPointer(3, GL_FLOAT, 0, 0);               // last param is offset, not ptr

    // draw the bsp area
    glDrawElements(GL_TRIANGLES, numVertices, GL_UNSIGNED_BYTE, BUFFER_OFFSET(0));

    glDisableClientState(GL_VERTEX_ARRAY);            // deactivate vertex array

    // bind with 0, so, switch back to normal pointer operation
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
}

Not sure what the error is, but I am pretty sure I have my rendering function wrong. I wish there were a more unified tutorial on this, as there are a bunch online but they often contradict each other.


In addition to what Miro said (the GL_UNSIGNED_BYTE should be GL_UNSIGNED_SHORT), I don't think you want to use numVertices but numIndices, like in your non-VBO call.

glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_SHORT, 0);

Otherwise your code looks quite valid and if this doesn't fix your problem, maybe the error is somewhere else.

And by the way, the BUFFER_OFFSET(i) thing is usually just a define for ((char*)0+(i)), so you can also just pass in the byte offset directly, especially when it's 0.
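For reference, a common way to define it yourself (it is not part of OpenGL itself, just a user-side convenience macro):

// Converts an integer byte offset into the pointer-typed argument
// that glVertexPointer/glDrawElements expect when a VBO is bound
#define BUFFER_OFFSET(i) ((char*)0 + (i))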

EDIT: Just spotted another one. If you use the exact same data structures as in the non-VBO version (which I assumed above), then you of course need to use sizeof(Vertex) as the stride parameter in glVertexPointer.
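Putting those fixes together, a corrected renderVBO could look something like this. This is just a sketch, assuming the Vertex struct holds float x, y, z followed by float u, v as in the non-VBO version, and that <cstddef> is included for offsetof:

void BSP::renderVBO()
{
    glBindBuffer(GL_ARRAY_BUFFER, vboId[0]);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboId[1]);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    // Same stride as the non-VBO version; the last argument is now a byte
    // offset into the bound buffer instead of a client-memory pointer
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (const GLvoid*)0);
    glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), (const GLvoid*)offsetof(Vertex, u));

    // Count is the number of indices, and the type matches the index array
    glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_SHORT, 0);

    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);

    // Bind 0 to switch back to normal pointer operation
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
}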


If you are passing the same data to glDrawElements when you aren't using VBOs and the same data to the VBO buffer, then the parameters differ slightly: without VBOs you used GL_UNSIGNED_SHORT, and with VBOs you used GL_UNSIGNED_BYTE. So I think the VBO call should look like this:

glDrawElements(GL_TRIANGLES, numVertices, GL_UNSIGNED_SHORT, 0);

Also look at this tutorial, where VBO buffers are explained very well.


How do you declare vertices and indices?

The size parameter to glBufferData should be the size of the data in bytes, and if you pass sizeof(vertices), you get the total size of the declared array (not just the portion that actually holds your data).

Try something like sizeof(Vertex)*numVertices and sizeof(indices[0])*numIndices instead.
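For example, assuming vertices is a Vertex* and indices is an unsigned short* (which matches the GL_UNSIGNED_SHORT draw call), the upload could look like this:

void BSP::createVBO()
{
    glGenBuffers(2, vboId);

    // Upload exactly the vertex data in use, computed from the element
    // count rather than sizeof() on the array/pointer itself
    glBindBuffer(GL_ARRAY_BUFFER, vboId[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * numVertices, vertices, GL_STATIC_DRAW);

    // Same for the index data
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboId[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices[0]) * numIndices, indices, GL_STATIC_DRAW);
}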
