How to use your own class in glVertexPointer / glColorPointer / glNormalPointer

I have a class representing a vertex as follows:

class Vertex
{
public:
    Vertex(void);
    ~Vertex(void);

    GLfloat x;
    GLfloat y;
    GLfloat z;

    GLfloat r;
    GLfloat g;
    GLfloat b;

    GLfloat nx;
    GLfloat ny;
    GLfloat nz;

    Vertex getCoords();

    Vertex crossProd(Vertex& b);
    void normalize();

    Vertex operator-(Vertex& b);
    Vertex& operator+=(const Vertex& b);
    bool operator==(const Vertex& b) const;
};

I initialize my VBOs as follows:

glGenBuffers(2, buffers);

glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
glBufferData(GL_ARRAY_BUFFER, fVertices.size()*sizeof(Vertex), &(fVertices[0].x), GL_STATIC_DRAW);
glVertexPointer(3, GL_UNSIGNED_BYTE, sizeof(Vertex), BUFFER_OFFSET(0));
glColorPointer(3, GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(6*sizeof(GL_FLOAT)));
glNormalPointer( GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(3*sizeof(GL_FLOAT)));

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buffers[2]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, fIndices.size()*sizeof(GLushort), &fIndices[0], GL_STATIC_DRAW);
glIndexPointer(GL_UNSIGNED_SHORT, 0, BUFFER_OFFSET(0));

But when drawing, this is all messed up. I presume it's because I pass the wrong stride and/or buffer offset.

The strange thing is, while experimenting with pointers to see whether the addresses match (trying to figure this out myself), I encountered something odd:

If I do:

GLfloat *test = &(fVertices[0]).x;
GLfloat *test2 = &(fVertices[0]).y;

then

test + sizeof(GLfloat) != test2;

fVertices is a std::vector<Vertex>, by the way.

I hope someone can enlighten me about what to pass to the gl*Pointer calls.


The byte arrangement of data in structs/classes (C++ considers them the same thing) is only guaranteed if the class is a plain-old-data (POD) type. Before C++0x, the rules for POD types were very strict: the class could not have constructors or destructors of any kind, could not hold non-POD members, could not have virtual functions, and so on.

So you need to remove the constructor and destructor.
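A minimal sketch of what that could look like (the ordinary member functions don't affect the layout, so they can stay):

class Vertex
{
public:
    // No user-declared constructor or destructor: the compiler-generated ones
    // keep this a POD type under the pre-C++0x rules, so the nine GLfloats are
    // laid out contiguously in declaration order.
    GLfloat x, y, z;    // position
    GLfloat r, g, b;    // color
    GLfloat nx, ny, nz; // normal

    // Non-virtual member functions are still fine on a POD type.
    Vertex crossProd(Vertex& b);
    void normalize();
};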

Secondly, your offsets are wrong. Your colors are 3 floats from the front, not 6. You seem to have switched your colors and normals.
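With your member order (x/y/z, then r/g/b, then nx/ny/nz), the calls would look something along these lines; note that the position data is GLfloat, so the vertex pointer's type has to be GL_FLOAT too:

// Sketch of the corrected calls: colors start 3 floats in, normals 6 floats in.
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(0));
glColorPointer(3, GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(3 * sizeof(GLfloat)));
glNormalPointer(GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(6 * sizeof(GLfloat)));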


This:

test + sizeof(GLfloat) != test2;

is pointer arithmetic. If you have a pointer to some type T, adding 1 to this pointer does not add 1 to the address. Basically, it works like array access. So this is true:

T *p = ...;
&p[1] == p + 1;

So if you want to get the float after test, you add one to it, not the size of a float.
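Applied to your snippet (assuming the class is made POD, so the members really sit next to each other):

GLfloat *test  = &(fVertices[0]).x;
GLfloat *test2 = &(fVertices[0]).y;

// Pointer arithmetic counts in elements of the pointed-to type, not in bytes:
test + 1 == test2;                                   // true: the next GLfloat
(GLfloat*)((char*)test + sizeof(GLfloat)) == test2;  // true: the same step, done in bytes
test + sizeof(GLfloat) == test2;                     // false: jumps sizeof(GLfloat) floats ahead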


On a personal note, I've never understood why so many OpenGL programmers love to put together these little vertex format structs, where a vertex holds one fixed arrangement of data. They always seem like a good idea at the time, but the moment you have to change them, everything breaks.

It's much easier in the long run not to bother with having explicit "vertex" objects at all. Simply have meshes, which have whatever vertex data they have. Everything should be done in a linear array of memory with byte offsets for the attributes. That way, if you need to insert another texture coordinate or need a normal or something, you don't have to radically alter code. Just modify the byte offsets, and everything works.
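As a rough illustration (the names and helper here are made up for the example, not something from your code):

// Assumes <vector>, the GL headers, and the BUFFER_OFFSET macro from the question.
// One raw, interleaved buffer per mesh; each attribute is just a byte offset.
struct Mesh
{
    std::vector<GLubyte> vertexData; // interleaved attribute data
    GLsizei stride;                  // bytes per vertex
    GLsizei positionOffset;          // byte offset of the position within a vertex
    GLsizei colorOffset;             // byte offset of the color within a vertex
    GLsizei normalOffset;            // byte offset of the normal within a vertex
};

void setPointers(const Mesh& mesh)
{
    // Adding or removing an attribute only changes the stride and offsets,
    // not the code that uses them.
    glVertexPointer(3, GL_FLOAT, mesh.stride, BUFFER_OFFSET(mesh.positionOffset));
    glColorPointer(3, GL_FLOAT, mesh.stride, BUFFER_OFFSET(mesh.colorOffset));
    glNormalPointer(GL_FLOAT, mesh.stride, BUFFER_OFFSET(mesh.normalOffset));
}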
