How do I use a pointer as an offset?
This issue originally came up in a related question of mine where I was having trouble reading some bit of code. The answer turned out to be that this line
&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]
evaluates to a pointer. It's used in
glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, &((unsigned short*)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]);
where it is interpreted as an offset to draw a subset of vertex indices.
My code currently requires me to do by hand some of what OpenGL does inside glDrawElements, and I can't figure out how to use the pointer as an offset. glDrawElements uses an array of indices (named vertexIndices in my code), so I tried something like this:
vertexIndices[&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]]
but that obviously failed.
EDIT 1:
I just tried this and it compiles... still not sure if it's correct though:
vertexIndices + (uint)&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]
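Here's my rough mental model of what that cast gives me (I might well be wrong, and it assumes vertexIndices is a plain GLushort array of indices):
/* the fake pointer's numeric value is a byte offset, so dividing by
   sizeof(GLushort) should give back an element index (I think) */
size_t byteOffset = (size_t)&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]];
size_t firstIndex = byteOffset / sizeof(GLushort); /* == 3 * pnBatchOffset[batchNum] */
GLushort firstVertexIndex = vertexIndices[firstIndex];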
This sort of idiom is used occasionally in C. It is good practice to encapsulate the calculation in a macro, but that is a side issue.
The trouble you are running into is this: yes, the type of the expression is a pointer type, but the interpretation of the expression (as you realize) is an offset. As an offset, its value is only usable once it has been converted to an integer.
In the glDrawElements() call, no explicit conversion is needed: C's parameter rules pass the pointer value straight through as the const GLvoid * argument.
In an array index, however, nothing will convert the pointer to an integer for you. You must supply the int cast:
vertexIndices[(int)&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]]
Better form would be:
#define OFFSET_OF_GLUSHORT_STUFF(m, N) \
    ((int)&((GLushort *)0)[3 * (m).sBoneBatches.pnBatchOffset[(N)]])

/* ... */

vertexIndices[OFFSET_OF_GLUSHORT_STUFF(mesh, batchNum)];
WHOA, you just changed your question to indicate that you are trying to use this offset to calculate the pointer to an array element. In that case, DO NOT use this offset-based approach, because if you are passing a compatible pointer type, the pointer math should work directly. And if you are not passing a compatible pointer type, havoc will ensue.
For your code's safety, you must get this working with direct pointer math.
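Something along these lines, for instance (just a sketch; it assumes vertexIndices is the GLushort index array you would otherwise hand to glDrawElements):
/* direct pointer math: advance by whole GLushort elements, no byte-offset casts */
GLushort *batchStart = vertexIndices + 3 * mesh.sBoneBatches.pnBatchOffset[batchNum];
GLushort firstIndex = batchStart[0]; /* first vertex index of this batch */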
What's being passed to glDrawElements is a pointer, not an offset. Your mysterious line of code is equivalent to:
const GLushort *indices = (const GLushort *)0 + 3 * mesh.sBoneBatches.pnBatchOffset[batchNum];
glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, indices);
In other words, the fourth parameter of glDrawElements() isn't an offset; it is a pointer to the indices. This is clear from the OpenGL documentation, which gives the following signature for glDrawElements:
void glDrawElements( GLenum mode,
GLsizei count,
GLenum type,
const GLvoid *indices );
And the explanation:
indices - Specifies a pointer to the location where the indices are stored.
(Emphasis mine.)
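So the ordinary (non-offset) usage passes an actual array of indices, something like this sketch (the index data here is made up purely for illustration):
GLushort quadIndices[] = { 0, 1, 2, 2, 3, 0 }; /* hypothetical client-side index data */
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, quadIndices); /* a real pointer to where the indices are stored */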
Aside: The offset calculation is a little weird, but valid for the reasons outlined in the answer to your other question. It's clearer and more stylistically normal to write that as:
size_t offset = sizeof(GLushort) * 3 * mesh.sBoneBatches.pnBatchOffset[batchNum];
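Then, when making the call, you would cast that offset back to the pointer type the parameter expects, which is the same trick the original line plays, just spelled out:
glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, (const GLvoid *)offset);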
Offset calculations based on a null pointer (address zero) can work with some compilers, but they are not guaranteed to. ANSI C gives you the offsetof macro in stddef.h for computing the offset of a member within a struct.
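For instance, a minimal self-contained example (the struct here is made up just to show the mechanics):
#include <stddef.h>
#include <stdio.h>

struct Vertex {
    float pos[3];
    float normal[3];
    unsigned short bone;
};

int main(void)
{
    /* offsetof yields the byte offset of a member, no null-pointer tricks needed */
    printf("offset of normal: %zu\n", offsetof(struct Vertex, normal));
    printf("offset of bone:   %zu\n", offsetof(struct Vertex, bone));
    return 0;
}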