OpenGL: how can I use glIndexPointer to implement color-index rendering?
I have an application that displays a 2D M×N data array whose values range from 0 to 63. I display it through a 64×3 colormap. My plan is to prepare the vertex array and then use the data values themselves as a color index array, which seems to be the most space- and performance-efficient approach. The code looks like this:
p=colormap_matlab;
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_INDEX_ARRAY);
glVertexPointer(2, GL_INT, 0, vertices);
glColorPointer(3, GL_FLOAT, 0, p);
glIndexPointer(GL_UNSIGNED_BYTE,0,color_index);
int iter = 0;
int iterP = 0;
for (i = 0; i < 127; i++)
{
    iter = 0;
    iterP = 0;
    for (j = 0; j < 1000; j++)
    {
        id1 = (int) data[i*1000 + j];
        id2 = (int) data[(i+1)*1000 + j];
        color_index[iter++] = id1;
        color_index[iter++] = id2;
        vertices[iterP++] = i;
        vertices[iterP++] = j;
        vertices[iterP++] = i+1;
        vertices[iterP++] = j;
    }
    //glDrawElements(GL_QUAD_STRIP, 999*2, GL_UNSIGNED_INT, indices);
    glDrawArrays(GL_QUAD_STRIP, 0, 1000*2);
    //glDrawArrays(GL_QUAD_STRIP, 500*2, 500*2);
}
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_INDEX_ARRAY);
However, the index pointer array does not work at all: the colors are simply read from the colormap sequentially (which also overruns the array, since the colormap is only 64×3).
Setting up the context looks like this:
CSimple_drawView *pView = (CSimple_drawView *) pParam;
HWND hWnd = pView->GetSafeHwnd();
HDC hDC;
HGLRC hRC;
hDC = ::GetDC(hWnd);
SetupPixelFormat(hDC);
hRC = wglCreateContext(hDC);
wglMakeCurrent(hDC, hRC);
readfile(0);
init_index();
init_mesh_index();
int i = 0;
all_threads.SetEvent();
// end added here
int startTime = GetTickCount();
while (i < 200)
{
    initialize(hWnd);
    //readfile(0);
    WaitForSingleObject(all_threads.m_hObject, INFINITE);
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT);
    i++;
    Render4(0, count);
    ++count;
    SwapBuffers(hDC);
}
BOOL SetupPixelFormat(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pixelDesc =
    {
        sizeof(PIXELFORMATDESCRIPTOR),
        1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
        PFD_DOUBLEBUFFER | PFD_SUPPORT_GDI,
        PFD_TYPE_RGBA,
        24,
        0, 0, 0, 0, 0, 0,
        0,
        0,
        0,
        0, 0, 0, 0,
        32,
        0,
        0,
        PFD_MAIN_PLANE,
        0,
        0, 0, 0
    };
    int pixelformat;
    if ((pixelformat = ChoosePixelFormat(hDC, &pixelDesc)) == 0)
    {
        MessageBox(NULL, "ChoosePixelFormat failed", "Error", MB_OK);
        return FALSE;
    }
    if (SetPixelFormat(hDC, pixelformat, &pixelDesc) == FALSE)
    {
        MessageBox(NULL, "SetPixelFormat failed", "Error", MB_OK);
        return FALSE;
    }
    return TRUE;
}
Can anyone give me some hints on this?
You should set up a pixel format capable of displaying indexed colors (see the documentation of PIXELFORMATDESCRIPTOR): substitute PFD_TYPE_RGBA with PFD_TYPE_COLORINDEX. The window's pixel format must match the rendering operations you intend to perform.
To specify a custom palette, you have to define it at the window-manager level (since the window is created through it). The relevant functions are SelectPalette and SetPaletteEntries.
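A rough sketch of what that setup might look like, assuming a Win32 environment with an existing hDC, and assuming the 64 colormap entries live in a float colormap[64][3] with components in 0..1 as in the question (this is an untested fragment, not a drop-in replacement):

```c
/* Sketch: request a color-index pixel format and install a 64-entry
 * logical palette built from the question's assumed colormap[64][3].
 * Win32/WGL-specific; error handling omitted for brevity. */
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_COLORINDEX;   /* indexed instead of PFD_TYPE_RGBA */
pfd.cColorBits = 8;                     /* 8-bit index buffer */
pfd.iLayerType = PFD_MAIN_PLANE;
int pf = ChoosePixelFormat(hDC, &pfd);
SetPixelFormat(hDC, pf, &pfd);

/* Build a LOGPALETTE with 64 entries from the colormap. */
LOGPALETTE *lp = (LOGPALETTE *)malloc(sizeof(LOGPALETTE)
                                      + 63 * sizeof(PALETTEENTRY));
lp->palVersion = 0x300;
lp->palNumEntries = 64;
for (int k = 0; k < 64; k++) {
    lp->palPalEntry[k].peRed   = (BYTE)(colormap[k][0] * 255.0f);
    lp->palPalEntry[k].peGreen = (BYTE)(colormap[k][1] * 255.0f);
    lp->palPalEntry[k].peBlue  = (BYTE)(colormap[k][2] * 255.0f);
    lp->palPalEntry[k].peFlags = 0;
}
HPALETTE hPal = CreatePalette(lp);
SelectPalette(hDC, hPal, FALSE);
RealizePalette(hDC);
free(lp);
```

With a palette like this in place, the values written through glIndexPointer are looked up in the realized palette rather than interpreted as RGBA colors.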
Make sure this doesn't apply to you:
glEnableClientState(GL_INDEX_ARRAY)
What's wrong with this code?
glBindBuffer(GL_ARRAY_BUFFER, vboid);
glVertexPointer(3, GL_FLOAT, sizeof(vertex_format), 0);
glNormalPointer(GL_FLOAT, sizeof(vertex_format), 20);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glEnableClientState(GL_INDEX_ARRAY);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboid);
glDrawRangeElements(....);
The problem is that GL_INDEX_ARRAY does not mean what this programmer thinks it does. GL_INDEX_ARRAY has nothing to do with the indices passed to glDrawRangeElements; it is for color index arrays. Never use these. Just use a color array, as follows.
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(vertex_format), X);
glEnableClientState(GL_COLOR_ARRAY);