OpenGL for matrix stack
I have a Win32 application in which I want to use OpenGL just for its matrix stack, not for any rendering. That is, I want to use OpenGL to specify the camera, viewport, etc. so that I don't have to do the maths myself. While creating the scene, I just want to project the points using gluProject and use the results. The projected points are passed to another library which creates the scene for me; all the window handles are created by that library itself and I don't have access to them.
The problem is that Windows needs a device context for initialization. But since I am not using OpenGL for any rendering, is there a way to use OpenGL without any window handle at all?
Without any explicit initialization, when I read back the matrices using glGet, it returns garbage. Any thoughts on how to fix this?
I want to use OpenGL just for its matrix stack, not for any rendering.
That's not what OpenGL is meant for. OpenGL is a drawing/rendering API, not a math library. In fact, the whole matrix math machinery has been stripped from recent OpenGL versions (OpenGL 3 core profile and later) for that very reason.
Also, this matrix math is simple enough that you can write it down in less than 1k lines of C code. There's absolutely no benefit in abusing OpenGL for it.
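To give a rough sense of how little code is involved, here is a minimal sketch of a column-major 4x4 matrix type with the two operations the fixed-function pipeline is built on (the names `Mat4`, `identity`, and `multiply` are placeholders, not part of any library):

```cpp
#include <array>
#include <cstddef>

// A 4x4 matrix stored column-major, the same layout OpenGL's
// fixed-function matrix stack uses.
using Mat4 = std::array<float, 16>;

// Equivalent of glLoadIdentity.
Mat4 identity()
{
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0f;
    return m;
}

// r = a * b; this is all glMultMatrix does to the current matrix.
Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 r{};
    for (std::size_t col = 0; col < 4; ++col)
        for (std::size_t row = 0; row < 4; ++row)
            for (std::size_t k = 0; k < 4; ++k)
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return r;
}
```

Everything else (translate, rotate, frustum, ortho) is just a specific matrix fed into `multiply`.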
The matrix stack could potentially live on graphics hardware in your implementation. OpenGL is therefore quite reasonable in insisting that you have an OpenGL context in order to use such functions, because the act of creating a context probably includes setting up the implementation machinery required to store the matrix stack.
Even in a purely software-based OpenGL implementation, one would still expect the act of creating a context to call some equivalent of malloc to secure storage for the stack. If you happened to find an OpenGL implementation where creating a context wasn't necessary, I'd still steer clear of relying on that behavior, since it's most likely undefined and could break in the next release of that implementation.
If it's C++, I'd just use std::stack with the Matrix class from your favorite linear algebra package if you're not using OpenGL for anything other than that.
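A hedged sketch of that approach, with a stand-in `Mat4` type where your linear algebra package's matrix class would go:

```cpp
#include <array>
#include <stack>

// Stand-in for the matrix class from your math library of choice.
using Mat4 = std::array<float, 16>;

class MatrixStack
{
public:
    MatrixStack()
    {
        current_[0] = current_[5] = current_[10] = current_[15] = 1.0f;
    }

    // Equivalent of glPushMatrix: save a copy of the current matrix.
    void push() { saved_.push(current_); }

    // Equivalent of glPopMatrix: restore the last saved matrix.
    void pop()
    {
        current_ = saved_.top();
        saved_.pop();
    }

    // The matrix all transforms apply to, like GL's "current matrix".
    Mat4& top() { return current_; }

private:
    Mat4 current_{};
    std::stack<Mat4> saved_;
};
```

This gives you glPushMatrix/glPopMatrix semantics in a handful of lines, with no context required.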
I present to you my complete (open source) matrix class. Enjoy.
https://github.com/TheBuzzSaw/paroxysm/blob/master/newsource/CGE/Matrix4x4.h
I can recommend trying to implement those calls yourself. I did that once for a Palm app I wrote, tinyGL. What I learned was that the documentation basically tells you, in plain text, what is done.
That is, the verbatim code for tglFrustum and tglOrtho is as follows (note that I was using fixed-point math to get some performance):
void tglFrustum(fix_t w, fix_t h, fix_t n, fix_t f) {
  matrix_t fm, m;
  fix_t f_sub_n;

  f_sub_n = sub_fix_t(f,n);

  fm[0][0] = mult_fix_t(_two_,div_fix_t(n,w));
  fm[0][1] = 0;
  fm[0][2] = 0;
  fm[0][3] = 0;
  fm[1][0] = 0;
  fm[1][1] = mult_fix_t(_two_,div_fix_t(n,h));
  fm[1][2] = 0;
  fm[1][3] = 0;
  fm[2][0] = 0;
  fm[2][1] = 0;
  fm[2][2] = inv_fix_t(div_fix_t(add_fix_t(f,n),f_sub_n));
  f = mult_fix_t(_two_,f);
  fm[2][3] = inv_fix_t(div_fix_t(mult_fix_t(f,n),f_sub_n));
  fm[3][0] = 0;
  fm[3][1] = 0;
  fm[3][2] = _minus_one_;
  fm[3][3] = 0;

  set_matrix_t(m,_matrix_stack[_toms]);
  mult_matrix_t(_matrix_stack[_toms],m,fm);
}
void tglOrtho(fix_t w, fix_t h, fix_t n, fix_t f) {
  matrix_t om, m;
  fix_t f_sub_n;

  f_sub_n = sub_fix_t(f,n);

  MemSet(om,sizeof(matrix_t),0);
  om[0][0] = div_fix_t(_two_,w);
  om[1][1] = div_fix_t(_two_,h);
  om[2][2] = div_fix_t(inv_fix_t(_two_),f_sub_n);
  om[2][3] = inv_fix_t(div_fix_t(add_fix_t(f,n),f_sub_n));
  om[3][3] = _one_;

  set_matrix_t(m,_matrix_stack[_toms]);
  mult_matrix_t(_matrix_stack[_toms],m,om);
}
Compare those with the man pages for glFrustum and glOrtho.
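For comparison, here is a floating-point transcription of the matrix given in the glFrustum man page, using its full left/right/bottom/top parameterization (the two-parameter w/h form above corresponds to a symmetric frustum). The `frustum` function name is a placeholder:

```cpp
#include <array>

// Column-major 4x4, matching what
// glGetDoublev(GL_PROJECTION_MATRIX, ...) would return.
using Mat4 = std::array<double, 16>;

// Builds the matrix that glFrustum multiplies onto the current matrix,
// transcribed directly from the formula in the man page.
Mat4 frustum(double l, double r, double b, double t, double n, double f)
{
    Mat4 m{};
    m[0]  = 2.0 * n / (r - l);
    m[5]  = 2.0 * n / (t - b);
    m[8]  = (r + l) / (r - l);         // A
    m[9]  = (t + b) / (t - b);         // B
    m[10] = -(f + n) / (f - n);        // C
    m[11] = -1.0;
    m[14] = -2.0 * f * n / (f - n);    // D
    return m;
}
```

The fixed-point `fm[2][2]` and `fm[2][3]` terms above are the C and D entries here, with `inv_fix_t` supplying the negation.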