Calculating lat/long on sphere from X,Y coordinates?
I want to get the latitude and longitude on a 3D sphere, depending on the mouse X,Y. I've understood that I have to use trigonometry. The problem is that the sphere is put into a perspective, at least that's what I guess the problem is, because with my calculations:
radius *= (width / 2.0f);                       // scale the unit radius to screen pixels
float y = (mouseY - (height / 2.0f)) / radius;  // mouse Y relative to the screen centre, normalized
latitude = (float) -Math.toDegrees(Math.asin(y));
and longitude:
float x = (float) ((mouseX - (width / 2.0f)) / radius);  // mouse X relative to the screen centre, normalized
longitude = (float) (90 - Math.toDegrees(Math.acos(x)));
For simplicity, radius = 1 at the beginning (from center to edge of the screen). Also we ignore any rotation.
Problem:
My problem is that I don't get the right values. The further I move away from the center, the bigger the error becomes. As mentioned, I guess this has to do with the fact that I'm using a perspective (frustum), but I can't figure out how to solve it or what's really causing it. If it's of any help, my perspective is set up with a 45-degree field of view, and my width / height ratio is 0.6. Therefore, with this formula: tan(2*atan(1,x)/0.6) = x, I translate the sphere (by x ≈ -4.17) into the screen (along z).
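(For reference, here is a minimal sketch of one way such a distance can be derived, assuming the goal is for the unit sphere's silhouette to just touch the left and right screen edges; the variable names are made up and the exact formula above may differ, but the result comes out close to -4.17:)
float fovY   = 45f;
float aspect = 0.6f;  // width / height, as above
// horizontal half-angle of the frustum
double halfX = Math.atan(aspect * Math.tan(Math.toRadians(fovY / 2.0)));
// the visible silhouette is tangent to the sphere, hence sin rather than tan
double distance = 1.0 / Math.sin(halfX);  // ~4.15 for a unit sphere
// translate the sphere by -distance along z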
My projection:
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 45f, (float) width / (float) height, 0.1f, 100.0f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
What you need to do is find the ray that starts at the camera's origin and passes through the near plane at the mouse cursor's position. This can be done by applying the inverse of your projection (and modelview) transforms to the mouse cursor's viewport coordinates, which gives you a world-space point representing the mouse's position.
The intersection of this ray with your sphere, if one exists, will be a surface point that you can use to compute latitude and longitude values, using simple trigonometry.
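Below is a minimal, self-contained sketch of that idea in Java, matching the setup in the question: camera at the origin looking down -z, no rotation, sphere of radius sphereRadius centred at (0, 0, -sphereZ). The method name, parameters and the latitude/longitude conventions are assumptions for illustration, not taken from the code above.
// Sketch: build the ray through the mouse cursor, intersect it with the sphere,
// and convert the hit point to latitude/longitude.
static float[] latLongFromMouse(float mouseX, float mouseY,
                                int width, int height,
                                float fovY, float sphereZ, float sphereRadius) {
    // 1. Mouse position -> normalized device coordinates in [-1, 1]
    float ndcX = 2f * mouseX / width - 1f;
    float ndcY = 1f - 2f * mouseY / height;           // screen y grows downwards

    // 2. NDC -> a ray direction in eye space, using the perspective parameters
    float aspect = (float) width / (float) height;
    float tanHalfFovY = (float) Math.tan(Math.toRadians(fovY / 2f));
    float dx = ndcX * tanHalfFovY * aspect;
    float dy = ndcY * tanHalfFovY;
    float dz = -1f;                                    // camera looks down -z
    float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;

    // 3. Intersect the ray t * (dx, dy, dz), t > 0, with the sphere centred at
    //    C = (0, 0, -sphereZ): solve t^2 - 2 t (d.C) + |C|^2 - r^2 = 0
    float cz = -sphereZ;
    float b = dz * cz;                                 // d . C (cx = cy = 0)
    float c = cz * cz - sphereRadius * sphereRadius;
    float disc = b * b - c;
    if (disc < 0f) return null;                        // ray misses the sphere
    float t = b - (float) Math.sqrt(disc);             // nearer of the two hits
    if (t < 0f) return null;                           // sphere is behind the camera

    // 4. Hit point relative to the sphere's centre -> latitude / longitude
    float px = t * dx;
    float py = t * dy;
    float pz = t * dz - cz;
    float latitude  = (float) Math.toDegrees(Math.asin(py / sphereRadius));
    float longitude = (float) Math.toDegrees(Math.atan2(px, pz)); // 0 at the camera-facing point
    return new float[] { latitude, longitude };
}
As a sanity check, with fovY = 45, sphereZ = 4.17 and sphereRadius = 1, a click at the exact centre of the screen gives latitude 0 and longitude 0. If the camera or the sphere is rotated, the same idea can be implemented with GLU.gluUnProject and the current modelview/projection matrices instead of the hard-coded -z direction.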