How do I work out screen lengths of axis aligned position vectors after a frustum projection transform?
In Java I create a frustum projection matrix via
Matrix.frustumM(projMatrix, 0, left, right, bottom, top, near, far);
and a view matrix via
Matrix.setLookAtM(viewMatrix, 0,
        0, 0, -2.5f,  // eye position
        0, 0, 0,      // look position
        0, -1, 0);    // up direction
I also know how many GL units wide and high my device screen is, say width and height. If my game tiles are dx by dy GL units in size, then for an orthographic projection I can calculate width/dx and height/dy to determine (approximately) how many tiles I should have horizontally and vertically.
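To make the orthographic case concrete, here is a minimal sketch of that calculation; the variable names and numbers are illustrative assumptions, not values from my actual code:

```java
// Under an orthographic projection, lengths in GL units are preserved on
// screen, so the tile counts are a straight division of the visible extent
// by the tile size.
public class OrthoTileCount {
    // Number of tiles of size `tile` needed to cover `extent` GL units.
    static int tileCount(double extent, double tile) {
        return (int) Math.ceil(extent / tile);
    }

    public static void main(String[] args) {
        double width = 6.0, height = 10.0; // visible GL units (assumed)
        double dx = 1.0, dy = 1.0;         // tile size in GL units (assumed)
        System.out.println(tileCount(width, dx) + " x " + tileCount(height, dy));
    }
}
```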
But with a frustum projection, which per the setup above is essentially always looking straight down at the tiles, this does not give the correct result.
(Note that my tiles are always positioned in the x/y plane, at a fixed z.)
Of course I understand why: the further something is from the camera, the smaller its lengths become. But how do I go about transforming my dx and dy values to cope with this?
I have tried:

- Using the eye-to-tile-plane distance, 1 + 2.5 + 1 = 4.5, as a correction factor.
- Multiplying the projection and view matrices (and their product) by the vector (dx, 0, 0, 1), for example.

But these don't work.
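For reference, here is the second attempt in plain Java so it runs outside Android: projecting the vector (dx, 0, z, 1) through a symmetric frustum matrix built with the same formula android.opengl.Matrix.frustumM uses (i.e. glFrustum). The frustum bounds and the tile-plane depth are made-up numbers for illustration:

```java
public class FrustumVector {
    // Clip-space result of projecting eye-space (x, y, z, 1) through a
    // symmetric frustum with the given near-plane bounds.
    static double[] project(double x, double y, double z,
                            double right, double top, double near, double far) {
        double clipX = (near / right) * x;
        double clipY = (near / top) * y;
        double clipZ = -((far + near) / (far - near)) * z
                       - (2.0 * far * near) / (far - near);
        double clipW = -z; // the bottom row of the frustum matrix
        return new double[] { clipX, clipY, clipZ, clipW };
    }

    public static void main(String[] args) {
        double dx = 1.0;    // tile width in GL units (assumed)
        double zEye = -4.5; // eye-space depth of the tile plane (assumed)
        double[] clip = project(dx, 0, zEye, 1.0, 1.5, 1.0, 10.0);
        // clip[3] comes out as 4.5, not 1: the result is homogeneous, so the
        // raw x component is not yet a screen length on its own.
        System.out.println("clip = (" + clip[0] + ", " + clip[1] + ", "
                + clip[2] + ", " + clip[3] + ")");
    }
}
```

Which at least shows why the raw multiply alone gives a result that doesn't match the screen.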
I have also heard that the projection matrix has a w component in the bottom right of the array, which may have something to do with z-scaling, but I'm not sure whether that's the right track.
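If I've understood that hint correctly, the bottom row of the frustum matrix sets clip.w = -z_eye, so after dividing by w a length at depth d shrinks by a factor of near/d; equivalently, by similar triangles, the visible width at depth d is (right - left) * d / near. A sketch of that idea (all numbers here are illustrative assumptions, not values from my code):

```java
public class PerspectiveDivide {
    // Visible width in GL units of the horizontal slice at eye-space depth d,
    // for a frustum with the given near-plane bounds (similar triangles).
    static double visibleWidth(double left, double right, double near, double d) {
        return (right - left) * d / near;
    }

    public static void main(String[] args) {
        double left = -1.0, right = 1.0, near = 1.0; // frustum bounds (assumed)
        double d = 4.5;   // distance from eye to tile plane (as above)
        double dx = 1.0;  // tile width in GL units (assumed)

        double width = visibleWidth(left, right, near, d);
        double tilesAcross = width / dx;
        System.out.println(width + " GL units -> " + tilesAcross + " tiles");
    }
}
```

But I don't know whether this is the intended way to relate dx and dy to the screen.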
Any advice welcome. I'll keep looking into it.