I'm rendering a height map using shader-based OpenGL (3.3, to be precise). The height map data comes from a grayscale bitmap (a 2D array of bytes). The bitmap's dimensions are 256 x 256.

With each byte being the Y value, I calculate X and Z as the corners of uniformly distributed cells in a grid. I generate the vertices and upload them to a VBO, then generate indices and populate an IBO; the indices are of type `GLshort`.

I've implemented a free-roaming camera to move around the terrain and verify its correctness. As I move around, everything renders fine, except at some angles, particularly when viewing the terrain from above/afar: some triangles at the peaks of mountains go missing. From that position, if I vary the camera's position along its Z axis, the missing triangles vary too. When I move closer to the peak, the error goes away and everything is nice and whole.

I've read other similar questions here, which mainly suggest that those triangles may be deduced as back-facing and thus disappear when face culling is enabled. To rule that out, I tried `glDisable(GL_CULL_FACE)`, but in vain.

I considered Z-fighting too, but it doesn't make sense here. Z-fighting happens when two fragments have the same depth-buffer value, but the triangles appearing in place of the missing ones (in a slightly duller colour; see image below) are *behind* the missing ones, not at the same Z, and thus shouldn't have this problem. Either way, I tried enabling and disabling the depth test to be sure, but to no avail.

Worried about floating-point issues, I made sure that the vertices of the terrain are distributed around the origin, i.e. if each cell is a square of side `c`, then both the `X` and `Z` values of the vertices are distributed equally around 0: X doesn't vary from 0 to 256 but from -128 to 128, assuming a per-cell dimension of 1 unit.

Other than that, I tried changing the way I issue draw calls to GL: `glDrawElements` with `GL_TRIANGLES`, and `glDrawElementsBaseVertex` and `glMultiDrawElementsBaseVertex` with `GL_TRIANGLE_STRIP`, didn't change the situation either.

I tried searching the internet for an explanation but am at a loss there too, particularly since I don't know the keyword to search for. This must come up fairly commonly in serious games, since they all have terrain of some kind. Is there a proper name for this problem, and a solution?

So, to rework from the comments: this is Z-fighting. The math is very well explained here: http://chaosinmotion.com/blog/?p=555, and the ways to solve it here: https://www.opengl.org/wiki/Depth_Buffer_Precision, but the gist is that the Z-buffer is discrete, non-linear, and depends on the ratio of far plane to near plane.

Discrete: the distance between the near and far planes is sliced into layers. If two overlapping fragments (would-be pixels) fall into the same Z-layer, it's up to chance which one actually gets drawn. The number of layers depends on how many bits per pixel are allocated to the depth buffer; in the extreme case of a 2-bit depth buffer, there are only 4 slices.

Non-linear: far-away slices are 'thicker' than nearby ones, so the further away you look, the greater the chance of Z-fighting.

Depends on the ratio: you say you have the near plane at 0.001f and the far plane at 5000.0f, a ratio of 5 million; the suggested ratio is no more than about 30k. One easy way to fix it is to move the near plane to 0.1f: not a huge loss of scenery, but a great increase in depth resolution.

November 17, 2014 11:59 AM
