I'm trying to visualize (for debugging purposes) the depth of a scene on a textured quad (specifically, I want to display the depth of a water pond on the surface of the water). What I'm doing is rendering the scene from the camera's point of view into a depth buffer, then binding that same depth buffer as a shader resource to the water shader. But when I sample the texture I always get a completely white result. If I apply
pow(value, 100.0) I get a little variation which correctly follows the depth of the terrain below the water, so I assume the values are there but compressed into a small range just below 1.0. So I tried linearizing the values by inverting the perspective transformation for z (from clip space back to eye space) with the following equation:
float linearDepth = near * far / (far - depth * (far - near));
but again, total white. The formula seems correct to me (I found it in Frank Luna's book, and I get the same result if I derive it from the projection matrix myself). The near and far planes are 0.1 and 300.0, but even playing with those values doesn't get me anywhere.
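As a sanity check I worked through the numbers offline. This little Python sketch builds the stored depth for a few eye-space distances (assuming a standard D3D-style [0, 1] depth range) and then applies the linearization formula above with my near/far of 0.1/300. It shows the stored values clustering just under 1.0 (hence the white), and that the recovered value is in eye-space units [near, far], not [0, 1]:

```python
near, far = 0.1, 300.0

def stored_depth(z_eye):
    # post-projection depth written to the buffer, D3D-style [0, 1] range (assumption)
    return far / (far - near) * (1.0 - near / z_eye)

def linear_depth(depth):
    # the formula from this question: recovers eye-space z in [near, far]
    return near * far / (far - depth * (far - near))

for z in (0.5, 5.0, 50.0, 299.0):
    d = stored_depth(z)
    lin = linear_depth(d)
    print(f"z_eye={z:6.1f}  stored={d:.6f}  recovered={lin:6.1f}  recovered/far={lin/far:.4f}")
```

The round trip recovers eye-space z exactly, so presumably the formula itself is fine and the issue is what range I'm displaying.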
All this is because I want to implement soft edges on my water, which requires the depth values, and I want to visualize them first so I know I'm reading them correctly.