// Multiplying the retrieved depth by 0.001 is done
// to convert to meters.
float fDepth =
    (( float )KinectDepthBuffer[
        int2( v.coords.x, 240 - v.coords.y ) ]) * 0.001f;
uint2 uiOffsets =
    KinectOffsetBuffer[ int2( v.coords.x, 240 - v.coords.y ) ];
o.colorOffset =
    float2( ( float )uiOffsets.x / 640.0f,
            ( float )uiOffsets.y / 480.0f );
float3 normal =
    ComputeNormal( int2( v.coords.x, 240 - v.coords.y ) );
float diffuse =
    max( dot( normal, normalize( float3( 1.0f, 0.0f, 1.0f ) ) ),
         0.0f );
// x_meters = (x_pixelcoord - 160) * z_meters * 0.003501
// y_meters = (y_pixelcoord - 120) * z_meters * 0.003501
float x_meters = (v.coords.x - 160) * 0.003501f * fDepth;
float y_meters = (v.coords.y - 120) * 0.003501f * fDepth;
float4 DepthCamViewSpace =
    float4( x_meters, y_meters, fDepth, 1.0f );
o.position = mul( DepthCamViewSpace, WorldViewProjMatrix );
o.height = fDepth;
return o;
Listing 2.4. The vertex shader for rendering a 3D reconstruction of a depth frame.
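The back-projection in Listing 2.4 can also be checked on the CPU. The sketch below mirrors the shader's constants (0.001 converts raw millimeter depth to meters; 0.003501 is the depth camera's inverse focal length in pixels; (160, 120) is the center of the 320×240 depth image). The struct and function names are our own for illustration, not from the chapter.

```cpp
#include <cassert>
#include <cmath>

struct Float3 { float x, y, z; };

// Converts a depth-image pixel plus its raw millimeter depth into
// camera-space meters, mirroring the math in the vertex shader.
Float3 DepthPixelToCameraSpace( int px, int py, unsigned short rawDepthMm )
{
    float z = static_cast<float>( rawDepthMm ) * 0.001f;   // mm -> meters
    Float3 p;
    p.x = ( static_cast<float>( px ) - 160.0f ) * 0.003501f * z;
    p.y = ( static_cast<float>( py ) - 120.0f ) * 0.003501f * z;
    p.z = z;
    return p;
}
```

A pixel at the image center maps onto the camera's optical axis, so its x and y come out as zero regardless of depth.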
Each vertex is given the integer coordinates of the pixel in the depth texture that it should be mapped to.
Next we will consider the pipeline configuration that we will use to render the
geometry. In total, we will use the vertex shader, the geometry shader, and the
pixel shader stages. We will consider the vertex shader first, which is shown in
Listing 2.4. It starts out by reading the depth value and the offset into the
depth-to-color mapping frame texture. This offset is supplied in pixels, so we
divide it by the color frame's dimensions (640×480) to produce texture-coordinate
offsets. Next we calculate a normal vector from the depth texture by using a
Sobel filter. The normal vector is not strictly needed, but it can be used to
perform lighting operations on the reconstructed surface if desired.
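The chapter does not list ComputeNormal, but a Sobel-based normal from a depth image can be sketched as follows. The kernel scaling and sign conventions here are assumptions for illustration, not the chapter's exact code: the two Sobel kernels estimate the depth gradient in x and y, and the normal tilts against that gradient.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// depth: row-major w x h array of depths in meters.
// Returns a unit normal estimated with 3x3 Sobel kernels,
// clamping reads at the image border.
Vec3 SobelNormal( const float* depth, int w, int h, int x, int y )
{
    auto d = [&]( int ix, int iy ) -> float {
        if ( ix < 0 ) ix = 0;  if ( ix >= w ) ix = w - 1;
        if ( iy < 0 ) iy = 0;  if ( iy >= h ) iy = h - 1;
        return depth[ iy * w + ix ];
    };
    // Sobel X and Y: weighted differences of the neighboring depths.
    float gx = ( d(x+1,y-1) + 2.0f*d(x+1,y) + d(x+1,y+1) )
             - ( d(x-1,y-1) + 2.0f*d(x-1,y) + d(x-1,y+1) );
    float gy = ( d(x-1,y+1) + 2.0f*d(x,y+1) + d(x+1,y+1) )
             - ( d(x-1,y-1) + 2.0f*d(x,y-1) + d(x+1,y-1) );
    Vec3 n = { -gx, -gy, 1.0f };   // surface tilts against the gradient
    float len = std::sqrt( n.x*n.x + n.y*n.y + n.z*n.z );
    n.x /= len;  n.y /= len;  n.z /= len;
    return n;
}
```

On a perfectly flat depth patch both gradients vanish and the normal points straight at the camera, (0, 0, 1).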