RenderStateManager.SlopeScaleDepthBias Property

Note: This documentation is preliminary and is subject to change.

Retrieves or sets the slope scale bias applied to coplanar primitives to reduce z-fighting.

Definition

Visual Basic .NET
    Public Property SlopeScaleDepthBias As Single
C#
    public float SlopeScaleDepthBias { get; set; }
Managed C++
    public: __property float get_SlopeScaleDepthBias();
    public: __property void set_SlopeScaleDepthBias(float);
JScript .NET
    public function get SlopeScaleDepthBias() : float
    public function set SlopeScaleDepthBias(float);

Property Value

System.Single. Floating-point value that specifies the slope scale bias to apply.

This property is read/write. 

Remarks

The default value is 0.

Polygons that are coplanar in your 3-D space can be made to appear as if they are not coplanar by adding a z-bias to each one. An application can help ensure that coplanar polygons are rendered properly by adding a bias to the z-values that the system uses when rendering sets of coplanar polygons.

The following formula shows how to calculate the bias to be applied to coplanar primitives.

bias = (m * SlopeScaleDepthBias) + DepthBias

where m is the maximum depth slope of the triangle being rendered, defined as:

m = max(abs(delta z / delta x), abs(delta z / delta y))

© 2004 Microsoft Corporation. All rights reserved. Terms of use.
