textureRenderFormat
Syntax
getRendererServices().textureRenderFormat
Description
3D rendererServices property; allows you to get or set the default bit format used by all textures in all 3D cast members. Use a texture's renderFormat property to override this setting for specific textures only. Smaller bit formats (16-bit variants such as #rgba5551) use less hardware accelerator video RAM, allowing you to use more textures before being forced to switch to software rendering. Larger bit formats (32-bit variants such as #rgba8888) generally look better. To use alpha transparency in a texture, the last digit of the format (the alpha precision) must be nonzero. To get smooth gradations of transparency, the alpha channel must have more than 1 bit of precision.
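As a brief illustration of the override described above, the following sketch sets a compact global format and then gives one texture a 32-bit format with alpha. The cast member name "myScene" and texture name "glassTexture" are hypothetical placeholders, not part of the original entry.

-- use a compact 16-bit format for all textures by default
getRendererServices().textureRenderFormat = #rgba5551
-- override a single texture that needs smooth alpha gradations
-- ("myScene" and "glassTexture" are placeholder names)
member("myScene").texture("glassTexture").renderFormat = #rgba8888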
Each pixel format has four digits, with each digit indicating the degree of precision for red, green, blue, and alpha. The value you choose determines the accuracy of the color fidelity (including the precision of the alpha channel) and the amount of memory used by the hardware texture buffer. You can choose a value that improves color fidelity or a value that allows you to fit more textures on the card. You can fit twice as many 16-bit textures as 32-bit textures in the same space. If a movie uses more textures than fit on the card at the same time, Director switches to #software rendering.
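If you want to trade image quality for texture memory at runtime, one possible approach is to choose the format based on the renderer currently in use, as reported by the rendererServices renderer property (see the See also section). This is a hedged sketch rather than a prescribed pattern; where and when to run the check is up to the movie.

-- choose a texture format based on the active renderer
if getRendererServices().renderer = #software then
  getRendererServices().textureRenderFormat = #rgba5551
else
  getRendererServices().textureRenderFormat = #rgba8888
end if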
You can specify any of the following values for textureRenderFormat:
#rgba8888: 8 bits each for red, green, blue, and alpha
#rgba8880: 8 bits each for red, green, and blue; no alpha
#rgba5650: 5 bits for red, 6 bits for green, 5 bits for blue; no alpha
#rgba5550: 5 bits each for red, green, and blue; no alpha
#rgba5551: 5 bits each for red, green, and blue; 1 bit for alpha
#rgba4444: 4 bits each for red, green, blue, and alpha
The default value is #rgba5551.
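You can confirm the current setting in the Message window; in a movie that has not changed the property, the result is the default shown above.

put getRendererServices().textureRenderFormat
-- #rgba5551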
Example
This statement sets the global textureRenderFormat to #rgba8888. Each texture in this movie will be rendered in 32-bit color unless its renderFormat property is set to a value other than #default.
getRendererServices().textureRenderFormat = #rgba8888
See also
renderer, preferred3DRenderer, renderFormat, getRendererServices()