Microsoft DirectX 8.0 (C++)

D3DXGetErrorStringW

Retrieves the Unicode error string for an HRESULT and copies it into a caller-supplied buffer.

HRESULT D3DXGetErrorStringW(
  HRESULT hr,
  LPWSTR pBuffer,
  UINT BufferLen
);

Parameters

hr
[in] The specified HRESULT error code to decipher.
pBuffer
[out] Pointer to the buffer to fill in with the Unicode error string.
BufferLen
[in] Number of characters in the buffer. Any error message longer than this length is truncated.

Return Values

If the function succeeds, the return value is D3D_OK.

If the function fails, the return value can be one of the following values.

D3DERR_INVALIDCALL
D3DXERR_INVALIDDATA

Remarks

This function interprets all Microsoft® Direct3D® HRESULTs.
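
For example, the following fragment (a minimal sketch; the 256-character buffer and the call to OutputDebugStringW are illustrative assumptions, not part of the SDK samples) converts a failing HRESULT into readable text:

WCHAR wszError[256];

if( FAILED( hr ) )                        // hr returned by some Direct3D call
{
    if( D3DXGetErrorStringW( hr, wszError, 256 ) == D3D_OK )
        OutputDebugStringW( wszError );   // wszError now holds the text for hr
}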

D3DXGetErrorString maps to D3DXGetErrorStringW when UNICODE is defined and to D3DXGetErrorStringA otherwise; define or omit UNICODE to specify whether your application expects Unicode or ANSI strings. The following code fragment shows how D3DXGetErrorString is defined.

#ifdef UNICODE
#define D3DXGetErrorString D3DXGetErrorStringW
#else
#define D3DXGetErrorString D3DXGetErrorStringA
#endif
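
For example, the following fragment (a sketch; the buffer name, its 128-character size, and the example error code are illustrative) compiles to either variant, because TCHAR is governed by the same UNICODE switch:

TCHAR szError[128];
HRESULT hr = D3DERR_INVALIDCALL;          // example error code to decipher

D3DXGetErrorString( hr, szError, 128 );
// szError now contains the ANSI or Unicode text for hr,
// depending on whether UNICODE was defined at compile time.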

Requirements

  Header: Declared in D3dx8core.h.
  Import Library: Use D3dx8.lib.

See Also

D3DXGetErrorStringA