Microsoft DirectX 9.0 SDK Update (October 2004)

Using WAV Data

Note: This documentation is preliminary and is subject to change.

Microsoft® DirectSound® buffers play only waveform audio data, which consists of digital samples of a sound taken at a fixed sampling rate. The representation of an analog signal by such a sequence of numbers is known as pulse code modulation (PCM).

WAV data is usually stored in files or resources in Resource Interchange File Format (RIFF). The data includes a description of the WAV format, including parameters such as the sampling rate and the number of output channels. In Managed DirectSound, the format of a sound is described by a WaveFormat structure. Managed DirectSound does not support audio formats with more than two channels, such as those described by the C++ WAVEFORMATEXTENSIBLE structure.
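As a rough illustration, the following sketch fills in a WaveFormat structure for 16-bit, stereo, 44.1-kHz PCM data; the sample rate, channel count, and bit depth shown are arbitrary example values, and the derived fields (BlockAlign and AverageBytesPerSecond) are computed from them as PCM requires.

    using Microsoft.DirectX.DirectSound;

    // Describe 16-bit, stereo, 44.1-kHz PCM data.
    WaveFormat format = new WaveFormat();
    format.FormatTag = WaveFormatTag.Pcm;
    format.Channels = 2;                   // Managed DirectSound supports at most two channels.
    format.SamplesPerSecond = 44100;
    format.BitsPerSample = 16;
    format.BlockAlign = (short)(format.Channels * (format.BitsPerSample / 8));
    format.AverageBytesPerSecond = format.SamplesPerSecond * format.BlockAlign;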

DirectSound does not support compressed WAV formats. Applications should use the Audio Compression Manager (ACM) functions, provided with the Microsoft Win32® application programming interfaces (APIs) in the Microsoft Platform Software Development Kit (SDK), to convert compressed audio to PCM format before writing the data to a sound buffer.

For short sounds, you can create a SecondaryBuffer from a WAV file by using SecondaryBuffer(String,Device) or SecondaryBuffer(String,BufferDescription,Device). To create a buffer from WAV data held in a System.IO.Stream object, you can use the SecondaryBuffer(Stream,Device) or SecondaryBuffer(Stream,BufferDescription,Device) constructors. A sketch of the file-based case follows.
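As a minimal sketch, assuming a Windows Forms application (the variable named form and the WAV file path are placeholders), creating and playing a buffer from a file might look like this:

    using Microsoft.DirectX.DirectSound;

    // Create the device and associate it with the application window.
    Device device = new Device();
    device.SetCooperativeLevel(form, CooperativeLevel.Priority);  // 'form' is the owning Form or Control.

    // Load a short sound entirely into a secondary buffer and play it once.
    SecondaryBuffer sound = new SecondaryBuffer(@"sounds\click.wav", device);
    sound.Play(0, BufferPlayFlags.Default);

    // To control properties such as volume, pass a BufferDescription that requests that control.
    BufferDescription description = new BufferDescription();
    description.ControlVolume = true;
    SecondaryBuffer adjustableSound = new SecondaryBuffer(@"sounds\click.wav", description, device);

Passing a BufferDescription is optional for simple playback; request only the controls you need, because each control the buffer supports adds processing overhead.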

