═══ 1. Notices ═══ First Edition (October 1994) The following paragraph does not apply to the United Kingdom or any country where such provisions are inconsistent with local law: INTERNATIONAL BUSINESS MACHINES CORPORATION PROVIDES THIS PUBLICATION "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Some states do not allow disclaimer of express or implied warranties in certain transactions, therefore, this statement may not apply to you. This publication could include technical inaccuracies or typographical errors. Changes are periodically made to the information herein; these changes will be incorporated in new editions of the publication. IBM may make improvements and/or changes in the product(s) and/or the program(s) described in this publication at any time. It is possible that this publication may contain reference to, or information about, IBM products (machines and programs), programming, or services that are not announced in your country. Such references or information must not be construed to mean that IBM intends to announce such IBM products, programming, or services in your country. Requests for technical information about IBM products should be made to your IBM authorized reseller or IBM marketing representative. ═══ 1.1. Copyright Notices ═══ COPYRIGHT LICENSE: This publication contains printed sample application programs in source language, which illustrate OS/2 programming techniques. You may copy, modify, and distribute these sample programs in any form without payment to IBM, for the purposes of developing, using, marketing or distributing application programs conforming to the OS/2 application programming interface. Each copy of any portion of these sample programs or any derivative work, which is distributed to others, must include a copyright notice as follows: "(C) (your company name) (year). All rights reserved." (C) Copyright International Business Machines Corporation 1994. All rights reserved. Note to U.S. Government Users - Documentation related to restricted rights - Use, duplication or disclosure is subject to restrictions set forth in GSA ADP Schedule Contract with IBM Corp. ═══ 1.2. Disclaimers ═══ References in this publication to IBM products, programs, or services do not imply that IBM intends to make these available in all countries in which IBM operates. Any reference to an IBM product, program or service is not intended to state or imply that only IBM's product, program, or service may be used. Any functionally equivalent product, program, or service that does not infringe any of IBM's intellectual property rights or other legally protectable rights may be used instead of the IBM product, program, or service. Evaluation and verification of operation in conjunction with other products, programs, or services, except those expressly designated by IBM, are the user's responsibility. IBM may have patents or pending patent applications covering subject matter in this document. The furnishing of this document does not give you any license to these patents. You can send license inquiries, in writing, to the IBM Director of Licensing, IBM Corporation, 500 Columbus Avenue, Thornwood NY 10594, U.S.A. ═══ 1.3. 
Trademarks ═══ The following terms, denoted by an asterisk (*) in this publication, are trademarks of the IBM Corporation in the United States or other countries: Audio Visual Connection Common User Access CUA IBM Multimedia Presentation Manager/2 OS/2 Personal System/2 Presentation Manager PS/2 Ultimotion Workplace Shell The following terms, denoted by a double asterisk (**) in this publication, are trademarks of other companies as follows. Other trademarks are trademarks of their respective companies. DVI Intel Corporation Indeo Intel Corporation Philips Philips Electronics N.V. Pioneer Pioneer Electronic Corporation Pro AudioSpectrum 16 Media Vision, Inc. ReelMagic Sigma Designs, Inc. Sony Sony Corporation Sound Blaster Creative Labs, Inc. Windows Microsoft Corporation WinTV Hauppauge Computer Works, Inc. ═══ 2. Introduction ═══ This guide provides information about application interfaces to help you select and implement functions for your OS/2 multimedia applications. It is written for application programmers who are interested in developing OS/2 multimedia applications. ═══ 2.1. Related Publications ═══ The following diagram provides an overview of the OS/2 Technical Library. Books can be ordered by calling toll free 1-800-879-2755 weekdays between 8:30 a.m. and 7:00 p.m. (EST). In Canada, call 1-800-465-4234. ═══ 2.2. Additional Multimedia Information ═══ Multimedia REXX - (online) Describes REXX functions that enable media control interface string commands to be sent from an OS/2 command file to control multimedia devices. This online book is provided with OS/2 multimedia. Guide to Multimedia User Interface Design - (41G2922) Describes design concepts to be considered when designing a CUA multimedia interface that is consistent within a particular multimedia product and across other products. ═══ 2.3. Using This Online Book ═══ Before you begin to use this online book, it would be helpful to understand how you can: o Expand the Contents to see all available topics o Obtain additional information for a highlighted word or phrase o Use action bar choices. How To Use the Contents When the Contents window first appears, some topics have a plus (+) sign beside them. The plus sign indicates that additional topics are available. To expand the Contents if you are using a mouse, select the plus sign (+). If you are using a keyboard, use the Up or Down Arrow key to highlight the topic, and press the plus key (+). To view a topic, double-click on the topic (or press the Up or Down Arrow key to highlight the topic, and then press Enter). How To Obtain Additional Information After you select a topic, the information for that topic appears in a window. Highlighted words or phrases indicate that additional information is available. You will notice that certain words in the following paragraph are highlighted in green letters, or in white letters on a black background. These are called hypertext terms. If you are using a mouse, double-click on the highlighted word. If you are using a keyboard, press the Tab key to move to the highlighted word, and then press the Enter key. Additional information will appear in a window. How To Use Action Bar Choices Several choices are available for managing information presented in the M-Control Program/2 Programming Reference. There are three pull-down menus on the action bar: the Services menu, the Options menu, and the Help menu. The actions that are selectable from the Services menu operate on the active window currently displayed on the screen. 
These actions include the following: Bookmark Sets a place holder so you can retrieve information of interest to you. When you place a bookmark on a topic, it is added to a list of bookmarks you have previously set. You can view the list, and you can remove one or all bookmarks from the list. If you have not set any bookmarks, the list is empty. To set a bookmark, do the following: 1. Select a topic from the Contents. 2. When that topic appears, choose the Bookmark option from the Services menu. 3. If you want to change the name used for the bookmark, type the new name in the field. 4. Select the Place radio button (or press the Up or Down Arrow key to select it). 5. Select OK. The bookmark is then added to the bookmark list. Search Finds occurrences of a word or phrase in the current topic, selected topics, or all topics. You can specify a word or phrase to be searched. You can also limit the search to a set of topics by first marking the topics in the Contents list. To search for a word or phrase in all topics, do the following: 1. Choose the Search option from the Services pull-down. 2. Type the word or words to be searched. 3. Select All sections. 4. Select Search to begin the search. 5. The list of topics where the word or phrase appears is displayed. Print Prints one or more topics. You can also print a set of topics by first marking the topics in the Contents list. You can print one or more topics. You can also print a set of topics by first marking the topics on the Contents list. To print the document Contents list, do the following: 1. Select Print from the Services menu. 2. Select Contents. 3. Select Print. 4. The Contents list is printed on your printer. Copy Copies a topic you are viewing to a file you can edit. You can copy a topic you are viewing into a temporary file named TEXT.TMP. You can later edit that file by using an editor such as the System Editor. To copy a topic, do the following: 1. Expand the Contents list and select a topic. 2. When the topic appears, select Copy to file from the Services menu. The system copies the text pertaining to that topic into the temporary TEXT.TMP file. For information on any of the other choices in the Services menu, highlight the choice and press the F1 key. Options Changes the way the Contents is displayed. You can control the appearance of the Contents list. To expand the Contents and show all levels for all topics, select Expand all from the Options menu. For information on any of the other choices in the Options menu, highlight the choice and press the F1 key. ═══ 3. Multimedia Application Programming Environment ═══ OS/2* multimedia (referred to as MMPM/2 or Multimedia Presentation Manager/2* in previous releases) is the multimedia platform for today because it takes advantage of OS/2 features to provide an effective multimedia environment. OS/2 multitasking capability supports synchronization and concurrent playback of multiple devices. The flat memory model supports the management of large data objects. OS/2 multimedia is also the multimedia platform for tomorrow because of its extendable architecture, which enables new functions, devices, and multimedia data types to be added as the technology of multimedia advances. Because OS/2 multimedia and its devices are architected to support synchronization activities, Presentation Manager* (PM) applications can easily incorporate multimedia function for playing multiple devices concurrently and synchronizing audio and video as media drivers become available. ═══ 3.1. 
Application Programming Model ═══

The application programming model for an OS/2 multimedia application is an extension of the OS/2 Presentation Manager programming model, providing both messaging and procedural programming interfaces. OS/2 multimedia API procedures allow applications to manage data and control devices, and messages from the OS/2 multimedia system notify applications of asynchronous events.

The media control interface provides a view of the OS/2 multimedia system to both application developers and users that is similar to that of a video and audio home entertainment system. Operations are performed by controlling the processors of media information, known as media devices. Media devices can be internal or external hardware devices, or they can be software libraries that perform a defined set of operations by manipulating lower-level hardware components and system software functions.

Multiple media devices can be used in an operation. For example, the playback of an audio compact disc can be implemented by coordinating the control of a compact disc player and an amplifier-mixer device. The Media Device Manager (MDM) shown in the following figure provides resource management for media devices and enables the command message and command string interface. The Media Device Manager provides device independence to an application developer.

[Figure: OS/2 multimedia system structure. Application-level programs (Data Converter, Install, Media Players, Setup, Volume Control, CD Player, Digital Audio) call the Media Device Manager interface layer, which routes commands to media drivers such as the CD-ROM/XA, CD audio, wave audio, sequencer, digital video, amp/mixer, and videodisc drivers. At Ring 3, the media drivers use the MMIO manager with its I/O procedures and the Sync/Stream Manager DLL with its stream handlers (file system, memory, multi-track, audio, video, MIDI mapper, CD-ROM/XA, and CD/DA); the audio stream handler reaches the hardware through a vendor-specific driver (VSD) and a common audio interface. At Ring 0, the Sync/Stream Manager PDD, the file-system (FAT/HPFS/XA), hard disk, diskette, R/W optical, and CD-ROM/XA physical device drivers, the Ring 0 stub device driver, the OEM audio device driver, and the OS/2 COM PDD control the physical devices (hard disk, diskette, optical, and CD-ROM drives, OEM audio hardware, and RS-232 attachments).]

Refer to the OS/2 Multimedia Subsystem Programming Guide for information on multimedia subsystem programming including media drivers and stream handlers.

To assist you in creating a standardized user interface for your OS/2 multimedia application, OS/2 provides multimedia window controls, which have been implemented in OS/2 multimedia applications such as Volume Control. See the following figure. Graphic buttons are two-state buttons that can be toggled up and down. They can display text, or graphics, or both. They can also be animated. Their versatility makes graphic buttons ideal to use for device control panels. Circular sliders lend realism to your panel by providing familiar-looking dials. The dials are easy to operate and do not hog screen real estate. Secondary windows provide a sizeable dialog window to contain your multimedia device controls.

═══ 3.1.1. OS/2 Multimedia Application Requirements ═══

The IBM Developer's Toolkit for OS/2 Version 3 includes the bindings, header files, and libraries for developing OS/2 multimedia applications. A PM message queue is required for all OS/2 multimedia applications because it enables the efficient sharing of devices in the OS/2 multimedia environment. The minimum recommended stack size for an OS/2 multimedia application is 16KB.

All OS/2 multimedia public interfaces, for example error message defines and common definitions, are accessible through the OS2ME.H file. Constants and prototypes for multimedia window control functions, MMIO file services functions, and high-level interfaces are accessible after the following defines are included in your application:

Define                  Services
#define INCL_SW         Window Control Functions
#define INCL_MMIOOS2    MMIO File Services
#define INCL_MACHDR     High-Level Services

OS/2 multimedia applications should link with the MMPM2.LIB library.

Note: OS/2 multimedia header files have naming conventions compatible with the standard OS/2 format. Applications using previous versions of the MMPM/2 header files will still use those header files by default when the applications are compiled.
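As a sketch of the setup just described, a multimedia source file might begin as follows. This is a minimal, hedged example built from the defines listed above; the INCL_WIN define and the empty main routine are illustrative assumptions rather than requirements stated in this guide.

#define INCL_WIN              /* PM functions (illustrative)                */
#define INCL_SW               /* multimedia window control functions        */
#define INCL_MMIOOS2          /* MMIO file services                         */
#define INCL_MACHDR           /* high-level media access services           */
#include <os2.h>
#include <os2me.h>            /* OS/2 multimedia public interfaces          */

int main( void )
{
   /* A real application would create its PM message queue here and  */
   /* then issue media control interface calls.  Link the compiled   */
   /* module with the MMPM2.LIB library.                             */
   return 0;
}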
In order to use the OS/2-consistent header files in an application, define INCL_OS2MM in the program. Defining INCL_OS2MM automatically defines the following: INCL_MCIOS2 MCI-related include files (MCIOS2.H and MMDRVOS2.H) INCL_MMIOOS2 MMIO include file (MMIOOS2.H) All existing applications remain binary compatible. If they are recompiled a choice of which set of headers to use is available. If new header files are used, the source code must be modified to conform to the name changes. ═══ 3.1.2. Extendable Device Support ═══ The system architecture of OS/2 multimedia extensions is designed to be extendable. This level of modularity allows independent development of support for new hardware devices, logical media devices, and file formats. Examples of media control interface devices are listed in the following table. The table shows the logical device types that can be supported and already have media control interface definitions. Devices currently supported by OS/2 multimedia are indicated by (X) marks. ┌──────────────┬──────────┬──────────────┬──────────────────────────────────┐ │Media Device │OS/2 │String │Constant │ │Type │Multimedia│ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Amplifier │X │ampmix │MCI_DEVTYPE_AUDIO_AMPMIX │ │mixer │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Audio tape │ │audiotape │MCI_DEVTYPE_AUDIO_TAPE │ │player │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │CD audio │X │cdaudio │MCI_DEVTYPE_CD_AUDIO │ │player │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │CD-XA player │X │cdxa │MCI_DEVTYPE_CDXA │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Digital audio │ │dat │MCI_DEVTYPE_DAT │ │tape │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Digital video │X │digitalvideo │MCI_DEVTYPE_DIGITAL_VIDEO │ │player │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Headphone │ │headphone │MCI_DEVTYPE_HEADPHONE │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Microphone │ │microphone │MCI_DEVTYPE_MICROPHONE │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Monitor │ │monitor │MCI_DEVTYPE_MONITOR │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Other │ │other │MCI_DEVTYPE_OTHER │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Video overlay │ │videooverlay │MCI_DEVTYPE_OVERLAY │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Sequencer │X │sequencer │MCI_DEVTYPE_SEQUENCER │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Speaker │ │speaker │MCI_DEVTYPE_SPEAKER │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Videodisc │X │videodisc │MCI_DEVTYPE_VIDEODISC │ │player │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Video │ │videotape │MCI_DEVTYPE_VIDEOTAPE │ │tape/cassette │ │ │ │ ├──────────────┼──────────┼──────────────┼──────────────────────────────────┤ │Waveform audio│X │waveaudio │MCI_DEVTYPE_WAVEFORM_AUDIO │ │player │ │ │ │ └──────────────┴──────────┴──────────────┴──────────────────────────────────┘ Note: M-Control Program 2.01, which supports the M-Motion Video Adapter/A, provides overlay extensions for OS/2 multimedia. ═══ 4. 
Media Control Interface ═══

This section describes the services offered to applications by the media control interface for managing devices in the multimedia environment.

═══ 4.1. Command Message and Command String Interfaces ═══

When a user activates a PM control to use a multimedia device function, the OS/2 multimedia application window procedure sends a command to the media control interface. Depending on the needs of the application, the window procedure can use the command message interface or the command string interface to implement these device commands. Messages for the command message interface (also referred to as procedural interface) are sent with mciSendCommand. Strings for the command string interface are sent to the Media Device Manager for parsing, using the mciSendString function. See the following figure.

[Figure: Command message and command string interfaces. Both mciSendCommand and mciSendString enter the Media Device Manager (MDM) interface layer. Strings are routed through a table-driven parser, which interprets them using the default parser tables (system, waveaudio, sequencer, cdaudio, cdxa, ampmix, videodisc, digitalvideo, and other) or a custom table supplied by a media driver. The Media Device Manager then dispatches the resulting command messages to the media drivers.]

The string interface provides access to most functions of the message interface. However, operations that involve identifying multiple devices (for example, for the purpose of establishing connections), or operations that return complex data structures (such as a CD table of contents) are available only through the message interface.

Each time a message is sent to the Media Device Manager with mciSendCommand, flags are set and a pointer to a data structure is passed. Each time a string is passed with mciSendString, it must be converted to the message format understood by the media driver. The Media Device Manager calls the multimedia string parser, which is case insensitive, to interpret the strings. The time required for this conversion process makes the string method of control slightly slower than the message method. However, the string interface generally requires less application code than the command message interface. The string interface also lets users interactively control devices with a command line or PM interface. See Command Strings.

═══ 4.1.1. Command Messages ═══

Command messages are used by the command message interface and specified with mciSendCommand. Most command messages have corresponding string commands that are used by the command string interface and specified with mciSendString. Command messages are sent either to a logical device or to the system. The following table lists the command messages sent to devices.
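For example, suspending playback on an open device can be requested either with the MCI_PAUSE command message or with the corresponding pause string command. The following sketch assumes illustrative names: usDeviceID holds a device ID returned by MCI_OPEN, hwndNotify is an application window handle, and sound1 is an alias given on the open.

/* Command message form: send MCI_PAUSE to an open device context.         */
MCI_GENERIC_PARMS  mgp;
ULONG              ulrc;

mgp.hwndCallback = hwndNotify;                /* window for notifications    */
ulrc = mciSendCommand( usDeviceID,            /* device ID from MCI_OPEN     */
                       MCI_PAUSE,             /* command message             */
                       MCI_WAIT,              /* execute synchronously       */
                       (PVOID) &mgp,          /* generic parameter structure */
                       0 );                   /* user parameter              */

/* Equivalent string command form, assuming the device was opened with      */
/* the alias sound1.                                                         */
CHAR  achReturn[ 128 ];

ulrc = mciSendString( (PSZ) "pause sound1 wait",    /* the string command     */
                      (PSZ) achReturn,              /* return string buffer   */
                      (USHORT) sizeof( achReturn ), /* buffer length          */
                      (HWND) 0,                     /* no notification window */
                      0 );                          /* user parameter         */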
Commands that cause asynchronous responses to be generated, such as cue point and position advise, can be invoked using the appropriate string command; however, their responses are returned to window procedures. ┌────────────────────────┬────────────────────────────────────┐ │Command Messages │ │ │Supported by All Devices│ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_OPEN │Establishes a specific instance of a│ │ │multimedia device or file. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_GETDEVCAPS │Gets the capabilities of a device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_INFO │Gets textual information from the │ │ │device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_STATUS │Gets the current status of the │ │ │device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CLOSE │Closes the device. │ ├────────────────────────┼────────────────────────────────────┤ │Device Setup Command │ │ │Messages │ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SET │Changes the configuration of the │ │ │device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CONNECTOR │Enables, disables, or queries the │ │ │state of a connector. │ ├────────────────────────┼────────────────────────────────────┤ │Playback and Recording │ │ │Command Messages │ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CUE │Prerolls a device for playing or │ │ │recording. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SEEK │Seeks to a specified position in the│ │ │file. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_PLAY │Begins transmitting output data. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_RECORD │Begins recording data from the │ │ │specified position. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_PAUSE │Suspends the playing or recording │ │ │operation. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_RESUME │Resumes the playing or recording │ │ │operation. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_STOP │Stops the playing or recording │ │ │operation. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_LOAD │Loads a data element into a media │ │ │device. An example of a data element│ │ │is a waveform file. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SAVE │Saves the current file to disk. │ ├────────────────────────┼────────────────────────────────────┤ │Synchronization Command │ │ │Messages │ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SET_CUEPOINT │Sets run-time cue points. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SET_POSITION_ADVISE │Advises the application when time │ │ │elapses or position changes. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SET_SYNC_OFFSET │Biases MCI_PLAY starting positions │ │ │and MCI_SEEK target positions for │ │ │group operations. │ ├────────────────────────┼────────────────────────────────────┤ │Device-Specific Command │ │ │Messages │ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CAPTURE │Captures the current video image and│ │ │stores it as an image device │ │ │element. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_ESCAPE │Sends a custom message directly to │ │ │the media driver. 
│ ├────────────────────────┼────────────────────────────────────┤ │MCI_GETIMAGEBUFFER │Gets the contents of the capture │ │ │video buffer or the current movie │ │ │frame. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_GETTOC │Gets a contents structure for the │ │ │currently loaded CD-ROM disc. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_PUT │Sets the source and destination │ │ │rectangles for the transformation of│ │ │the video image. It also sets the │ │ │size and position of the default │ │ │video. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_REWIND │Seeks the media to the beginning │ │ │point. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SETTUNER │Causes the digital video MCD to │ │ │change the frequency the tuner │ │ │device is tuned to. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SPIN │Spins the videodisc player up or │ │ │down. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_STEP │Advances or backs up the videodisc │ │ │player one or more frames. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_WHERE │Returns the source and destination │ │ │rectangles set by MCI_PUT. It also │ │ │returns the size and position of the│ │ │video window. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_WINDOW │Specifies the window in which to │ │ │display video output, and controls │ │ │the visibility of the default video │ │ │window. │ ├────────────────────────┼────────────────────────────────────┤ │Editing Command Messages│ │ ├────────────────────────┼────────────────────────────────────┤ │MCI_COPY │Copies specified data range into │ │ │clipboard or buffer. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CUT │Removes specified data range and │ │ │places it into clipboard or buffer. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_DELETE │Deletes specified data range. │ │ │Clipboard or buffer is not used. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_PASTE │Deletes selected data range if │ │ │difference between FROM and TO is │ │ │more than zero, then inserts data │ │ │from buffer or clipboard. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_REDO │Reverses previous MCI_UNDO command. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_UNDO │Cancels previous RECORD, CUT, PASTE,│ │ │or DELETE. │ └────────────────────────┴────────────────────────────────────┘ The following table lists the system command messages specified with mciSendCommand. ┌────────────────────────┬────────────────────────────────────┐ │MCI_DEVICESETTINGS │Provides a media control interface │ │ │driver the opportunity to insert │ │ │custom settings pages. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_GROUP │Makes and breaks device group │ │ │associations. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_MASTERAUDIO │Sets the system master volume and │ │ │toggles speakers and headphones. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_SYSINFO │Gets and sets device and system │ │ │information. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CONNECTORINFO │Gets information regarding the │ │ │number and types of connectors │ │ │defined for a device. 
│ ├────────────────────────┼────────────────────────────────────┤ │MCI_DEFAULT_CONNECTION │Makes, breaks, or queries default │ │ │connections established for a │ │ │device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_CONNECTION │Gets the device context connection │ │ │or establishes an alias for a │ │ │connected device. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_ACQUIREDEVICE │Acquires a device for use. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_RELEASEDEVICE │Releases a device from use. │ └────────────────────────┴────────────────────────────────────┘

═══ 4.1.2. Command Strings ═══

String commands use a more English-like text format than command messages. Following is the valid syntax for passing string commands directly to the media control interface: This format is used for all string commands except masteraudio, which does not require a device name. The format for the masteraudio command is: An application calls mciSendString to pass the string command to the Media Device Manager for parsing and execution.

The String Test Sample program, provided in the Toolkit (\TOOLKIT\SAMPLES\MM\MCISTRNG), illustrates the interpretive string interface. The following code fragment shows the call to mciSendString in the String Test Sample.

ulSendStringRC = mciSendString( (PSZ) &acMCIString[ 0 ],           /* The MCI String Command     */
                                (PSZ) &acMCIReturnString[ 0 ],     /* Place for return strings   */
                                (USHORT) MCI_RETURN_STRING_LENGTH, /* Length of return space     */
                                hwndDisplayDialogBox,              /* Window to receive notifies */
                                usCountOfMCIStringSent );          /* The user parameter         */

The following is an example of the string commands required to open a CD player and play an entire CD.

open cdaudio01 alias cdaud1 shareable
status cdaud1 media present wait
status cdaud1 mode wait
set cdaud1 time format milliseconds
seek cdaud1 to start
play cdaud1 notify
   .
   .
   .        ** play the entire disc **
   .
   .
   .
close cdaud1

The status commands let the application know if a CD is present and if the drive is ready. Notice that wait flags are used; otherwise the commands would return immediately with no status information. The set command sets the time base to milliseconds for all future commands. The close command is sent after the application receives an MM_MCINOTIFY message at the completion of the play command. Note: The close command can be sent at any time.

Authoring languages that include support for the media control interface can integrate device command strings like these with authoring language syntax to create multimedia presentations. The string interface provides a 16-bit interface to enable developers to integrate multimedia function with the macro languages of existing 16-bit applications.

═══ 4.1.3. Wait and Notify Flags ═══

An application can set a wait or a notify flag on a device command sent with mciSendString or mciSendCommand. These two flags are mutually exclusive and are available on all commands except some system commands.

┌───────────────┬─────────────────────────────────────────────┐ │Flag │Description │ ├───────────────┼─────────────────────────────────────────────┤ │wait │The command is executed synchronously. The │ │ │function waits until the requested action is │ │ │complete before returning to the application.│ ├───────────────┼─────────────────────────────────────────────┤ │notify │The command is executed asynchronously, │ │ │allowing control to be returned immediately │ │ │to the application.
When the requested │ │ │action is complete, an MM_MCINOTIFY message │ │ │is sent to the application window procedure. │ └───────────────┴─────────────────────────────────────────────┘ Note: If a command is issued without a wait flag or notify flag specified, the command is executed asynchronously, and the application is never notified. The wait flag is useful for operations that are conducted quickly, like the playback of short sounds, which the application wants to complete before it continues. The wait flag is also useful for operations that return information, such as device capabilities, because the Media Device Manager parser converts the return code to a meaningful string. However, the conversion occurs only if the wait flag is specified. The wait flag should be used with care when issuing commands from threads that read application input message queues as it ties up the thread, preventing all PM messages in the system from being processed while the command issued with the wait flag is executed. The notify flag is useful for operations that are conducted over a period of time. For example, the playing of a waveform file often can take a while to complete. By specifying the notify flag, an application requests to be notified when processing of the command is complete. The application window procedure can then remain responsive to input queue processing. ═══ 4.1.4. Notification Messages ═══ The system returns asynchronous response messages (notification messages) to applications to indicate events such as completing a media device function or passing ownership of a media device from one process to another. As stated in the previous section, two standard flags available to most messages when using the mciSendCommand function are MCI_WAIT (default) and MCI_NOTIFY. These two flags are mutually exclusive. MCI_WAIT specifies that control does not return to the application until the function has completed. MCI_NOTIFY specifies that control returns immediately to the application and the media control interface is to post a notification message to the window specified in the callback window handle when the command completes processing. If the command was sent using the notify flag and the command action completes, an MM_MCINOTIFY message is returned asynchronously to the application using WinPostMsg. It can have any of the following values: ┌────────────────────────┬────────────────────────────────────┐ │Notification Code │Meaning │ ├────────────────────────┼────────────────────────────────────┤ │MCI_NOTIFY_SUCCESSFUL │The command completed successfully. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_NOTIFY_SUPERSEDED │Another command is being processed. │ ├────────────────────────┼────────────────────────────────────┤ │MCI_NOTIFY_ABORTED │Another command interrupted this │ │ │one. │ └────────────────────────┴────────────────────────────────────┘ Note: If none of the above notification codes are returned, an error code is returned, indicating that the asynchronous processing of the command ended in an error condition. To convert the error code to a textual description of the error, the application calls the mciGetErrorString function. The following code fragment illustrates how the Audio Recorder Sample program, provided in the Toolkit (\TOOLKIT\SAMPLES\MM\RECORDER), handles notification messages. 
case MM_MCINOTIFY:
   /*
    * This message is returned to an application when a device
    * successfully completes a command that was issued with a NOTIFY
    * flag, or when an error occurs with the command.
    *
    * This message returns two values.  A user parameter (mp1) and
    * the command message (mp2) that was issued.  The low word of mp1
    * is the Notification Message Code, which indicates the status of the
    * command like success or failure.  The high word of mp2 is the
    * Command Message which indicates the source of the command.
    */
   usNotifyCode = (USHORT) SHORT1FROMMP( mp1);       /* low-word  */
   usCommandMessage = (USHORT) SHORT2FROMMP( mp2);   /* high-word */

   switch (usCommandMessage)
   {
   case MCI_PLAY:
      switch (usNotifyCode)
      {
      case MCI_NOTIFY_SUCCESSFUL:
         if (eState != ST_STOPPED)
         {
            /*
             * Update the status line with appropriate message.
             */
            UpdateTheStatusLine(hwnd, IDS_STOPPED);
            eState = ST_STOPPED;

            /*
             * Stop the play button animation
             */
            WinSendMsg( hwndPlayPB,          /* Play button handle */
                        GBM_ANIMATE,         /* Animation control  */
                        MPFROMSHORT(FALSE),  /* Animation flag     */
                        NULL );              /* Ignore return data */
         }
         break;

      case MCI_NOTIFY_SUPERSEDED:
      case MCI_NOTIFY_ABORTED:
         /* we don't need to handle these messages. */
         break;

      default:
         /*
          * If the message is none of the above, then it must be
          * a notification error message.
          */
         ShowMCIErrorMessage( usNotifyCode);
         eState = ST_STOPPED;

         /*
          * Stop the play button animation and update the status
          * line with appropriate text.
          */
         WinSendMsg( hwndPlayPB,           /* Play button handle */
                     GBM_ANIMATE,          /* Animation control  */
                     MPFROMSHORT(FALSE),   /* Animation flag     */
                     NULL );               /* Ignore return data */
         UpdateTheStatusLine(hwnd, IDS_STOPPED);
         break;
      }
      break;
   }
   return( (MRESULT) 0);

═══ 4.1.5. Time Formats for Device Commands ═══

Media position and time information are required as input and also returned as output by many multimedia commands. Time formats vary, depending on the device being used and the format of the data being operated on. The default time base for both the procedural and string interfaces is MMTIME. See MMTIME Format. Other time formats, such as milliseconds, are also supported. Time formats used by media control interface devices for measuring time are listed in the following table. The flags shown in the table are set with the MCI_SET command.
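For example, a device context can be switched to millisecond units with MCI_SET, as in the following sketch; usDeviceID is an illustrative variable assumed to hold the ID of an already opened device.

/* Select milliseconds as the time format for subsequent commands on this  */
/* device context.                                                         */
MCI_SET_PARMS  msp;
ULONG          ulrc;

msp.hwndCallback = (HWND) 0;                    /* no notification window   */
msp.ulTimeFormat = MCI_FORMAT_MILLISECONDS;     /* a flag from the table    */

ulrc = mciSendCommand( usDeviceID,              /* device ID from MCI_OPEN  */
                       MCI_SET,                 /* change device settings   */
                       MCI_WAIT | MCI_SET_TIME_FORMAT,
                       (PVOID) &msp,
                       0 );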
┌───────────────┬─────────────────────────┬─────────────────────────┐ │Device │Formats │Flags │ ├───────────────┼─────────────────────────┼─────────────────────────┤ │CD-DA │milliseconds │MCI_FORMAT_MILLISECONDS │ │ │mmtime │MCI_FORMAT_MMTIME │ │ │minutes-seconds-frames │MCI_FORMAT_MSF │ │ │tracks-min-sec-frame │MCI_FORMAT_TMSF │ ├───────────────┼─────────────────────────┼─────────────────────────┤ │CD-XA │milliseconds │MCI_FORMAT_MILLISECONDS │ │ │mmtime │MCI_FORMAT_MMTIME │ ├───────────────┼─────────────────────────┼─────────────────────────┤ │digital video │milliseconds │MCI_FORMAT_MILLISECONDS │ │ │mmtime │MCI_FORMAT_MMTIME │ │ │frames │MCI_FORMAT_FRAMES │ │ │hours-minutes-seconds │MCI_FORMAT_HMS │ │ │hours-min-sec-frames │MCI_FORMAT_HMSF │ ├───────────────┼─────────────────────────┼─────────────────────────┤ │waveform audio │milliseconds │MCI_FORMAT_MILLISECONDS │ │ │mmtime │MCI_FORMAT_MMTIME │ │ │bytes │MCI_FORMAT_BYTES │ │ │samples │MCI_FORMAT_SAMPLES │ ├───────────────┼─────────────────────────┼─────────────────────────┤ │MIDI sequencer │milliseconds │MCI_FORMAT_MILLISECONDS │ │ │mmtime │MCI_FORMAT_MMTIME │ │ │SMPTE 24 │MCI_SEQ_SET_SMPTE_24 │ │ │SMPTE 25 │MCI_SEQ_SET_SMPTE_25 │ │ │SMPTE 30 │MCI_SEQ_SET_SMPTE_30 │ │ │SMPTE 30 │MCI_SEQ_SET_SMPTE_30DROP │ │ │song pointer │MCI_SEQ_SET_SONGPTR │ └───────────────┴─────────────────────────┴─────────────────────────┘ ═══ 4.1.5.1. MMTIME Format ═══ MMTIME is a standard time and media position format supported by the media control interface. This time unit is 1/3000 second, or 333 microseconds. Conversion macros are provided for convenient conversion of other popular time formats to and from this format. MMTIME values are passed as long (32-bit) integer values. To use MMTIME on command messages, send the MCI_SET message specifying the MCI_SET_TIME_FORMAT flag. Use MCI_FORMAT_MMTIME in the ulTimeFormat field of the MCI_SET_PARMS structure. The macros shown in the following figure are available for conversion to and from the MMTIME format. ┌──────────────────────────────┬──────────────────────────────┐ │Conversion to MMTIME │Conversion to Other Formats │ ├──────────────────────────────┼──────────────────────────────┤ │REDBOOKTOMM (ULONG) │REDBOOKFROMMM (ULONG) │ │FPS24TOMM (ULONG) │FPS24FROMMM (ULONG) │ │FPS25TOMM (ULONG) │FPS25FROMMM (ULONG) │ │FPS30TOMM (ULONG) │FPS30FROMMM (ULONG) │ │MSECTOMM (ULONG) │MSECFROMMM (ULONG) │ │HMSTOMM (ULONG) │HMSFROMMM (ULONG) │ └──────────────────────────────┴──────────────────────────────┘ Packed Time Formats The packed time formats described in the following sections require that the application format the ULONG value passed in command message parameter structures. When these values are passed in string commands, any value containing a colon (:) is assumed to be a field-oriented value. For example, if the time format for a CD audio device is set to TMSF, and the value 4:10:00:00 is specified, this value is interpreted as track 4, 10 minutes, 0 seconds, and 0 frames. However, if the value 4100000 is specified, the integer is passed directly, and the assignment to byte fields is quite different. It is not required that a field-oriented value contain specifications for all fields. For example, the following are equivalent specifications for track 4: 4:00:00:00 4:00:00 4:00: 4:00 4: 4 The interpretation of field-oriented values is left-justified with respect to the placement of colons. Values not specified default to zero. 
If a value has a colon, it is subject to field-oriented interpretation, regardless of the time format currently set for the device.

HMSF (SMPTE) Packed Time Format

The HMSF packed time format represents elapsed hours, minutes, seconds, and frames from any specified point. This time format is packed into a 32-bit ULONG value as follows:

┌───────────────┬───────────────┬───────────────┬───────────────┐
│High-Order Byte│Low-Order Byte │High-Order Byte│Low-Order Byte │
├───────────────┼───────────────┼───────────────┼───────────────┤
│Frames         │Seconds        │Minutes        │Hours          │
└───────────────┴───────────────┴───────────────┴───────────────┘

MSF Packed Time Format

The CD-DA MSF time format, also referred to as the Red Book time format, is based on the 75-frame-per-second CD digital audio standard. Media position values in this format are packed into a 32-bit ULONG value as follows:

┌───────────────┬───────────────┬───────────────┬───────────────┐
│High-Order Byte│Low-Order Byte │High-Order Byte│Low-Order Byte │
├───────────────┼───────────────┼───────────────┼───────────────┤
│Reserved       │Frames         │Seconds        │Minutes        │
└───────────────┴───────────────┴───────────────┴───────────────┘

The following macros aid in extracting information in packed MSF format:

Macro              Description
MSF_MINUTE(time)   Gets the number of minutes.
MSF_SECOND(time)   Gets the number of seconds.
MSF_FRAME(time)    Gets the number of frames.

For example, the following code fragment sets the time in ulTime to 6 minutes and 30 seconds (06:30:00).

ULONG ulTime;
   .
   .
   .
MSF_MINUTE(ulTime) = 6;
MSF_SECOND(ulTime) = 30;
MSF_FRAME(ulTime) = 0;

TMSF Packed Time Format

The CD-DA TMSF time format is based on the 75-frame-per-second CD digital audio standard. Media position values in this format are packed into a 32-bit ULONG value as follows:

┌───────────────┬───────────────┬───────────────┬───────────────┐
│High-Order Byte│Low-Order Byte │High-Order Byte│Low-Order Byte │
├───────────────┼───────────────┼───────────────┼───────────────┤
│Frames         │Seconds        │Minutes        │Track          │
└───────────────┴───────────────┴───────────────┴───────────────┘

The following macros aid in extracting information in packed TMSF format:

Macro              Description
TMSF_TRACK(time)   Gets the number of tracks.
TMSF_MINUTE(time)  Gets the number of minutes.
TMSF_SECOND(time)  Gets the number of seconds.
TMSF_FRAME(time)   Gets the number of frames.

For example, the following code fragment sets the time in ulTime to 2 minutes into track 2 (02:02:00:00).

ULONG ulTime;
   .
   .
   .
TMSF_TRACK(ulTime) = 2;
TMSF_MINUTE(ulTime) = 2;
TMSF_SECOND(ulTime) = 0;
TMSF_FRAME(ulTime) = 0;

Note: MSF and TMSF macros can be found in the MCIOS2.H file.

HMS Packed Time Format

The HMS packed time format, representing hours, minutes, and seconds, is packed into a 32-bit ULONG value as follows:

┌───────────────┬───────────────┬───────────────┬───────────────┐
│High-Order Byte│Low-Order Byte │High-Order Byte│Low-Order Byte │
├───────────────┼───────────────┼───────────────┼───────────────┤
│Reserved       │Seconds        │Minutes        │Hours          │
└───────────────┴───────────────┴───────────────┴───────────────┘

═══ 4.2. Opening a Media Device ═══

Media devices are categorized as simple or compound devices. A compound device is an internal device that operates on data objects, such as files, within the system. These data objects are referred to as device elements. A simple device is an external device that does not require a device element. Media device types supported by OS/2 multimedia are shown in the following table.
┌────────────────────┬───────────────┬──────────────────────────────┐ │Logical Device Type │String │Constant │ ├────────────────────┼───────────────┼──────────────────────────────┤ │Amplifier-mixer │ampmix │MCI_DEVTYPE_AUDIO_AMPMIX │ ├────────────────────┼───────────────┼──────────────────────────────┤ │CD-DA player │cdaudio │MCI_DEVTYPE_CD_AUDIO │ ├────────────────────┼───────────────┼──────────────────────────────┤ │CD-XA player │cdxa │MCI_DEVTYPE_CDXA_PLAYER │ ├────────────────────┼───────────────┼──────────────────────────────┤ │Digital video player│digitalvideo │MCI_DEVTYPE_DIGITAL_VIDEO │ ├────────────────────┼───────────────┼──────────────────────────────┤ │MIDI sequencer │sequencer │MCI_DEVTYPE_SEQUENCER │ ├────────────────────┼───────────────┼──────────────────────────────┤ │Waveform audio │waveaudio │MCI_DEVTYPE_WAVEFORM_AUDIO │ │player │ │ │ ├────────────────────┼───────────────┼──────────────────────────────┤ │Videodisc player │videodisc │MCI_DEVTYPE_VIDEODISC │ └────────────────────┴───────────────┴──────────────────────────────┘ Device type constants represent one way of specifying devices in command messages. String names can be specified in either command messages or command strings. To use the string interface to communicate with a device, an application calls mciSendString and passes the textual command open. Following is the syntax used for the textual command: open device_name Parameters for the open command are: ┌────────────────────┬────────────────────────────────────────┐ │Parameters │Description │ ├────────────────────┼────────────────────────────────────────┤ │device_name │Specifies the name of a device or device│ │ │element. │ ├────────────────────┼────────────────────────────────────────┤ │shareable │Indicates the device or device element │ │ │may be shared by other applications. │ ├────────────────────┼────────────────────────────────────────┤ │type device_type │Specifies the device type when │ │ │device_name is a device element. │ ├────────────────────┼────────────────────────────────────────┤ │alias alias │Specifies an alternate name for the │ │ │device. │ └────────────────────┴────────────────────────────────────────┘ Here is an example of the syntax for opening a device: open horns.wav type waveaudio alias sound1 where "horns.wav" is the device element and "waveaudio" is the device type. The system also supports a shortcut version of the syntax: open device_type!element_name The shortcut version of the previous example looks like this: open waveaudio!horns.wav alias sound1 ═══ 4.2.1. File Type Associations ═══ A specific device can have file extensions and .TYPE EAs (Extended Attributes) associated with it. The OS/2 multimedia user can map a file extension or .TYPE EA to a specific device with the Multimedia Setup application located in the Multimedia folder. An OS/2 multimedia subsystem developer writing an installation DLL can map a file extension or .TYPE EA to a device using MCI_SYSINFO_SET_EXTENSIONS or MCI_SYSINFO_SET_TYPES. For an extension or .TYPE EA to be mapped to a device, it must be unique across installation names. For example, the Multimedia Setup application can be used to associate the WAV extension with the waveaudio01 device. The device can then be opened by passing the name of a data element with a WAV extension as a parameter in the open command to mciSendString. Suppose the following string is passed: open honk.wav wait The waveaudio01 device is opened with the data file honk.wav. ═══ 4.2.2. 
Default and Specific Devices ═══ The following table shows some examples of open commands. A default device is opened if only a logical device type (for example, waveaudio) is specified in the open command. The default device for a logical device type can be queried and set by the user with the Multimedia Setup application. The default device also can be queried and set with MCI_SYSINFO by an installation DLL for a media device. A specific device is opened by specifying its name (for example waveaudio01), or by specifying a device element with an extension or .TYPE EA that is associated with the device. ┌────────────────────────┬────────────────────────────────────┐ │Open Command │Description │ ├────────────────────────┼────────────────────────────────────┤ │open waveaudio │Opens a default device of type │ │ │waveaudio. │ ├────────────────────────┼────────────────────────────────────┤ │open waveaudio01 │Opens a specific device of type │ │ │waveaudio. │ ├────────────────────────┼────────────────────────────────────┤ │open foo.xyz │Opens a specific device that is │ │ │associated with the .TYPE EA (if │ │ │any) of foo.xyz; otherwise opens a │ │ │specific device that has a unique │ │ │association with file extension xyz;│ │ │otherwise returns │ │ │MCIERR_INVALID_DEVICE_NAME. │ └────────────────────────┴────────────────────────────────────┘ ═══ 4.2.3. Shareable Flag ═══ By setting the shareable flag for an open request, an application can share an OS/2 multimedia device with other applications. To enable device sharing, the multimedia system posts the MM_MCIPASSDEVICE message with WinPostMsg to the application. The message informs the application the device context is becoming active (MCI_GAINING_USE) or inactive (MCI_LOSING_USE). After the application processes the MCI_GAINING_USE event notification, it can issue device commands. The device context becomes inactive when the MCI_LOSING_USE event notification is received. If the application has specified a notify flag on the open, the receipt of an MM_MCINOTIFY message does not mean the device context is active. When MCI_NOTIFY_SUCCESSFUL is received, the commands status, capability, and info can be issued, because the multimedia system allows these commands to be made to inactive instances. If the application issues a command to an inactive instance and the instance must be active to process the command, the multimedia system returns MCIERR_INSTANCE_INACTIVE. When an application opens a device without setting the shareable flag, the Media Device Manager attempts to acquire the device for the exclusive use of the application. If a device context already exists that was either opened as nonshareable by another application or opened as shareable but then acquired exclusively by another application, the open fails and the application receives the MCIERR_DEVICE_LOCKED error code. The application can subsequently make the device context shareable by issuing an MCI_RELEASEDEVICE message. See Device Sharing By Applications for more detailed information on device sharing. ═══ 4.2.4. Device Alias ═══ When a device is opened, it can be given an alias, or alternate name. The primary use of a device alias is to simplify the specifying of subsequent commands to control the device through the string interface. A device alias is referenced only from the string interface, and it is valid only within the process that opened the device context. 
For example, the following strings can be passed with mciSendString:

open horns.wav alias honk
play honk

A secondary use of the device alias is to differentiate between device contexts opened by the same process. For example:

open horns.wav alias honk
open bells.wav alias ring
play ring wait
play honk wait

Note: The maximum length for an alias is 20 characters. Placing an alias in quotation marks is permitted.

When a device is opened using the string interface, a device context ID is returned. If the application provides a return buffer in the call to mciSendString, the ID can be used to issue commands to the device context using the mciSendCommand interface, when necessary.

═══ 4.2.5. Using the Command Message Interface ═══

To use the command message interface to communicate with a device, an application calls mciSendCommand and passes the command message MCI_OPEN. If the request is successful, a device handle for access to the device context is returned in the usDeviceID field of the MCI_OPEN_PARMS data structure. This handle is retained for use in subsequent message commands. An alias can be specified with the MCI_OPEN_ALIAS flag in the command message MCI_OPEN. Commands can then be issued to the device context by means of the string interface.

The following code fragment shows the opening of devices in the Duet Player I sample program. The hwndCallback field contains the handle of the application's main window so that MM_MCIPASSDEVICE messages are sent to its window procedure when the duet player gains or passes control of the device. The device ID and type fields of the structure are not needed because the audio file name is specified as the element field of the structure. This causes the Media Device Manager (MDM) to open the appropriate device based on the file name extension. Once the MCI_OPEN_PARMS structure is initialized, an MCI_OPEN command is specified with the mciSendCommand function for each separate part of the duet.

/*
 * Open one part of the duet.  The first step is to initialize an
 * MCI_OPEN_PARMS data structure with the appropriate information,
 * then issue the MCI_OPEN command with the mciSendCommand function.
 * We will be using an open with only the element name specified.
 * This will cause the default connection for the data type, as
 * specified in the MMPM.INI file, to be used.
 */
mopDuetPart.hwndCallback = (ULONG) hwnd;     /* For MM_MCIPASSDEVICE */
mopDuetPart.usDeviceID = (USHORT) NULL;      /* this is returned     */
mopDuetPart.pszDeviceType = (PSZ) NULL;      /* using default conn.  */
mopDuetPart.pszElementName = (PSZ) aDuet[sDuet].achPart1;

ulError = mciSendCommand( (USHORT) 0,
                          MCI_OPEN,
                          MCI_WAIT | MCI_OPEN_ELEMENT |
                          MCI_OPEN_SHAREABLE | MCI_READONLY,
                          (PVOID) &mopDuetPart,
                          UP_OPEN);

if (!ulError)                     /* if we opened part 1 */
{
   usDuetPart1ID = mopDuetPart.usDeviceID;

   /*
    * Now, open the other part
    */
   mopDuetPart.pszElementName = (PSZ) aDuet[sDuet].achPart2;

   ulError = mciSendCommand( (USHORT) 0,
                             MCI_OPEN,
                             MCI_WAIT | MCI_OPEN_ELEMENT |
                             MCI_OPEN_SHAREABLE | MCI_READONLY,
                             (PVOID) &mopDuetPart,
                             UP_OPEN);

   if (!ulError)                  /* if we opened part 2 */
   {
      usDuetPart2ID = mopDuetPart.usDeviceID;

═══ 4.3. Memory Playlists ═══

In addition to specifying files or Resource Interchange File Format (RIFF) chunks to be loaded by compound devices, you also can specify memory objects. You create memory objects, for example, to play synthesized audio using the waveform audio media driver. These memory objects can be placed under the control of the memory playlist. The memory playlist is a data structure in an application.
It contains an array of simple, machine-like instructions you formulate, each of which has a fixed format consisting of a 32-bit operation code and three 32-bit operands. Playlist instructions are described in the following table. To have playlist instructions interpreted by the playlist processor, you specify the MCI_OPEN_PLAYLIST flag with the MCI_OPEN command message. This flag indicates that the pszElementName field in the MCI_OPEN_PARMS data structure is a pointer to a memory playlist. Using playlist instructions, you can play audio objects in succession from one or more memory buffers. Instructions include branching to and returning from subroutines within the playlist. In addition, the playlist can be modified dynamically by the application while it is being played. ┌────────────────────┬──────────────────────────────────────────────────┐ │Opcode │Description │ ├────────────────────┼──────────────────────────────────────────────────┤ │BRANCH_OPERATION │Transfers control to another instruction in the │ │ │playlist. │ │ │Operand 1-Ignored. │ │ │Operand 2-The absolute instruction number in the │ │ │playlist to which control is being transferred. │ │ │Because the playlist is defined as an array of │ │ │structures (instruction, operation, and operand │ │ │values) its first instruction is referenced as │ │ │array element, index 0. Therefore, the first │ │ │instruction in the list is 0, the second │ │ │instruction is 1, and so on. │ │ │Operand 3-Ignored. │ │ │Branching out of a subroutine is not prohibited; │ │ │however, it is not recommended because an unused │ │ │return address is left on the stack maintained by │ │ │the playlist processor. │ │ │An application can enable or disable a │ │ │BRANCH_OPERATION by exchanging it with a │ │ │NOP_OPERATION. Operands for a NOP_OPERATION are │ │ │ignored. │ ├────────────────────┼──────────────────────────────────────────────────┤ │CALL_OPERATION │Transfers control to the absolute instruction │ │ │number specified in Operand 2, saving the number │ │ │of the instruction following the CALL for use on a│ │ │RETURN instruction. │ │ │CALL instructions may be nested up to 20 levels. │ │ │Operand 1-Ignored. │ │ │Operand 2-Absolute instruction number in the │ │ │playlist to which control is being transferred. │ │ │Operand 3-Ignored. │ ├────────────────────┼──────────────────────────────────────────────────┤ │CUEPOINT_OPERATION │Causes a cue point data record to be entered into │ │ │the data stream. Note that the cue point is │ │ │relative to the DATA_OPERATION that follows it. │ │ │Operand 1-User-defined parameter to be returned as│ │ │the low word of ulMsgParam1 in the MM_MCICUEPOINT │ │ │message. │ │ │Operand 2-Offset in MMTIME units for the actual │ │ │time the CUEPOINT message should be generated. │ │ │Operand 3-Ignored. │ │ │The MM_MCICUEPOINT message is returned to the │ │ │application as soon as possible after the cue │ │ │point data record is encountered in the data │ │ │stream. The message is sent to the window handle │ │ │specified when the device was originally opened. │ │ │Note: The CUEPOINT instruction is ignored when │ │ │used in a recording operation. │ ├────────────────────┼──────────────────────────────────────────────────┤ │DATA_OPERATION │Specifies a data buffer to be played from or │ │ │recorded into. │ │ │Operand 1-Long pointer to a buffer in the │ │ │application. │ │ │Operand 2-Length of the buffer pointed to by │ │ │Operand 1. │ │ │Operand 3-Current position in the buffer. 
This │ │ │operand is updated by the system during a │ │ │recording or playback operation. For a playback │ │ │operation, it is the number of bytes that have │ │ │been sent to the output device handler. For a │ │ │recording operation, it is the number of bytes │ │ │that have been placed into a user buffer. │ │ │The current position in the buffer is particularly│ │ │important after a recording operation, because │ │ │this field contains the number of bytes of │ │ │recorded data. The remaining bytes in the buffer │ │ │are not valid. This field is initialized to zero │ │ │when the DATA_OPERATION statement is first │ │ │encountered. │ │ │The buffer indicated by the DATA instruction must │ │ │only contain the raw data bytes from the device │ │ │and cannot include any header information. │ │ │Therefore, the precise meaning or format of the │ │ │data is dependent on the current settings of the │ │ │media device. For example, a wave audio data │ │ │element is assumed to have the format PCM or │ │ │ADPCM, number of bits per sample, and so on, that │ │ │is indicated by the settings of the audio device. │ ├────────────────────┼──────────────────────────────────────────────────┤ │EXIT_OPERATION │Indicates the end of the playlist. │ │ │Operand 1-Ignored. │ │ │Operand 2-Ignored. │ │ │Operand 3-Ignored. │ ├────────────────────┼──────────────────────────────────────────────────┤ │LOOP_OPERATION │Controls iteration in a playlist. It is the │ │ │responsibility of the application to initialize │ │ │the current iteration. The current iteration is │ │ │reset to zero following loop termination. │ │ │Operand 1-Number of times the loop is to be │ │ │executed. │ │ │Operand 2-Target instruction to branch to, when │ │ │the loop condition fails. │ │ │Operand 3-Current iteration. │ │ │The last instruction in a loop is a branch back to│ │ │the LOOP_OPERATION. The operation of the │ │ │LOOP_OPERATION instruction is as follows: │ │ │1. If Operand 3 is less than Operand 1, control is│ │ │transferred to the playlist instruction following │ │ │the LOOP instruction, and the iteration count in │ │ │Operand 3 is incremented. │ │ │2. Otherwise, the iteration count is reset to zero│ │ │and control is passed to the instruction specified│ │ │in Operand 2. │ │ │Typically, the application sets the iteration │ │ │count to zero when the playlist is passed to the │ │ │device, but this is not required. The loop │ │ │instruction merely compares the loop count with │ │ │the iteration count. If the iteration count is set│ │ │to a value other than zero when the playlist is │ │ │passed in, it is as if the loop has been executed │ │ │that number of times. Also, if a playback │ │ │operation is stopped, and then the same playlist │ │ │is loaded again, the loop iteration count is not │ │ │initialized by the playlist processor. │ │ │It is the application's responsibility to see that│ │ │iteration count values are what is required when │ │ │switching from play to record, record to play, and│ │ │when changing settings for the data (for example, │ │ │bitspersample, samplespersec, and so on) with the │ │ │set command. These commands cause the playlist │ │ │stream to be destroyed and re-created, and the │ │ │playlist to be reassociated as a new playlist with│ │ │the playlist processor. │ ├────────────────────┼──────────────────────────────────────────────────┤ │MESSAGE_OPERATION │Returns a message to the application during │ │ │playlist processing. │ │ │Operand 1-Ignored. 
│ │ │Operand 2-ULONG that is returned to the │ │ │application in the MM_MCIPLAYLISTMESSAGE message │ │ │MsgParam2. │ │ │Operand 3-Ignored. │ │ │Each time the playlist processor encounters a │ │ │MESSAGE instruction, MM_MCIPLAYLISTMESSAGE is │ │ │returned to the application. MESSAGE instructions │ │ │can be used by the application to trace specific │ │ │points during the execution of the playlist │ │ │processor. The message is sent to the window │ │ │handle specified when the device was originally │ │ │opened. │ │ │This function is not intended to be used for │ │ │timing of data production or consumption │ │ │identified by previously interpreted instructions.│ │ │Do not rely on the MESSAGE instruction to indicate│ │ │precisely when a particular piece of digital audio│ │ │has been played by an audio device; however, the │ │ │MESSAGE instruction can be used to indicate when a│ │ │buffer has been consumed and needs to be refilled.│ ├────────────────────┼──────────────────────────────────────────────────┤ │NOP_OPERATION │Used as a placeholder. │ │ │Operand 1-Ignored. │ │ │Operand 2-Ignored. │ │ │Operand 3-Ignored. │ ├────────────────────┼──────────────────────────────────────────────────┤ │RETURN_OPERATION │Transfers control to the playlist instruction │ │ │following the most recently executed CALL │ │ │instruction. │ │ │Operand 1-Ignored. │ │ │Operand 2-Ignored. │ │ │Operand 3-Ignored. │ └────────────────────┴──────────────────────────────────────────────────┘ ═══ 4.3.1. Clock Sample Program Playlist Example ═══ The data structure in the following figure holds the playlist that is used to play the chimes in the Clock Sample program provided in the Toolkit (\TOOLKIT\SAMPLES\MM\CLOCK). Note that the definitions for the playlist operation codes can be found in the MCIOS2.H file. /* * This double array holds the playlists that will be used to play the * chimes for the clock. Each array has three fields within the * structure: one for the playlist command (32-bit value) and three * operands (32-bit values). The DATA_OPERATION's first operand will * contain the address to the respective waveform buffers. Once the * playlist has been played, the CHIME_PLAYING_HAS_STOPPED message * will be sent so that the application knows that the audio has * finished. * The clock will have a unique chime for each quarter hour. * There are three chime files that are used in different combinations * to create all of the chimes used for the clock. These three files * are CLOCK1.WAV, CLOCK2.WAV, and CLOCK3.WAV. * * The first playlist will play quarter hour chime. This is simply * CLOCK1.WAV. * * The second playlist will play the half hour chime. This * consists of CLOCK1.WAV + CLOCK2.WAV. * * The third playlist will play the three quarter hour chime. This * consists of CLOCK1.WAV + CLOCK2.WAV + CLOCK1.WAV. * * The fourth playlist plays the hour chime. This consists of * CLOCK1.WAV + CLOCK2.WAV + CLOCK1.WAV + CLOCK2.WAV + * (HOUR * CLOCK3.WAV) * The Number of loops to perform for the hour value is kept in * the first operand. This will be set in a later procedure when the * hour of the chime time is known. */ PLAY_LIST_STRUCTURE_T apltPlayList[ NUMBER_OF_PLAYLISTS ] [ NUMBER_OF_COMMANDS ] = { /* * Quarter Hour Chime. */ { DATA_OPERATION, 0, 0, 0, /* Chime file 1. */ MESSAGE_OPERATION, 0, CHIME_PLAYING_HAS_STOPPED, 0, EXIT_OPERATION, 0, 0, 0 }, /* * Half Hour Chime. */ { DATA_OPERATION, 0, 0, 0, /* Chime file 1. */ DATA_OPERATION, 0, 0, 0, /* Chime file 2. 
*/ MESSAGE_OPERATION, 0, CHIME_PLAYING_HAS_STOPPED, 0, EXIT_OPERATION, 0, 0, 0 }, /* * Three Quarter Hour Chime. */ { DATA_OPERATION, 0, 0, 0, /* Chime file 1. */ DATA_OPERATION, 0, 0, 0, /* Chime file 2. */ DATA_OPERATION, 0, 0, 0, /* Chime file 1. */ MESSAGE_OPERATION, 0, CHIME_PLAYING_HAS_STOPPED, 0, EXIT_OPERATION, 0, 0, 0 }, /* * Hour Chime. */ { DATA_OPERATION, 0, 0, 0, /* Chime file 1. < Line 0 >*/ DATA_OPERATION, 0, 0, 0, /* Chime file 2. < Line 1 >*/ DATA_OPERATION, 0, 0, 0, /* Chime file 1. < Line 2 >*/ DATA_OPERATION, 0, 0, 0, /* Chime file 2. < Line 3 >*/ DATA_OPERATION, 0, 0, 0, /* Chime file 3. < Line 4 >*/ LOOP_OPERATION, 0, 4, 0, /* Which line to loop on. < Line 5 >*/ MESSAGE_OPERATION, 0, CHIME_PLAYING_HAS_STOPPED, 0, EXIT_OPERATION, 0, 0, 0 } To prevent lost data, the address range of memory buffers used in DATA operations should not overlap. ═══ 4.3.1.1. Setting up the Playlist ═══ The SetupPlaylist procedure is performed once, during initialization of the Clock Sample program. It calls the procedure CopyWaveformIntoMemory to copy the waveform files into memory buffers. It also initializes the playlist data structure by supplying the address and size of the memory buffers holding the data in the appropriate data structure fields. VOID SetupPlayList( VOID ) { /* * This array keeps the address of each audio chime file. */ static LONG *pulBaseAddress[ NUMBER_OF_CHIME_FILES ]; USHORT usChimeFileId; /* Chime audio file ID. */ ULONG ulSizeOfFile, /* Size of audio file. */ ulMemoryAllocationFlags = PAG_COMMIT | PAG_READ | PAG_WRITE; for(usChimeFileId=0; usChimeFileId ulMarkedStartBytes ) { ulFlags |= MCI_NOTIFY | MCI_FROM | MCI_TO | MCI_CONVERT_FORMAT; /* * Set the from and to items to the beginning and end * of the selected area. */ mcieditstr.ulFrom = ulMarkedStartBytes; mcieditstr.ulTo = ulMarkedEndBytes; } else { /* * Otherwise, nothing in the wave is selected so the flags are * only NOTIFY and CONVERT_FORMAT. */ ulFlags |= MCI_NOTIFY | MCI_CONVERT_FORMAT; /* * Because this is a paste operation without FROM/TO, * we have to SEEK so that the media position is set * to the place that we want to paste. */ if( ulResult = weMciCall( hwnd, MCI_SEEK ) ) return( ulResult ); } ulResult = mciSendCommand( usDeviceID, MCI_PASTE, ulFlags, (ULONG)&mcieditstr, 0 ); return( ulResult ); { The following is an example of using the command string interface with editing commands to create a repeating sound. open test.wav alias a wait copy a from 0 to 3000 wait seek a to end paste a wait paste a wait paste a wait ═══ 4.5. Device Sharing By Applications ═══ The multimedia system supports sharing of physical devices among multiple applications. If a device is capable of being shared; that is, if it can maintain state information, the system can establish a unique device state, much like a Presentation Manager device context, for each application that uses the device. The scope of a device state is defined by each device. The state of a simple device like the digital video player contains information about the current frame position, whether the device is playing or stopped, what its current playback speed is set to, and so on. The state of a compound device can include the name of the currently selected file, RIFF object, and playback position. Media devices vary in their ability to support multiple device contexts concurrently. The different types of device use that are supported by media devices are: o Fixed single-context o Dynamic single-context o Limited multiple-context o Unlimited context. 
The following table contains descriptions and examples of these device use types. ┌─────────────────────────┬─────────────────────────────────────────────┐ │Context Use Type │Description │ ├─────────────────────────┼─────────────────────────────────────────────┤ │Fixed single-context │A fixed single-context device can establish │ │ │only one device context. The state of a │ │ │fixed single-context device cannot be queried│ │ │or set by software. │ │ │An example of a fixed single-context device │ │ │is a video cassette recorder that does not │ │ │report the tape position to the driver. │ ├─────────────────────────┼─────────────────────────────────────────────┤ │Dynamic single-context │A dynamic single-context device is serially │ │ │shareable. That is, the device can be used by│ │ │only one application at a time but can be │ │ │passed from one application to another. A │ │ │device state for each application is saved │ │ │and restored appropriately. │ │ │This is the most common concurrent use type │ │ │for a media device. An example of a dynamic │ │ │single-context device is a CD-ROM player. │ ├─────────────────────────┼─────────────────────────────────────────────┤ │Limited multiple-context │A limited multiple-context device can │ │ │establish multiple device contexts, but the │ │ │number of device contexts is limited by the │ │ │physical device. │ │ │An example of a limited multiple-context │ │ │device is a 4-channel amp-mixer audio device,│ │ │which can concurrently support any of the │ │ │following multiple-contexts: │ │ │Four monaural contexts, two stereo contexts, │ │ │and one stereo and two monaural contexts. │ ├─────────────────────────┼─────────────────────────────────────────────┤ │Unlimited context │An unlimited context device can support an │ │ │arbitrary number of concurrent device │ │ │contexts. The number of concurrent contexts │ │ │is limited only by the resource limits of the│ │ │system. │ └─────────────────────────┴─────────────────────────────────────────────┘ ═══ 4.5.1. Getting Control of a Shared Device ═══ The MM_MCIPASSDEVICE message sent with WinPostMsg by the multimedia system to applications and the MCI_ACQUIREDEVICE message sent by applications with mciSendCommand to the multimedia system provide a device-sharing scheme for the OS/2 multimedia environment. To participate in device sharing, an application issues an MCI_OPEN with the MCI_OPEN_SHAREABLE flag set. The system then attempts to acquire the device for the application. The application must wait until it receives the asynchronous MM_MCIPASSDEVICE message to gain control of the device. The multimedia system sends the MM_MCIPASSDEVICE message to inform the application that the device context is becoming active (MCI_GAINING_USE). Before an application receives an MM_MCIPASSDEVICE message with an event of MCI_GAINING_USE, it can make inquiries about the device and the media. MCI_STATUS, MCI_GETDEVCAPS, MCI_INFO and MCI_CLOSE commands can be sent to an inactive device context. Note: If your application has set an MCI_NOTIFY flag on the open request, notification will be posted to the application before the MM_MCIPASSDEVICE message is sent. However, if the application message queue has other messages already queued, it is possible that the application may receive the MM_MCIPASSDEVICE message before it receives the notification message. The active instance of the application remains active until the application returns from the WinPostMsg (MCI_LOSING_USE). 
This guarantees that the application has an active device context until it returns from WinPostMsg. If the application receives an MM_MCIPASSDEVICE message with an event of MCI_GAINING_USE, it should return immediately. The following code fragment illustrates the device sharing architecture from the Clock Sample program. /* * The next two messages are handled so that the Clock application * can participate in device sharing. Because it opens the devices * as shareable devices, other applications can gain control of the * devices. When this happens, we will receive a pass device * message. We keep track of this device passing in the fPassed * boolean variable. * If we do not have access to the device when we receive an * activate message, then we will issue an acquire device command * to gain access to the device. */ case MM_MCIPASSDEVICE: if (SHORT1FROMMP(mp2) == MCI_GAINING_USE) { fPassed = FALSE; /* Gaining control of device */ } else { fPassed = TRUE; /* Losing control of device */ } return( WinDefSecondaryWindowProc( hwnd, msg, mp1, mp2 ) ); case WM_ACTIVATE: /* We use the WM_ACTIVATE message to participate in device sharing. * We first check to see if this is an activate or a deactivate * message (indicated by mp1). Then, we check to see if we've * passed control of the device that we use. If these conditions * are true, we issue an acquire device command to regain * control of the device, because we're now the active window on * the screen. * * This is one possible method that can be used to implement * device sharing. For applications that are more complex * than this sample program, developers may wish to take * advantage of a more robust method of device sharing. * This can be done by using the MCI_ACQUIRE_QUEUE flag on * the MCI_ACQUIREDEVICE command. */ /* * First we check to see if we've passed control of the device */ if ((BOOL)mp1 && fPassed == TRUE) { mciGenericParms.hwndCallback = hwnd; ulError = mciSendCommand( mciOpenParameters.usDeviceID, MCI_ACQUIREDEVICE, (ULONG)MCI_NOTIFY, (PVOID) &mciGenericParms, (USHORT)NULL); if (ulError) { ShowAMessage(acStringBuffer[IDS_NORMAL_ERROR_MESSAGE_BOX_TEXT-1]; IDS_CHIME_FILE_ERROR, /* ID of message */ MB_OK | MB_INFORMATION | MB_HELP | MB_APPLMODAL | MB_MOVEABLE ); /* Style of msg box. */ } } return( WinDefSecondaryWindowProc( hwnd, msg, mp1, mp2 ) ); Regaining Control of a Shared Device An application regains control of a shared device by issuing the MCI_ACQUIREDEVICE message with mciSendCommand after it has received a WM_ACTIVATE message. The application receives a WM_ACTIVATE message whenever its frame window is activated or deactivated by user selection. The time for the application to regain control of a shared device is during the period its window is activated. A "greedy" application that grabs back a device as soon as it loses it defeats the purpose of the WM_ACTIVATE message processing scheme, which is to give control of a shared device to the application with which the user is interacting. Only dynamic single-context and limited multiple-context devices are acquired by applications. The MCI_ACQUIREDEVICE function does not perform any function for fixed single-context and unlimited-context devices, because device contexts are not saved or restored for these classes of devices. To better understand the allocation of resources to multiple device contexts, imagine a stack of device contexts. The physical device is associated with the topmost device context on the stack. 
Whenever a device context is opened, it is placed on top of the stack, and the physical device is associated with the new device context. When MCI_ACQUIREDEVICE is issued for a particular device context, that device context moves to the top of the stack, and the physical device is associated with it. Closing a device context removes it from the stack.

Queued Acquire Command

Setting the MCI_ACQUIRE_QUEUE flag of the MCI_ACQUIREDEVICE message enables the message to be queued and executed as soon as device resources become available. An application can issue an MCI_ACQUIREDEVICE message and have the device context become active at a later point. This is true whether the MCI_NOTIFY or the MCI_WAIT flag is specified. If the MCI_WAIT flag is specified, the calling thread is blocked until the device context becomes active. If the MCI_ACQUIREDEVICE request can be satisfied immediately, the command is not queued. The acquire command can be used to acquire a device instance when the resource becomes available:

open music1.wave alias wave1 shareable wait
play wave1 notify
   .
   .
   .
   ** During this time a losing use message is received  **
   ** and this instance becomes inactive.                **
   .
   .
   .
acquire wave1 queue notify

If an MCI_ACQUIREDEVICE is queued and an application issues an MCI_RELEASEDEVICE or MCI_CLOSE for that instance, the queued MCI_ACQUIREDEVICE message is canceled.

Releasing the Resource

The release resource command is used in conjunction with the queued acquire command. An application can release a device instance from the active state and make the next available inactive device instance active by setting the MCI_RETURN_RESOURCE flag of the MCI_RELEASEDEVICE message. When a device instance no longer needs its resources, the device instance can give up the resource to another device requesting the resources (with MCI_ACQUIRE_QUEUE). The release command, as shown in the following example, can be used to release an exclusive hold on a device.

open waveaudio alias wave2 shareable wait
acquire wave2 exclusive wait
record wave2 notify
   .
   .
   .
   ** Open the device exclusively to avoid interruptions  **
   ** during recording.                                    **
   .
   .
   .
stop wave2 wait
release wave2 return resource wait

The device instance will not be made active again unless an application issues an MCI_ACQUIREDEVICE message for this device context. This function is ignored if the instance is already in an inactive state. The instance remains active if the resource used by this instance is not required by any other instance.

═══ 4.5.2. Using a Device Exclusively ═══

There are times when an application must retain control of the physical resource, such as for the duration of a recording operation or when the application needs to establish specific settings for the device context. The application can retain control by not specifying the shareable flag with the open request or by setting the MCI_EXCLUSIVE flag of the MCI_ACQUIREDEVICE message. When a device has been acquired for exclusive use, other applications cannot acquire the device until the application using the device closes it or releases it from exclusive use with the MCI_RELEASEDEVICE message. When an application releases a device from exclusive use, it does not lose use of the device until another application acquires it. When an application needs to acquire a device context for exclusive use without acquiring the entire device resource, the application can set the MCI_EXCLUSIVE_INSTANCE flag of the MCI_ACQUIREDEVICE message.
This flag prevents the device context from being made inactive unless the application using the device issues the MCI_CLOSE or MCI_RELEASEDEVICE message. The MCI_EXCLUSIVE_INSTANCE and MCI_EXCLUSIVE flags are mutually exclusive. An instance can be in one of three sharing states: o Instance exclusive o Device exclusive o Fully shareable Using the MCI_EXCLUSIVE_INSTANCE flag places an instance in an instance-exclusive sharing state. Using the MCI_EXCLUSIVE flag places an instance in a device-exclusive sharing state. The MCI_RELEASEDEVICE message places an instance in a fully shareable state. ═══ 4.6. Device Groups ═══ When an OS/2 multimedia application needs to control more than one device at a time, it uses the MCI_GROUP message to group device contexts. The MCI_GROUP_MAKE and MCI_GROUP_DELETE flags are used to make and delete groups. MCI_GROUP_MAKE ties several device instances together so that a single command sent to the group by an application is actually sent to each device instance in the group by the multimedia system. This flag can be combined with any of the other MCI_GROUP flags except MCI_GROUP_DELETE in which case an MCIERR_FLAGS_NOT_COMPATIBLE error code is returned. Device instances must have been previously opened but can be in any mode (such as, playing, stopped, or paused) for this message to be successful. If one or more device IDs are invalid, the MCIERR_INVALID_DEVICE_ID error code is returned. If a device ID or alias refers to a device in another group, the MCIERR_ID_ALREADY_IN_GROUP error code is returned. Once a group has been made, certain command messages sent to the group ID (or alias name) are sent to each device making up that group. Command messages that support groups are: MCI_ACQUIREDEVICE MCI_RELEASEDEVICE MCI_CLOSE MCI_RESUME MCI_CUE MCI_SEEK MCI_PAUSE MCI_SET MCI_PLAY MCI_STOP MCI_RECORD Note: Commands sent to a group must use the MCI_NOTIFY flag. To end a group association, an application uses the MCI_GROUP_DELETE flag of the MCI_GROUP message. None of the other flags of the MCI_GROUP message can be combined with MCI_GROUP_DELETE because the only information required by this flag is a group ID. If any other flags are supplied an MCIERR_FLAGS_NOT_COMPATIBLE error code is returned. The MCIERR_INVALID_GROUP_ID error code is returned if an application passes an invalid ID. When a device in a group is closed, it is removed from the group. When the last device in a group is closed, the group is automatically deleted. Applications can use the MCI_GROUP_ALIAS flag to refer to a group by a name rather than a group ID for use with the mciSendString interface. This flag can only be used with an MCI_GROUP_MAKE flag; the given alias is used to refer to the new group. If the alias is already in use, an MCIERR_DUPLICATE_ALIAS error code is returned. Each string group "make" command must include an alias so the group can later be referred to. The alias follows the group command as shown by the following syntax: group grp1 make (wave1 cd1) wait This defines the alias to be "grp1". The list of device names (members of the group) is delimited by parenthesis and separated by spaces and optional quotation marks. The following syntax is used to delete this group: group grp1 delete wait ═══ 4.6.1. Duet Player Sample Program Example ═══ The following code fragment illustrates the creation of a device group in the Duet Player I sample program. An array is filled with the IDs of opened devices to be associated in the group. 
The application then calls MCI_GROUP to create the group and return a handle.

/* If this is the first time through this routine, then we need to
 * open the devices and make the group.
 *
 * On subsequent calls to this routine, the devices are already open
 * and the group is already made, so we only need to load the
 * appropriate files onto the devices.
 */
{
   /*
    * Open one part of the duet.  The first step is to initialize an
    * MCI_OPEN_PARMS data structure with the appropriate information,
    * then issue the MCI_OPEN command with the mciSendCommand function.
    * We will be using an open with only the element name specified.
    * This causes the default connection for the data type, as
    * specified in the MMPM.INI file, to be used.
    */
   mopDuetPart.hwndCallback   = hwnd;           /* For MM_MCIPASSDEVICE */
   mopDuetPart.usDeviceID     = (USHORT) NULL;  /* this is returned     */
   mopDuetPart.pszDeviceType  = (PSZ) NULL;     /* using default conn.  */
   mopDuetPart.pszElementName = (PSZ) aDuet[sDuet].achPart1;

   ulError = mciSendCommand( (USHORT) 0,
                             MCI_OPEN,
                             MCI_WAIT | MCI_OPEN_ELEMENT |
                             MCI_OPEN_SHAREABLE | MCI_READONLY,
                             (PVOID) &mopDuetPart,
                             UP_OPEN);

   if (!ulError)                                /* if we opened part 1  */
   {
      usDuetPart1ID = mopDuetPart.usDeviceID;

      /*
       * Now, open the other part.
       */
      mopDuetPart.pszElementName = (PSZ) aDuet[sDuet].achPart2;

      ulError = mciSendCommand( (USHORT) 0,
                                MCI_OPEN,
                                MCI_WAIT | MCI_OPEN_ELEMENT |
                                MCI_OPEN_SHAREABLE | MCI_READONLY,
                                (PVOID) &mopDuetPart,
                                UP_OPEN);

      if (!ulError)                             /* if we opened part 2  */
      {
         usDuetPart2ID = mopDuetPart.usDeviceID;

         /*
          * Now we need to create a group.  To do this,
          * we need to fill an array with the IDs of the already open
          * devices that we want to group.  Then we call MCI_GROUP to
          * create the group and return a handle to it.
          */
         ulDeviceList[0] = (ULONG)usDuetPart1ID;
         ulDeviceList[1] = (ULONG)usDuetPart2ID;

         mgpGroupParms.hwndCallback   = (HWND) NULL;    /* Not needed - we're waiting */
         mgpGroupParms.ulNumDevices   = NUM_PARTS;      /* Count of devices           */
         mgpGroupParms.paulDeviceID   = (PULONG)&ulDeviceList;  /* Array of devices   */
         mgpGroupParms.ulStructLength = sizeof (mgpGroupParms);

         ulError = mciSendCommand( (USHORT) 0,
                                   MCI_GROUP,
                                   MCI_WAIT | MCI_GROUP_MAKE | MCI_NOPIECEMEAL,
                                   (PVOID) &mgpGroupParms,
                                   UP_GROUP);

         fFirstPlay = FALSE;

═══ 4.6.2. Resource Allocation ═══

An application avoids piecemeal resource allocation problems by setting the MCI_NOPIECEMEAL flag of the MCI_GROUP message. This flag specifies that the associated group is treated as a whole rather than as several separate instances. If one of the device instances becomes inactive, all the instances in the group go inactive. This flag can only be combined with the MCI_GROUP_MAKE flag, because it specifies the nature of the group to be created. If the MCI_NOPIECEMEAL flag is set during creation and one or more of the instances in the list of IDs or aliases is already inactive, the entire group goes inactive and each device in the group saves its state. The device contexts in the group can subsequently be restored by passing the group device context ID with the MCI_ACQUIREDEVICE message, using mciSendCommand. If the MCI_NOPIECEMEAL flag is not specified and devices are lost, the application retains control over the remaining devices in the group, unless one of the lost devices is the master of the group. When the master of a group of devices is lost, the group is lost.

═══ 4.7. Event Synchronization ═══

Applications can perform event synchronization by taking an action at a specified point during the playback of a data object.
There are two ways an application can do this: o The application can request to be notified when a specified point in playback is encountered by sending the MCI_SET_CUEPOINT message to the multimedia system. When this cue point is encountered, the multimedia system sends an MM_MCICUEPOINT message to the application. o The application can request notification on a periodic basis, based on time or position, by sending the MCI_SET_POSITION_ADVISE message to the multimedia system. As each time period (or position) specified passes, the multimedia system sends an MM_MCIPOSITIONCHANGE message to the application. ═══ 4.7.1. Cue Points ═══ Cue points are discrete locations or time positions in a media device. When a device encounters a time position associated with a cue point, a message is returned to the application window handle that is specified to receive the cue point messages. Cue points are maintained as part of a device context, so setting a cue point in one device context will not cause cue point messages to be generated for other device contexts. Applications specify cue points for a device with the MCI_SET_CUEPOINT message. A cue point is identified by its location; setting a cue point "on" sets a cue point at the specified location, and setting a cue point "off" removes the cue point. Because cue points are identified by location, only one cue point can be set at a specified location in the media. Therefore, setting a cue point at a location where a cue point is already set causes the second MCI_SET_CUEPOINT to fail and to return the error MCIERR_DUPLICATE_CUEPOINT. Cue points can be set at any valid location in the media, regardless of current media position. If a device is currently playing at 2:00 (two minutes), and a cue point is set at 1:00 (one minute) in the media, and the device is subsequently seeked and played from the beginning, the cue point message will be generated when the device passes the 1:00 point in the media. Cue points are persistent. That is, they remain set after they are encountered. The device will generate cue point messages whenever the cue point location is encountered, which may be many times if the device is seeked or played repeatedly. Cue points are encountered only when a device is playing or recording. If a device is seeked from its current position to some new position, cue points set at locations between the old and new position are not encountered during the seek operation, and no cue point messages are generated. Because cue points can be set only within the valid range of a media element, cue points cannot be set when a file is not loaded. All cue points for a device context are removed when a new file element is loaded. Cue points also can be created as part of a media element. In the case of cue points imbedded directly in a media element, the MCI_SET_CUEPOINT message performs no function. Imbedded cue points always result in cue point messages being returned when they are encountered. The user parameter value returned on the cue point message varies from one media data type to another and should be set to a meaning that is significant to the application. When a cue point is encountered, an MM_MCICUEPOINT message is sent to the window specified by the hwndCallback field of the MCI_CUEPOINT_PARMS data structure passed with the MCI_SET_CUEPOINT message. 
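The following fragment is a minimal sketch (not taken from a sample program) of setting such a cue point with the command message interface. It follows the calling pattern used elsewhere in this chapter; the MCI_SET_CUEPOINT_ON flag and the ulCuepoint and usUserParm field names are assumptions patterned on the MCI_SET_POSITION_ADVISE example in Position Advises, so verify them against MCIOS2.H and the MCI_SET_CUEPOINT reference before use.

MCI_CUEPOINT_PARMS  mcpCuePoint;          /* parms for MCI_SET_CUEPOINT        */
ULONG               ulError;

mcpCuePoint.hwndCallback = hwndMainDlg;   /* window to receive MM_MCICUEPOINT  */
mcpCuePoint.ulCuepoint   = ulCuePosition; /* assumed field: cue point location */
                                          /* in the currently set time format  */
mcpCuePoint.usUserParm   = usCuePointUP;  /* assumed field: returned with the  */
                                          /* MM_MCICUEPOINT message            */

ulError = mciSendCommand( usDeviceID,                        /* device handle  */
                          MCI_SET_CUEPOINT,                  /* set cue point  */
                          MCI_NOTIFY | MCI_SET_CUEPOINT_ON,  /* assumed flag   */
                          (PVOID) &mcpCuePoint,
                          UP_CUEPOINT );                     /* hypothetical   */
                                                             /* user parameter */

Issuing the same call with the corresponding "off" flag at the same location would remove the cue point again.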
The MM_MCICUEPOINT message parameters contain the device ID of the device context that generated the cue point message, as well as the media position and an additional application-defined parameter that can be specified when the cue point is set. Although the media position specified by the application on the MCI_SET_CUEPOINT message is in the currently set device units, the media position returned on the MM_MCICUEPOINT message is always in MMTIME units. MMTIME units are used because the time format set when the cue point is set might not be the same time format set when the cue point is encountered. The maximum number of cue points that can be set in a device context is defined by the implementation of the logical device. Devices generally support up to 20 cue points per device context. ═══ 4.7.2. Position Advises ═══ In addition to notification messages at discrete locations in the media, periodic notification of elapsed media time can also be requested. These periodic messages, referred to as "position advise" messages, are requested for a device context based on a specified time interval. Position advise messages are requested by issuing the MCI_SET_POSITION_ADVISE message to a device context as shown in the following code fragment. MCI_OPEN_PARMS mop; static MCI_PLAY_PARMS mpp; /* parms for MCI_PLAY */ static MCI_POSITION_PARMS mppPos; /* parms for MCI_SET_POSITION_ADVISE */ iState = ST_PLAYING; /* Set state to reflect play mode */ mppPos.hwndCallback = hwndMainDlg; mppPos.ulUnits = 1500; /* Request position advise messages */ mppPos.usUserParm = usPositionUP; mppPos.Reserved0 = 0; mciSendCommand ( mop.usDeviceID, MCI_SET_POSITION_ADVISE, MCI_NOTIFY | MCI_SET_POSITION_ADVISE_ON, (PVOID) &mppPos, UP_POSITION ); This causes MM_MCIPOSITIONCHANGE messages to be returned to the application window specified in the MCI_POSITION_PARMS structure at the requested frequency as media time passes in the device context. Only one position advise frequency may be active for a device context, and having position advise notification active in one device context does not cause messages to be generated in other device contexts. Position advise messages can be set only when a device element is loaded in the device context, and are reset when a new device element is loaded. Like MM_MCICUEPOINT messages, MM_MCIPOSITIONCHANGE message parameters contain the device ID of the device context that generated the position advise message, as well as the media position and an additional application-defined parameter that can be specified when the position advise notification is requested. Although the media position interval (frequency) specified by the application on the MCI_SET_POSITION_ADVISE message is in the currently set device units, the media position returned on the MM_MCIPOSITIONCHANGE message is always in MMTIME units. MMTIME units are used because the time format set when the position advise notification is set might not be the same time format set when the position advise notification messages are returned. Position advise notifications are generated only during playback or recording. MM_MCIPOSITIONCHANGE messages are usually not generated during seek operations initiated by the application. The exception is a device, such as a tape recorder, that has a discernible position during the seek operation. A device like this can generate position advise messages as the media is traversed, to indicate the progress of the seek operation. 
The following code fragment shows how the Caption Sample application handles the MM_MCIPOSITIONCHANGE message. When the Caption Sample application receives a position change message, it updates its media position slider arm allowing the application to advance the media position slider smoothly as the audio plays. case MM_MCIPOSITIONCHANGE: /* * This message will be returned (in MMTIME) to the application * whenever the audio position changes. This time will be used to * increment the audio position slider. This message is only * generated during playback. */ if ( eState == ST_PLAYING ) { ulTime = (ULONG) LONGFROMMP(mp2); /* * Get the new slider arm position and set it. */ sArmPosition = (SHORT) ( ( ulTime * ( sAudioArmRange - 1) ) / ulAudioLength ); WinSendMsg( hwndAudioSlider, SLM_SETSLIDERINFO, MPFROM2SHORT( SMA_SLIDERARMPOSITION, SMA_RANGEVALUE ), MPFROMSHORT( sArmPosition )); } return 0; ═══ 4.8. System Values ═══ The OS/2 multimedia system provides a number of system-wide values that can be queried and set by applications. Because OS/2 multimedia applications such as Volume Control and Multimedia Setup allow users to set system values, it is recommended that applications only query the settings users have selected. The following table describes the system values that can be queried and set using mciQuerySysValue and mciSetSysValue. ┌────────────────────┬─────────────────────────────────────────────┐ │System Value │Description │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_CLOSEDCAPTION │Query or set the current state of a │ │ │captioning flag. │ │ │By querying the setting of this flag, an │ │ │application can determine whether to display │ │ │text along with audio, for example, for a │ │ │hearing-impaired user. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_MASTERVOLUME │Query or set the current master audio level. │ │ │This value acts as a "multiplier" of the │ │ │individual volume levels of each device │ │ │context, allowing one application to control │ │ │the volume for a number of open devices or │ │ │elements. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_HEADPHONES │Reserved for future use. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_SPEAKERS │Reserved for future use. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_WORKPATH │Query or set the directory for storing of │ │ │temporary files by the media driver. │ │ │This value can be used to point to, for │ │ │example, a directory on the hard disk that │ │ │holds waveform data from a recording │ │ │operation. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_SYSQOSERRORFLAG │Query the Quality of Service (QOS) error │ │ │flag. │ │ │By querying this flag, an application can │ │ │determine an error occuring during band-width│ │ │reservation. │ ├────────────────────┼─────────────────────────────────────────────┤ │MSV_SYSQOSVALUE │Query or set the QOS specification value. │ │ │This system-wide Quality of Service (QOS) │ │ │specification value is used for band-width │ │ │reservation (for example, bytes per second) │ │ │over the network. │ └────────────────────┴─────────────────────────────────────────────┘ The following code fragment demonstrates how to obtain the multimedia work path. 
CHAR  szWorkPath[CCHMAXPATH];    /* Work path for temporary files */

if ( mciQuerySysValue( MSV_WORKPATH, szWorkPath ) )
{
   /* mciQuerySysValue was successful; szWorkPath now  */
   /* contains the multimedia work path.               */
}

═══ 4.8.1. Clock Sample Program Caption Query ═══

When it is time to chime the clock, the Clock Sample program checks the system captioning flag to determine whether or not it should display a visual chime while the audio chime is playing. The Clock program sets the global variable fClosedCaptionIsSet to store the value of the system captioning flag.

/*
 * If the Captioning Flag indicates that the bell should be
 * animated (swung), the region of the presentation space
 * that contains the bell bit map is to be invalidated so that a
 * WM_PAINT will be sent to draw the bells.
 */
mciQuerySysValue( MSV_CLOSEDCAPTION, (PVOID)&fClosedCaptionIsSet );

═══ 5. Multimedia Logical Devices ═══

OS/2 multimedia represents audio adapters, CD-ROM drives, videodiscs, and other real hardware devices as logical media devices that are managed by the Media Device Manager (MDM). Media devices are a logical representation of the function available from either a real hardware device, software emulation in combination with real hardware, or pure software emulation. The actual implementation is not relevant to an application, because the multimedia system provides device independence with the mciSendCommand and mciSendString interfaces. The following logical devices are currently supported in this release of OS/2. Additional media devices may be available from IBM or from other companies, because OS/2 multimedia is completely extensible at all levels.

o Amplifier mixer
o Waveform audio
o MIDI sequencer
o CD audio
o CD-XA
o Videodisc
o Digital video

Frequently, there is a one-to-one correspondence between a real hardware device, such as a CD-ROM drive, and its associated media device. Other hardware may be represented as multiple logical devices. For example, a multi-function audio adapter can be represented as waveform audio, MIDI sequencer, and amplifier-mixer media devices. The following sections describe the function and typical use of each media device, plus the software model presented to the application developer.

═══ 5.1. Multimedia Information and OS/2 Multimedia Connectors ═══

A connector is a software representation of the physical way in which multimedia data moves from one device to another. Simple examples are the headphone jack on a CD-ROM player, or the speakers jack on an audio adapter. If an audio card has both a speaker and a line OUT jack, it is desirable to let an application choose the destination of the audio, while remaining independent from the actual hardware implementation. OS/2 multimedia connectors provide this function by allowing an application to query which connectors are supported by a logical device, and to manipulate whether or not information is flowing through the connector. The connectors for a logical device can be accessed either by number or by a symbolic connector type. When specifying a symbolic type such as microphone or line IN, a number can also be specified to select the first connector, second connector, and so on, of that specific connector type. The MCI_CONNECTORINFO message can be used to determine which connectors are supported by a device, whereas the MCI_CONNECTOR message can be used to enable, disable, or query the state of a particular connector.
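Connector manipulation is often a one-time setup step, so the string interface is a convenient way to issue it. The following fragment is a sketch rather than sample-program code: the device name ampmix, the alias mix, and the wording of the connector command items (enable type headphones) are assumptions that should be checked against the connector string command reference.

CHAR   achRetBuf[128];                  /* return buffer for mciSendString     */
ULONG  ulError;

/* Open the default amplifier-mixer device through the string interface.      */
ulError = mciSendString( "open ampmix alias mix shareable wait",
                         achRetBuf, (USHORT) sizeof(achRetBuf),
                         (HWND) NULL, 0 );
if (!ulError)
{
   /* Route the audio to the headphones jack (assumed command wording),       */
   /* then close the device context again.                                    */
   mciSendString( "connector mix enable type headphones wait",
                  achRetBuf, (USHORT) sizeof(achRetBuf), (HWND) NULL, 0 );

   mciSendString( "close mix wait",
                  achRetBuf, (USHORT) sizeof(achRetBuf), (HWND) NULL, 0 );
}

The same operations can be performed through the command message interface by sending the MCI_CONNECTOR message to the device handle returned from the open.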
Although connectors are typically associated with the representation of externally visible audio and video jacks on multimedia equipment, another category of connectors can represent the flow of information within a computer. For example, a connector on an audio adapter can be attached to the internal PC speaker. A more subtle example is the flow of digital audio information into an audio adapter. This information could come from a file, system memory, or another device. Connectors of this category are referred to as stream connectors to convey the idea of a logical stream of information flowing from one device to another. ═══ 5.2. Connector Types ═══ Each connector is defined by specific type or name, so that applications can make requests symbolically instead of using an absolute connector number that is device dependent. If an application specifies only a connector type, then the default connector of that type is selected on the device. Both a type and a number may be specified to select connectors when more than one connector of the same type exists on a device. The following table describes connector types and typical uses. ┌──────────────────────────┬─────────────┬───────────────────────────────────┐ │Connector Type │Name │Description │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_MIDI_STREAM_CONNECTOR │midi stream │Digital input or output for the │ │ │ │sequencer device. This information│ │ │ │is typically streamed to an │ │ │ │amplifier-mixer logical device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_CD_STREAM_CONNECTOR │cd stream │Digital output from a CD-ROM drive │ │ │ │capable of reading the CD-DA data │ │ │ │directly off the disc. This │ │ │ │information is typically streamed │ │ │ │to an amplifier-mixer logical │ │ │ │device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_XA_STREAM_CONNECTOR │xa stream │The flow of digital audio │ │ │ │information from a CD-ROM/XA drive │ │ │ │capable of streaming the ADPCM data│ │ │ │internally. This information is │ │ │ │typically streamed to an │ │ │ │amplifier-mixer logical device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_WAVE_STREAM_CONNECTOR │wave stream │Digital input or output for the │ │ │ │waveaudio device. This information │ │ │ │is typically streamed to an │ │ │ │amplifier-mixer logical device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_AMP_STREAM_CONNECTOR │amp stream │The flow of information to an │ │ │ │amplifier-mixer device. Typically │ │ │ │this information comes from another│ │ │ │logical device and can include │ │ │ │MIDI, and various kinds of digital │ │ │ │audio. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_HEADPHONES_CONNECTOR │headphones │The connector on the device labeled│ │ │ │for, or typically used, to attach │ │ │ │headphones to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_SPEAKERS_CONNECTOR │speakers │The connector on the device labeled│ │ │ │for, or typically used, to attach │ │ │ │speakers to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_MICROPHONE_CONNECTOR │microphone │The connector on the device labeled│ │ │ │for, or typically used, to attach a│ │ │ │microphone to the device. 
│ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_LINE_IN_CONNECTOR │line in │The connector on the device labeled│ │ │ │for, or typically used to provide │ │ │ │line level input to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_LINE_OUT_CONNECTOR │line out │The connector on the device labeled│ │ │ │for, or typically used, to provide │ │ │ │line level output from the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_VIDEO_IN_CONNECTOR │video in │The connector on the device labeled│ │ │ │for, or typically used to provide │ │ │ │video input to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_VIDEO_OUT_CONNECTOR │video out │The connector on the device labeled│ │ │ │for, or typically used, to provide │ │ │ │video output from the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_PHONE_SET_CONNECTOR │phone set │The connector on the device labeled│ │ │ │for, or typically used, to attach a│ │ │ │phone set to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_PHONE_LINE_CONNECTOR │phone line │The connector on the device labeled│ │ │ │for, or typically used, to attach │ │ │ │an external phone line to the │ │ │ │device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_AUDIO_IN_CONNECTOR │audio in │The connector on the device labeled│ │ │ │for, or typically used, to provide │ │ │ │audio input to the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_AUDIO_OUT_CONNECTOR │audio out │The connector on the device labeled│ │ │ │for, or typically used, to provide │ │ │ │audio output from the device. │ ├──────────────────────────┼─────────────┼───────────────────────────────────┤ │MCI_UNIVERSAL_CONNECTOR │universal │A connector on a device which does │ │ │ │not fall into any of the other │ │ │ │categories. This connector type │ │ │ │may be used to access device │ │ │ │dependent function. The │ │ │ │manufacturer of the device should │ │ │ │define the exact use of this │ │ │ │connector. │ └──────────────────────────┴─────────────┴───────────────────────────────────┘ ═══ 5.3. A Connector Example Using the IBM M-Audio Adapter ═══ The following figure illustrates how the capabilities of an audio card might be modeled as an OS/2 amplifier-mixer device using connectors. This example uses the IBM M-Audio Capture and Playback Adapter, however a model can easily be defined for any manufacturer's audio card. The number and type of connectors may vary. In this particular model, the speakers(1) connector is the default speakers connector and represents the physical speaker jack on the M-Audio card. The speakers(2) connector represents the internal connection to the PC speaker. The amp stream connector represents the flow of digital information to and from the audio card. ═══ 5.4. Establishing Connections between Devices ═══ A connection is the establishment of a flow of information from one device connector to a compatible connector on another device. One example of a connection is the attachment of a speaker to an audio card with speaker wire. Typically, an application might enable the speaker connector on an audio card, causing the flow of information out of the speaker connector. If you have connected the wire, then the audio is heard. 
In this case the multimedia system must rely on a person to make the actual connection. Another category of connection exists where the connection is made internally in the computer, typically through the transfer of digital information from one media device to another. For instance, the waveaudio media device has a connection to its associated amplifier mixer device. ═══ 5.5. Default and Device Context Connections ═══ Device connections are usually automatically established by the media device when the device is opened. The choice of connection is determined by a default established during installation of the media driver, and can be re-established using the MCI_DEFAULT_CONNECTION message. Once opened, the media device may open and connect to another media device to provide the complete function of the originally opened device to the application. This is transparent to the calling application. One example is the waveaudio device, which uses a connected amplifier-mixer device to actually produce sound from the digital audio stream. The waveaudio device also uses the services of the amplifier-mixer device to set the volume. While some services of the connected device can be surfaced in the definition of the originally opened device, the connected device can also provide some extended features beyond those required by the original device. If the application wishes to access these extended features, it can get the handle to the particular device context or instance of the connected device, using the MCI_CONNECTION message. Note the subtle difference between a default connection and a device context connection. A default connection is the name of a connected device, whereas a device context connection is the actual handle to a particular instance of an opened device. An example of this is a waveaudio01 device that has a default connection to an ampmix01 device. When the waveaudio01 device is opened, it automatically opens the ampmix01 device, creating an instance of each device. Because devices may be shared in OS/2 multimedia, the waveaudio01 device can be opened again by another application and two new instances will be created. Although the default connection is the same in both cases, the device context connections are different. ═══ 5.6. Connectors Supported by Media Drivers ═══ Each implementation of a media device defines the connectors that it supports. This information is maintained in the media driver and can vary with the underlying hardware. The following table lists each media device and the connector types applicable to each device. The actual number and types of connectors in each device can vary from one implementation to another. 
┌───────────────┬─────────────────────────────────────────────┐
│Device Type    │Connectors                                   │
├───────────────┼─────────────────────────────────────────────┤
│ampmix         │amp stream, headphones, speakers, microphone,│
│               │line in, line out                            │
├───────────────┼─────────────────────────────────────────────┤
│cdaudio        │cd stream, headphones                        │
├───────────────┼─────────────────────────────────────────────┤
│cdxa           │xa stream                                    │
├───────────────┼─────────────────────────────────────────────┤
│digitalvideo   │headphones, speakers, microphone, line in,   │
│               │line out, audio in, audio out                │
├───────────────┼─────────────────────────────────────────────┤
│headphones     │audio in                                     │
├───────────────┼─────────────────────────────────────────────┤
│microphone     │audio out                                    │
├───────────────┼─────────────────────────────────────────────┤
│sequencer      │midi stream, headphones, speakers, line out  │
├───────────────┼─────────────────────────────────────────────┤
│videodisc      │video out, line out                          │
├───────────────┼─────────────────────────────────────────────┤
│speakers       │audio in                                     │
├───────────────┼─────────────────────────────────────────────┤
│waveaudio      │wave stream, headphones, speakers,           │
│               │microphone, line in, line out                │
└───────────────┴─────────────────────────────────────────────┘

═══ 5.7. Allowable Connections for Connector Types ═══

For a connection to exist between two media devices, the connectors on each device must be of compatible types. For instance, a connection could be established between the line OUT connector on one device and the line IN connector on another device; however, a connection between line OUT and video IN is prohibited. This mechanism eliminates the majority of incorrect connections. The following table lists the allowable connections based on connector types. These connections are allowed in either direction. Note that the table should be read as meaning that a device that has a connector of the type in the first column can connect to a device that has a connector of the type in the second column.

┌───────────────┬───┬───────────────┐
│amp stream     │<->│wave stream    │
│               │   │midi stream    │
│               │   │cd stream      │
│               │   │xa stream      │
├───────────────┼───┼───────────────┤
│line in        │<->│line out       │
│               │   │audio out      │
├───────────────┼───┼───────────────┤
│audio in       │<->│headphones     │
│               │   │speakers       │
│               │   │line out       │
├───────────────┼───┼───────────────┤
│video in       │<->│video out      │
├───────────────┼───┼───────────────┤
│headphones     │<->│audio in       │
│               │   │line in        │
├───────────────┼───┼───────────────┤
│speakers       │<->│audio in       │
│               │   │line in        │
├───────────────┼───┼───────────────┤
│microphone     │<->│audio out      │
├───────────────┼───┼───────────────┤
│phone line     │<->│phone set      │
└───────────────┴───┴───────────────┘

The connections in the table shown above are based on connector types, not device types. The list of connections might appear incorrect if the connector types are misinterpreted as device types. For example, "headphones <-> audio in" might be misinterpreted as meaning that headphones can be connected to an audio input. Referring to the table in Connectors Supported by Media Drivers, however, we see that the HEADPHONES device has a connector of type audio in. This connector is analogous to the plug on the headphones and, from the perspective of the headphones, it is input. We also see that the AMPMIX and CDAUDIO devices have connectors of type headphones. The connection is correct, because headphones can be connected to either the CD player (CDAUDIO) or to the audio adapter (AMPMIX).

═══ 6.
═══ 6. Amplifier-Mixer Device ═══

The OS/2 amplifier-mixer (ampmix) device is similar to a home stereo amplifier-mixer. Components are plugged into the amplifier-mixer so that audio signals can be transferred to a pair of attached speakers, headphones, or perhaps another device. A comparable example of connecting to another device is playing an old phonograph record and recording the sound on a new DAT (Digital Audio Tape) deck. The ampmix is the center of all audio signals and provides input or output switching and sound-shaping services such as volume, treble, or bass control.

The logical ampmix device in OS/2 supports both analog and digital devices. Other OS/2 multimedia logical devices may be connected to the ampmix device. Similar to the previous example, the CD audio logical device could provide an analog input to the ampmix device, and that input could then be recorded by the digital waveform audio device. A logical ampmix device and the audio adapter together perform all the functions surfaced by the ampmix device.

Two connectors are of particular importance: the speakers connector and the amp stream connector. Although there is actually no visible speaker jack on the back of the audio card, it is a convenient fiction for an application to view the PC internal speaker as another set of speakers that might be plugged into the back of the audio card. Using the previously defined concept of a connector, an application can view all flows of information into and out of the ampmix device in a similar fashion. Selecting the internal speaker as opposed to the external speakers may require the ampmix device to issue a completely different set of instructions to the actual hardware device. The application, however, remains completely device independent.

The other features of the ampmix device are provided either by issuing commands to the hardware device or by emulating the function in software. One example of software emulation is support for changing the volume over a period of time, or fade in/fade out. The audio card may only support setting the volume to a particular value; however, the ampmix device can send a series of values to achieve the fade effect.

═══ 6.1. The Amp Stream Connector ═══

The amp stream connector represents the flow of digital information to and from the ampmix device. Again similar to home stereo amplifier-mixers, the ampmix device by itself is not especially notable until another device is attached. Information is transferred from the attached device and played back on a pair of attached speakers. Note that the ampmix device is a conduit of information and relies on another device to provide the flow of information. Therefore, commands for the transport of information (such as play, seek, or stop) are sent to the attached device. Commands for transforming the information (such as treble or bass) are sent directly to the ampmix device.

As a convenience for applications, the attached device will provide volume control, so that the application need not use ampmix functions directly unless some advanced audio functions are required. The volume command is transparently routed to the attached ampmix device. If the application needs to talk directly to the ampmix device, the value of the stream connector may be queried using the MCI_CONNECTION message, which returns a device context connection. If the string interface is being used, an alias can be established for the connected device. Ampmix commands may then be sent directly to the ampmix device.
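For the most common cases, this transparent routing means that a simple application never needs to address the ampmix device at all. The following minimal sketch, in which the file name and alias are purely illustrative and error handling is abbreviated, opens a waveaudio device through the string interface and issues a transport command to it; the waveaudio device opens and drives its connected ampmix device on the application's behalf.

  #define INCL_MCIOS2                 /* MCI string and command interfaces */
  #include <os2.h>
  #include <os2me.h>

  VOID PlayOneWaveFile( VOID )
  {
      CHAR  szReturn[128];            /* Buffer for returned information   */
      ULONG ulRC;

      /* Open the waveaudio device shareable and load an (illustrative) file. */
      /* The waveaudio device transparently opens its connected ampmix.       */
      ulRC = mciSendString( "open c:\\mysounds\\train.wav type waveaudio alias wave shareable wait",
                            (PSZ) szReturn, (USHORT) sizeof( szReturn ), (HWND) 0, 0 );
      if ( ulRC != 0 )
          return;

      /* Transport commands (play, seek, stop) go to the attached device.     */
      mciSendString( "play wave wait", (PSZ) szReturn, (USHORT) sizeof( szReturn ), (HWND) 0, 0 );

      /* Closing the waveaudio instance also releases its ampmix instance.    */
      mciSendString( "close wave wait", (PSZ) szReturn, (USHORT) sizeof( szReturn ), (HWND) 0, 0 );
  }

Applications that do need direct access to ampmix features obtain the device context connection with the MCI_CONNECTION message as described above; the exact parameter structure is documented in the OS/2 Multimedia Programming Reference.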
Some devices also provide a connector service, which further alleviates the need to talk directly to the ampmix device for frequently requested functions. An example of this is the waveaudio device, which attempts to process requests for speakers and several other connector types. If the service is available from the associated ampmix device, the request is routed to it; otherwise, the function fails. The connectors and connector services provided by each OS/2 multimedia logical device are discussed in the section for that device.

═══ 6.2. Sharing the Amplifier-Mixer Device ═══

Because many components of OS/2 multimedia utilize the amplifier-mixer device, it is typically opened shareable so that several devices can use the ampmix device simultaneously, or serially, in an application-window-focus driven sharing scheme. The Media Device Manager (MDM) is responsible for allocating the resources of the underlying hardware correctly and informs an application with the MM_MCIPASSDEVICE message whenever use of the ampmix device is gained or lost.

When several media devices in an application use the ampmix device, the amplifier-mixer can become a source of contention, depending on the capabilities of the underlying audio adapter. For example, the IBM M-Audio adapter supports the simultaneous playback of two mono 22 kHz PCM waveforms. However, if a third waveform is started, one of the previous two waveforms must be suspended. The application that opened the suspended waveform audio (waveaudio) device receives an MM_MCIPASSDEVICE message with an event of MCI_LOSING_USE. Following completion of the third waveform, the second waveform is automatically restored and can then play to completion.

See Device Sharing By Applications for information on device sharing. The OS/2 multimedia system manages all device sharing and informs the application when the device is temporarily unavailable.

═══ 6.3. Audio Shaping Features ═══

The OS/2 ampmix device provides the following controls over audio signals. Other companies may develop ampmix devices for use in OS/2 multimedia that provide additional capabilities. Ampmix features are accessed using the MCI_SET command. Support of these features can vary by manufacturer. If a feature is not supported, MCIERR_UNSUPPORTED_FLAG will be returned.

volume    Sets volume as a percentage of the maximum achievable effect. The volume of the left and right channels for stereo signals may be controlled independently. An over parameter may also be specified to cause the volume to fade in or fade out over a specified period of time.

treble    Sets the treble as a percentage of the maximum achievable effect.

bass      Sets the bass as a percentage of the maximum achievable effect.

balance   Sets the final output balance. Zero is full left balance, 100 is full right balance.

pitch     Sets the pitch as a percentage of the maximum achievable effect.

gain      Sets the gain as a percentage of the maximum achievable effect.

monitor   Controls whether or not the signal from an input device is heard when it is being routed to another device for recording.

The values for all of these functions may be retrieved using the MCI_STATUS message.

═══ 6.4. Master Volume and the Ampmix Device ═══

The maximum volume level of all logical devices in the system is controlled by the Volume Control application supplied with OS/2 multimedia. The Volume Control application sets a scale by which all subsequent volume commands to the ampmix device are biased. For instance, if the volume control sets the master volume at 50%, then all volume levels are cut in half.
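The biasing is a simple proportional scaling applied by the system; applications do not perform it themselves. The following fragment is purely illustrative of the arithmetic described above and is not part of the media control interface.

  #include <os2.h>   /* ULONG */

  /* Illustrative only: OS/2 multimedia applies this scaling internally   */
  /* when a volume command reaches the ampmix device.                     */
  ULONG EffectiveVolume( ULONG ulRequestedPercent, ULONG ulMasterPercent )
  {
      /* With the master volume at 50, a requested volume of 80           */
      /* yields an effective level of 40.                                 */
      return ( ulRequestedPercent * ulMasterPercent ) / 100;
  }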
Some devices may only support two levels of volume (on/off). These devices are off when the master volume is set to zero, and on at any other value.

Note: While the MCI_MASTERAUDIO message can be sent by any application, only the master volume application or a replacement should utilize this message to set the master volume. Master volume should only be controlled at the discretion of an end user, as implemented in the OS/2 multimedia Master Volume application. A parameter of the MCI_MASTERAUDIO message allows an application or a media driver to query the master volume level.

═══ 6.5. Amplifier-Mixer Command Messages ═══

┌────────────────────┬─────────────────────────────────────────────┐
│Message             │Description                                  │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_CONNECTOR       │Enables, disables, or queries the status of a│
│                    │connector on a device.                       │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_CLOSE           │Closes the amp mixer instance.               │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_GETDEVCAPS      │Gets device capabilities.                    │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_INFO            │Gets device information.                     │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_OPEN            │Opens an instance of the amp mixer.          │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_SET             │Sets the following parameters, using the     │
│                    │MCI_AMP_SET_PARMS structure:                 │
│                    │-Channel number (right, left, or all         │
│                    │channels)                                    │
│                    │-Volume, treble, bass, balance, pitch, or    │
│                    │gain as a percentage of the maximum          │
│                    │achievable effect.                           │
│                    │-Delay time for vectored changes in          │
│                    │milliseconds.                                │
│                    │-Monitor control.                            │
├────────────────────┼─────────────────────────────────────────────┤
│MCI_STATUS          │Gets device status.                          │
└────────────────────┴─────────────────────────────────────────────┘

═══ 6.6. Ampmix Connectors ═══

The number and type of connectors vary by manufacturer. To determine which connectors are supported, an application may issue the MCI_CONNECTORINFO message. The following connectors are typically supported by ampmix devices:

o amp stream
o speakers
o line out
o headphones
o line in
o microphone

═══ 6.6.1. M-Audio Adapter Specifics ═══

1. The speakers (1) connector is the external speakers jack on the back of the card. The speakers (2) connector is really the internal PC speaker.

2. The Line OUT and speakers (1) connectors can be enabled or disabled by the ampmix device, although the adapter is incapable of actually switching the output. The ampmix device does, however, report the connector as enabled or disabled.

3. The speakers (2) connector can be enabled or disabled, resulting in the PC internal speaker being turned on or off.

4. The microphone and line IN connectors are mutually exclusive. Enabling one connector automatically disables the other. Disabling both connectors automatically enables the microphone.

5. The amp stream connector represents the transfer of digital audio information to and from the M-Audio card. This connector is always enabled.

6. The M-Audio adapter does not support independent control of volume for the left and right channels of a stereo signal. Any device connected to an M-Audio amplifier-mixer device returns MCIERR_UNSUPPORTED_FLAG if an attempt is made to independently control the volume of the left and right channels with the MCI_SET command.
═══ 7. Waveform Audio Device ═══

The OS/2 waveform audio (waveaudio) device allows an application to play or record digital audio using files or application memory buffers. While audio refers to the sound waves (changes in air pressure) that have a perceived effect on the human ear, waveform refers to a digital representation of the original audio sound wave.

Using one technique called pulse code modulation (PCM), discrete samples of the sound wave are encoded by an audio adapter at precise intervals. The numerical value of a sample increases when the sound wave's force (loudness) increases, and the samples vary more rapidly as the frequency of the sound wave increases. The number of samples per second taken of the original sound wave, as well as the precision (or resolution) of each sample, dictates the quality of the sound reproduction. Typical sampling rates include 44 kHz, 22 kHz, and 11 kHz, where kHz is an abbreviation for kilohertz, or thousands of cycles per second. The sampling precision is usually measured in bits; 8 or 16 bits per sample are representative of most audio adapters. Mono or stereo refers to the number of channels transferring digital audio. Mono represents one channel and stereo represents two channels.

Generally, the higher the sampling rate and resolution, the higher the perceived quality; however, this comes at the expense of potentially enormous data rates and file sizes. For example, audio quality equivalent to that produced by a CD audio device requires a sampling rate of 44.1 kHz and 16-bit resolution for each of the channels in a stereo recording. This information alone results in a data rate of approximately 172 kilobytes per second! Luckily, many applications of digital audio are adequately supported with sampling rates and resolutions as low as 22 kHz and 8 bits respectively. The exact choice of parameters will vary, depending on the requirements of the application.

═══ 7.1. The Wave Stream Connector ═══

The wave stream connector represents the flow of digital information between the waveaudio device and its associated amplifier-mixer (ampmix) device. During playback, the waveaudio device sends digitized sounds from either application memory or files to the ampmix device for subsequent conversion into audio that can be heard through conventional speakers or headphones. When recording, the waveaudio device receives waveforms from the ampmix device and stores the digital information in a file or in application memory.

Control of the characteristics of the waveform information is provided by the waveaudio device. The quality of the waveform can be controlled by setting the format, sampling rate, bits per sample, and the number of channels. As an additional service, the waveaudio device will also allow the volume to be controlled. This service is actually provided by the ampmix device in a way that is transparent to the calling application. If other advanced audio shaping features are required, the application can retrieve the device ID of the ampmix device using the MCI_CONNECTION message. Once the device ID has been obtained, the application can send commands directly to the ampmix device. Examples include set commands to manipulate treble, bass, and balance.
═══ 7.1.1. Waveaudio Device Features ═══

o Multiple time formats
o Waveform characteristics
  - Data Format
  - File format (RIFF WAVE, AVC, or others if an MMIO procedure is supplied)
  - Sampling rate
  - Bits per sample (resolution)
  - Number of channels
o Playback and record sources
  - File system
  - Application memory
o Audio shaping
  - Volume control
  - Other features that might be available through the associated ampmix device
o Cue point and position advise notification

═══ 7.2. Waveform Data Formats ═══

There are several formats used for storing waveform data within a computer system. OS/2 multimedia recognizes several resolutions of the Pulse Code Modulation (PCM) format, because it is supported by most audio adapters. OS/2 multimedia also recognizes ADPCM formats. Refer to the Appendix of the OS/2 Multimedia Programming Reference for descriptions of these formats.

Pulse Code Modulation (PCM) refers to the variation of a digital signal to represent audio amplitude. This method of assigning binary values to amplitude levels supports the conversion of analog signals to digital signals by adapters such as the M-Audio Capture and Playback Adapter. Adaptive Differential Pulse Code Modulation (ADPCM) is a technique for compressing waveform samples. ADPCM can reduce the amount of data storage required by a factor of 16 to 1, but some price is paid in fidelity for the higher compression rates.

═══ 7.2.1. M-Audio Adapter Specifics ═══

The following tables list the valid MCI_WAVE_SET items for the operation modes currently supported by the M-Audio Capture and Playback Adapter.

Note: Numbers in the tables indicate the number of channels supported in each mode: mono (1), stereo (2), or N/A (not available).

┌───────────────┬────────────┬────────────┬────────────┬────────────┐
│Data Size      │8000 Hz     │11025 Hz    │22050 Hz    │44100 Hz    │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│8-bit          │1, 2        │1, 2        │1, 2        │1, 2        │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│16-bit         │1, 2        │1, 2        │1, 2        │1, 2        │
└───────────────┴────────────┴────────────┴────────────┴────────────┘

┌───────────────┬────────────┬────────────┬────────────┬────────────┐
│Data Size      │8000 Hz     │11025 Hz    │22050 Hz    │44100 Hz    │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│8-bit          │N/A         │N/A         │N/A         │N/A         │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│16-bit         │N/A         │1           │1, 2        │1           │
└───────────────┴────────────┴────────────┴────────────┴────────────┘

┌───────────────┬────────────┬────────────┬────────────┬────────────┐
│Data Size      │8000 Hz     │11025 Hz    │22050 Hz    │44100 Hz    │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│8-bit          │1,2         │1,2         │1,2         │1,2         │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│16-bit         │N/A         │N/A         │N/A         │N/A         │
└───────────────┴────────────┴────────────┴────────────┴────────────┘

┌───────────────┬────────────┬────────────┬────────────┬────────────┐
│Data Size      │8000 Hz     │11025 Hz    │22050 Hz    │44100 Hz    │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│8-bit          │1,2         │1,2         │1,2         │1,2         │
├───────────────┼────────────┼────────────┼────────────┼────────────┤
│16-bit         │N/A         │N/A         │N/A         │N/A         │
└───────────────┴────────────┴────────────┴────────────┴────────────┘

═══ 7.3. Audio Device Capabilities ═══

If MCI_GETDEVCAPS_EXTENDED is specified in conjunction with MCI_GETDEVCAPS_ITEM, the MCI_GETDEVCAPS_WAVE_FORMAT value can be placed in the ulItem field for the waveaudio device as an extended request.
The MCI_GETDEVCAPS_WAVE_FORMAT value allows an application to query whether the device supports a specific waveaudio format. The application must fill in the ulBitsPerSample, ulFormatTag, ulSamplesPerSec, ulChannels, and ulFormatMode fields in the MCI_WAVE_GETDEVCAPS_PARMS data structure. The driver returns MCI_TRUE if the format is supported, or returns a specific error describing why the command failed if the format is not supported.

The following code fragment shows a portion of the Audio Recorder Sample program provided in the Toolkit (\TOOLKIT\SAMPLES\MM\RECORDER). This program uses the MCI_GETDEVCAPS message to determine the capabilities of the currently selected waveaudio device.

ULONG                      ulRC;          /* return code from api                */
MCI_WAVE_GETDEVCAPS_PARMS  mciAudioCaps;  /* MCI_WAVE_GETDEVCAPS_PARMS structure */

memset( &mciAudioCaps, 0, sizeof(MCI_WAVE_GETDEVCAPS_PARMS) );

/* Test to see if the device can play 11 kHz, 8-bit, mono files. */
mciAudioCaps.ulBitsPerSample = 8;
mciAudioCaps.ulFormatTag     = DATATYPE_WAVEFORM;
mciAudioCaps.ulSamplesPerSec = 11025;
mciAudioCaps.ulChannels      = 1;
mciAudioCaps.ulFormatMode    = MCI_PLAY;
mciAudioCaps.ulItem          = MCI_GETDEVCAPS_WAVE_FORMAT;

ulRC = mciSendCommand( mciOpenParms.usDeviceID,          /* Device ID */
                       MCI_GETDEVCAPS,
                       MCI_WAIT | MCI_GETDEVCAPS_EXTENDED | MCI_GETDEVCAPS_ITEM,
                       (PVOID) &mciAudioCaps,
                       0 );
.
.
.
/* Test to see if the device can record 11 kHz, 16-bit, mono files. */
mciAudioCaps.ulBitsPerSample = 16;
mciAudioCaps.ulFormatTag     = DATATYPE_WAVEFORM;
mciAudioCaps.ulSamplesPerSec = 11025;
mciAudioCaps.ulChannels      = 1;
mciAudioCaps.ulFormatMode    = MCI_RECORD;
mciAudioCaps.ulItem          = MCI_GETDEVCAPS_WAVE_FORMAT;

ulRC = mciSendCommand( mciOpenParms.usDeviceID,          /* Device ID */
                       MCI_GETDEVCAPS,
                       MCI_WAIT | MCI_GETDEVCAPS_EXTENDED | MCI_GETDEVCAPS_ITEM,
                       (PVOID) &mciAudioCaps,
                       0 );

═══ 7.4. Using the Waveform Audio Device ═══

Because the waveaudio device is a compound device, it requires a device element. The device element is typically a file that contains a sampled waveform for playback. The waveaudio device can be opened with or without a device element. A device element can subsequently be specified using the load command.

═══ 7.4.1. Opening the Waveform Audio Device ═══

The following string commands open the default waveaudio device and load a file into it.

  open waveaudio alias wave shareable
  load wave c:\mysounds\train.wav

OS/2 multimedia allows you to specify the device to be used for a particular file based on the file's extension or its extended attributes (EAs). Using .TYPE EAs is the preferred method, because they remain with the files even when the files are renamed. Both file extensions and extended attributes can be associated with a device using the Multimedia Setup application. For instance, assuming files with an extension of .WAV have been associated with the waveaudio device, the following command will result in a file being loaded into the waveaudio device:

  open c:\mysounds\monkey.wav alias monkey shareable

Finally, both the device element and the device type can be specified:

  open c:\mysounds\paperjam.wav type waveaudio alias wave shareable

═══ 7.4.2. Recording a Waveform File ═══

One of the typical uses of the waveform audio device is to digitize an input signal or sound into discrete samples for storage in a file. An example of this would be recording an electronic audio mail message to actually tell someone about an idea, as opposed to typing a memo on the same subject.
An electronic audio mail application would be completely shielded from the complexity of digitizing a signal and would only need to specify a file, while providing the user with a simple control panel to allow the message to be recorded. The user might press a stop button on the control panel when finished describing the idea. The application could then issue a stop command to the waveaudio device to discontinue the recording.

  open myidea.wav waveaudio alias wave wait
  record wave notify
     .
     .
     .
        ** recording the idea into myidea.wav **
     .
     .
     .
  stop wave wait

Like many text editors, the waveform audio media driver will not actually modify the original file until it receives a command to save the changes. Any temporary files created during the record operation will be located in the directory specified by the MSV_WORKPATH multimedia system variable. The path can be specified on the system page of the Multimedia Setup application. The use of temporary files is completely transparent to the application. The file can be saved using the original file name, or a new file name can be specified. If a save command is not issued before closing the waveform audio device, all changes will be discarded.

  save wave wait
  close wave wait

It is possible to open or load the waveaudio device specifying a special readonly option. In this mode, the waveaudio device prevents any modification to the file from either the save or record commands. In certain circumstances, the driver might be able to optimize performance by utilizing the information that the file will not be modified. The option will also allow multiple applications to share the same file for playback purposes and will prevent inadvertent modification of the file.

  open bigwave.wav type waveaudio alias wave readonly shareable

═══ 7.4.3. Creating New Files ═══

The waveaudio device will create a new file on either the MCI_OPEN or MCI_LOAD commands if a file element is indicated (MCI_OPEN_ELEMENT_ID) and the specified file name does not exist. If no file name is indicated, the waveaudio driver will create an unnamed temporary file. If an unnamed temporary file is created, it can later be named by issuing the MCI_SAVE command, which must include the permanent name of the new file.

To support file creation from the string interface, a special file name called new is reserved for system use. This file name should be used in place of the usual application supplied file name. As in the command message interface, the save command must be issued to give the file a permanent name.

  open new type waveaudio alias wave wait
  record wave notify
     .
     .
     .
        ** recording **
     .
     .
     .
  stop wave wait
  save wave myspeech.wav wait

When a file is initially created, default settings will be assigned by the media driver and might depend on the capabilities of the audio adapter. The IBM waveform audio driver will use PCM, 22 kHz, 16 bits per sample, and mono as the default for 16-bit adapters. If the adapter does not support 16-bit PCM, then the resolution (bits per sample) will be downgraded to 8 bits. The following table lists audio adapters supported by OS/2 multimedia. The default settings are those initially assigned by the media driver to a new file when that particular audio adapter is being used.
┌─────────────────────────┬──────────┬──────────┬──────────┬──────────┐
│Audio Adapter            │Format    │Sampling  │Bits per  │Channels  │
│                         │          │Rate      │Sample    │          │
├─────────────────────────┼──────────┼──────────┼──────────┼──────────┤
│IBM M-Audio              │PCM       │22 kHz    │16        │1         │
├─────────────────────────┼──────────┼──────────┼──────────┼──────────┤
│Sound Blaster**          │PCM       │22 kHz    │8         │1         │
├─────────────────────────┼──────────┼──────────┼──────────┼──────────┤
│Sound Blaster Pro        │PCM       │22 kHz    │8         │2         │
├─────────────────────────┼──────────┼──────────┼──────────┼──────────┤
│Sound Blaster 16         │PCM       │22 kHz    │16        │1         │
├─────────────────────────┼──────────┼──────────┼──────────┼──────────┤
│Pro AudioSpectrum 16**   │PCM       │22 kHz    │16        │1         │
└─────────────────────────┴──────────┴──────────┴──────────┴──────────┘

OS/2 multimedia enables recording of digital audio information in the format that fits specific needs, such as space or quality. For example, assume that a new waveaudio file is created with the following command:

  open new type waveaudio alias a wait

When the file is created, you might want it to be compatible with mu-law (the compression scheme used by the telephone system). To change the compression scheme, the format tag must be set for the file. The following string commands prepare the file for mu-law recording by setting the format tag and related characteristics:

  set a format tag mulaw wait
  set a bitspersample 8 wait
  set a channels 1 wait
  set a samplespersec 11025 wait

If you wanted to record with a compression scheme commonly used in Europe (a-law), the following command could be issued instead:

  set a format tag alaw wait

An application should always set the waveform format, sampling rate, resolution, and number of channels to ensure that the waveform is created with the desired parameters, as shown in the following string interface example.

  set wave format tag PCM wait
  set wave samplespersec 22050 wait
  set wave bitspersample 8 wait
  set wave channels 1 wait

Note: When modifying the settings on a waveaudio device, the format tag should be changed first, because it might force the automatic modification of other settings to make them compatible with the new format. For instance, a waveaudio device that supports 16-bit PCM might only support 8-bit ADPCM. Changing the format from PCM to ADPCM will automatically change the bits per sample setting.

═══ 7.5. Playing and Recording non-RIFF Waveforms ═══

The waveform audio device will create new waveforms according to the RIFF WAVE data standard. It is possible, however, to play other data formats using OS/2 multimedia if the appropriate MMIO procedure has been supplied. The selection of the appropriate I/O procedure (IOProc) is transparent to the application if the IOProc has been installed. One example of this feature is OS/2 multimedia's ability to play waveform audio files that were created using IBM's AVC application and the M-Audio card. Note that the AVC support provides playback capabilities only.

The waveform audio device will temporarily report FALSE for the save and record capabilities of the device capabilities (MCI_GETDEVCAPS) function when the underlying I/O procedure does not support the creation of files. Applications should check the device capabilities to appropriately display a user interface that reflects the true capabilities of the waveaudio driver and its associated I/O procedure. For example, a waveform editor application should grey out its record button when an AVC file is loaded, as only playback operations are supported.
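A minimal sketch of such a check through the string interface follows. The alias is illustrative, error handling is abbreviated, and the exact form of the capability query and of the returned string should be verified against the waveaudio device documentation in the OS/2 Multimedia Programming Reference; the intent is only to show the shape of the call.

  #define INCL_MCIOS2
  #include <os2.h>
  #include <os2me.h>
  #include <string.h>

  /* Returns TRUE if the currently loaded element can be recorded into.        */
  /* Assumes a waveaudio device has already been opened with the alias "wave". */
  BOOL CanRecordCurrentElement( VOID )
  {
      CHAR  szReturn[64];
      ULONG ulRC;

      /* Mirrors the MCI_GETDEVCAPS query described above; the answer is       */
      /* assumed to come back as the text "TRUE" or "FALSE".                   */
      ulRC = mciSendString( "capability wave can record wait",
                            (PSZ) szReturn, (USHORT) sizeof( szReturn ), (HWND) 0, 0 );

      return ( ulRC == 0 && strcmp( szReturn, "TRUE" ) == 0 );
  }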
Querying the device capabilities would return FALSE for can record. If a waveform file in the RIFF WAVE format is subsequently loaded, the record button should be enabled, because the same can record query will now return TRUE. In all instances, by using the high-level OS/2 multimedia mciSendString or mciSendCommand interface to reference device capabilities, the application is shielded from the underlying implementation.

═══ 7.6. Creating a Waveform Playlist ═══

Specialized applications such as a waveform editor might require the capability of playing and recording using application memory buffers instead of files. The memory playlist feature of OS/2 multimedia provides the construct for supplying memory buffers to the waveaudio device. Besides implementing simple circular buffering schemes, memory playlists can be used to synthesize complex and unique waveform sounds. By following each DATA statement with a MESSAGE statement, an application can be informed when the buffer can be reused.

Playlist Structure

Depending on the complexity of the application, memory playlists can be used to provide a single large memory buffer, or multiple buffers in a circular buffering scheme. The following is an example of how a memory playlist might be constructed to implement a simple circular buffering scheme.

  0:  NOP
  1:  DATA...
  2:  MESSAGE...
  3:  DATA...
  4:  MESSAGE...
  5:  DATA...
  6:  MESSAGE...
  7:  BRANCH 0

Note that regardless of whether the playlist is being used for play or record operations, the MESSAGE instruction will notify the application when the playlist processor has consumed or filled the preceding DATA buffer. An MM_MCIPLAYLISTMESSAGE message will be sent to the window procedure specified when the waveaudio device was originally opened.

The following code fragment shows the SetupPlayList procedure that is performed once during initialization of the Clock Sample program. It calls the procedure CopyWaveformIntoMemory to copy the waveform files into memory buffers. It also initializes the playlist data structure by supplying the address and size of the memory buffers holding the data in the appropriate data structure fields.

VOID SetupPlayList( VOID )
{
   /*
    * This array keeps the address of each audio chime file.
    */
   static LONG *pulBaseAddress[ NUMBER_OF_CHIME_FILES ];

   USHORT usChimeFileId;                /* Chime audio file ID. */
   ULONG  ulSizeOfFile,                 /* Size of audio file.  */
          ulMemoryAllocationFlags = PAG_COMMIT | PAG_READ | PAG_WRITE;

   for(usChimeFileId=0; usChimeFileId