Multimedia technologies in iOS let you access the sophisticated audio and video capabilities of iOS devices. Use Core Audio to generate, record, mix, process, and play audio in your application. The open-source OpenAL interface supports high-performance positional audio playback, ideal for games. The Media Player framework supports full-screen playback of video files and provides playback of the audio items in a user’s iPod library. Starting in iOS 3.0, you can add video recording capability to your application.
Before you begin writing audio code:
Read What Is Core Audio? in Core Audio Overview to become familiar with the primary interface to audio services in iOS.
Read “Using Audio” in Multimedia Programming Guide to learn about audio development for iOS devices.
Continue with Core Audio Essentials in Core Audio Overview to learn about the architecture, programming conventions, and use of Core Audio.
See the avTouch sample, which shows how to play sounds; the SpeakHere sample, which demonstrates basic recording and playback; and the Audio UI Sounds (SysSound) sample, which demonstrates how to invoke vibration and play alerts and user-interface sound effects.
Before you begin writing video code:
Read “Using Video” in Multimedia Programming Guide for an overview of video recording and playback on iOS devices.
You select among the various APIs for audio and video based on the needs of your application. All iOS developers need to learn about audio sessions.
Audio Session Services in iOS lets you manage your application's audio behavior in the context of interruptions, such as incoming phone calls, and handle audio routing changes, such as when a user unplugs a headset. This technology also lets you specify your audio intentions, for example, whether you want your audio to continue playing when the user moves the Ring/Silent switch to silent.
Every iOS application that uses audio should employ Audio Session Services. For more information, read Audio Session Programming Guide. For an example of how to use this technology in a recording and playback application, see the SpeakHere sample.
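A minimal sketch of declaring audio intentions with the Objective-C AVAudioSession API (error handling is abbreviated; the playback category shown keeps audio going when the Ring/Silent switch is set to silent):

```objc
#import <AVFoundation/AVFoundation.h>

// Declare the application's audio intent before playing or recording.
- (void)configureAudioSession {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"Could not set audio session category: %@", error);
        return;
    }
    [session setActive:YES error:&error];  // claim the audio hardware
}
```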
Depending on your needs, you play audio in iOS using the AVAudioPlayer class, Audio Queue Services, OpenAL, the I/O audio unit, or System Sound Services. You can also play audio items from the user's iPod library using the iPod Library Access API.
To play sounds of any duration, to play multiple sounds simultaneously, or to play sounds with level control, use the AVAudioPlayer class. This technology can play any audio format supported in iOS, including MP3, AAC, ALAC (Apple Lossless), IMA4, linear PCM, and others. See iOS Application Programming Guide and AVAudioPlayer Class Reference. Also see the avTouch sample.
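A sketch of AVAudioPlayer playback (the file name is a placeholder; in pre-ARC code, keep a strong reference to the player, for example in an instance variable, or playback stops when the object is released):

```objc
#import <AVFoundation/AVFoundation.h>

// Play a bundled sound file with level control.
NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"m4a"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc]
    initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
player.volume = 0.8;       // level control
player.numberOfLoops = 0;  // play once
[player prepareToPlay];    // prime buffers to minimize startup latency
[player play];
```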
To play sounds with precise control—such as for synchronization—or to play audio captured from an Internet stream, use Audio Queue Services. See Audio Queue Services Programming Guide. For sample code, see SpeakHere.
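A skeleton of Audio Queue playback, assuming 16-bit stereo linear PCM; the callback body, which refills each buffer from your audio source, is left as a stub:

```objc
#include <AudioToolbox/AudioToolbox.h>

// Called by the queue whenever a buffer has finished playing
// and needs to be refilled with the next audio data.
static void HandleOutputBuffer(void *userData,
                               AudioQueueRef queue,
                               AudioQueueBufferRef buffer) {
    // Fill buffer->mAudioData, set buffer->mAudioDataByteSize,
    // then hand the buffer back to the queue:
    AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

void StartQueuePlayback(void) {
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                               kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 2;
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 4;   // 2 channels * 2 bytes each
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 4;

    AudioQueueRef queue;
    AudioQueueNewOutput(&format, HandleOutputBuffer, NULL,
                        NULL, NULL, 0, &queue);
    // Allocate and prime a few buffers here, then:
    AudioQueueStart(queue, NULL);
}
```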
To play positional audio, especially if your application is a game, use OpenAL, documented at http://openal.org. See OpenAL FAQ for iPhone OS for important information on using this technology. For sample code, see oalTouch.
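A minimal sketch of positional playback with OpenAL, assuming a source buffer already filled with PCM data:

```objc
#include <OpenAL/al.h>
#include <OpenAL/alc.h>

// Position one sound source relative to the listener.
void PlayPositional(ALuint buffer) {  // buffer: previously filled PCM data
    ALCdevice  *device  = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f);  // right of, and ahead of, the listener
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);
    alSourcePlay(source);
}
```

Moving the source or listener position per frame is what produces the spatial effect in a game loop.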
To play sounds with lowest I/O latency, or to provide simultaneous audio input and output, use the I/O audio unit. For voice chat applications, use the Voice Processing I/O audio unit. See System Audio Unit Access Guide and the aurioTouch sample.
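A sketch of obtaining the Remote I/O audio unit and enabling its input bus for simultaneous input and output (render callbacks and error checking are omitted):

```objc
#include <AudioUnit/AudioUnit.h>

void StartRemoteIO(void) {
    AudioComponentDescription desc = {
        kAudioUnitType_Output,
        kAudioUnitSubType_RemoteIO,   // use kAudioUnitSubType_VoiceProcessingIO for voice chat
        kAudioUnitManufacturer_Apple, 0, 0
    };
    AudioComponent component = AudioComponentFindNext(NULL, &desc);

    AudioUnit ioUnit;
    AudioComponentInstanceNew(component, &ioUnit);

    // Output (bus 0) is enabled by default; input (bus 1) is not.
    UInt32 enable = 1;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1,   // bus 1 = microphone
                         &enable, sizeof(enable));

    // Attach render callbacks here, then:
    AudioUnitInitialize(ioUnit);
    AudioOutputUnitStart(ioUnit);
}
```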
To play alerts and user-interface sound effects, or to invoke vibration, use System Sound Services. This technology can play .caf, .wav, and .aif files containing sounds with durations of 30 seconds or less. Refer to “Using Sound in iPhone OS” in iOS Application Programming Guide and to System Sound Services Reference.
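System Sound Services is a small C API; a sketch of playing a short sound and invoking vibration:

```objc
#include <AudioToolbox/AudioToolbox.h>

// Play a short UI sound (.caf, .wav, or .aif; 30 seconds or less).
void PlayTapSound(CFURLRef soundFileURL) {
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID(soundFileURL, &soundID);
    AudioServicesPlaySystemSound(soundID);
}

// Invoke vibration on devices that support it; this is a
// silent no-op on devices without a vibration motor.
void Vibrate(void) {
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}
```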
To play audio items from a user’s iPod library, use the iPod Library Access API from the Media Player framework. Read iPod Library Access Programming Guide. See the AddMusic sample.
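A minimal sketch of iPod library playback with the Media Player framework, queueing every song in the user's library:

```objc
#import <MediaPlayer/MediaPlayer.h>

// Queue all songs in the user's iPod library and start playback.
MPMusicPlayerController *musicPlayer =
    [MPMusicPlayerController applicationMusicPlayer];
[musicPlayer setQueueWithQuery:[MPMediaQuery songsQuery]];
[musicPlayer play];
```

In practice you would usually let the user choose items with an MPMediaPickerController rather than queueing the whole library.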
You can capture and play streamed audio.
To parse streamed audio content, such as from a network connection, use Audio File Stream Services, documented in Audio File Stream Services Reference.
To play a captured stream, use Audio Queue Services. See Audio Queue Services Programming Guide and Audio Queue Services Reference.
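A skeleton of Audio File Stream Services, assuming an MP3 network stream; the callbacks, which receive discovered properties and parsed packets, are left as stubs:

```objc
#include <AudioToolbox/AudioToolbox.h>

// Invoked as properties (for example, the data format) are discovered.
static void PropertyListener(void *clientData, AudioFileStreamID stream,
                             AudioFileStreamPropertyID propertyID,
                             UInt32 *flags) { /* read properties here */ }

// Invoked as complete audio packets are parsed; hand these
// packets to an audio queue for playback.
static void PacketsProc(void *clientData, UInt32 numBytes,
                        UInt32 numPackets, const void *inputData,
                        AudioStreamPacketDescription *packetDescriptions) { }

// Feed each chunk of bytes from the network connection to the parser.
void ParseNetworkBytes(const void *bytes, UInt32 length) {
    static AudioFileStreamID stream = NULL;
    if (stream == NULL)
        AudioFileStreamOpen(NULL, PropertyListener, PacketsProc,
                            kAudioFileMP3Type, &stream);
    AudioFileStreamParseBytes(stream, length, bytes, 0);
}
```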
You can also play Internet audio files in the AAC-LC format using the MPMoviePlayerController class. For sample code that shows how, see the MoviePlayer sample.
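A sketch of streaming an AAC-LC file with MPMoviePlayerController (the URL is a placeholder):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Stream an AAC-LC audio file from the network.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/audio.m4a"];
MPMoviePlayerController *streamPlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
[streamPlayer play];
```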
To record sounds on iOS devices, use the AVAudioRecorder class as described in AVAudioRecorder Class Reference.
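A sketch of AVAudioRecorder recording Apple Lossless audio; the file name and format settings are illustrative choices:

```objc
#import <AVFoundation/AVFoundation.h>

// Record to a file in the application's Documents directory.
NSString *dir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                     NSUserDomainMask, YES) objectAtIndex:0];
NSURL *fileURL = [NSURL fileURLWithPath:
                     [dir stringByAppendingPathComponent:@"memo.caf"]];

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0],                 AVSampleRateKey,
    [NSNumber numberWithInt:1],                         AVNumberOfChannelsKey,
    nil];

NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc]
    initWithURL:fileURL settings:settings error:&error];
[recorder prepareToRecord];
[recorder record];
// ... later: [recorder stop];
```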
You can also record sounds using Audio Queue Services, documented in Audio Queue Services Programming Guide and Audio Queue Services Reference. See the SpeakHere sample, which demonstrates how to record sound in all audio formats supported in iOS.
The MPMoviePlayerController class supports video playback in either H.264 (Baseline Profile Level 3.0) format or MPEG-4 Part 2 video (Simple Profile) format. Playback is full-screen and is primarily intended for game developers who want to play animations.
You can also use the MPMoviePlayerController class to play videos streamed from the Internet, as demonstrated in the MoviePlayer sample.
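A sketch of full-screen playback of a bundled H.264 movie ("intro.m4v" is a placeholder):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Play a bundled H.264 movie full screen.
NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"intro"
                                                      ofType:@"m4v"];
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL fileURLWithPath:moviePath]];
[moviePlayer play];
```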
Read “Playing Video Files” in iOS Application Programming Guide for more on adding video playback to your application.
Starting in iOS 3.0, you can record video, with included audio, on supported devices. Employ the UIImagePickerController class, just as for capturing still images. Read “Recording Video” in iOS Application Programming Guide and see UIImagePickerController Class Reference.
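A sketch of presenting the camera picker configured for movie capture rather than stills; the presenting controller is assumed to adopt the picker delegate protocols:

```objc
#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Present the camera configured to capture movies rather than stills.
- (void)startVideoRecording {
    if (![UIImagePickerController isSourceTypeAvailable:
              UIImagePickerControllerSourceTypeCamera])
        return;  // no camera on this device

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
}
```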
The iPhone Dev Center provides guides, reference documentation, technical notes, and sample code for adding multimedia support to your application.
The Core Audio mailing list (coreaudio-api@lists.apple.com) is an excellent place to discuss Core Audio and OpenAL issues with fellow developers.
The iPhone Developer Forums let you ask questions and read answers about any and all iOS development topics.
Last updated: 2010-07-09