
About Configuring Audio Behavior

iOS handles audio behavior at the application, interapplication, and device levels through the notion of an audio session. Using the audio session API, you answer questions such as whether your audio should continue when the screen locks, whether other audio on the device, such as iPod audio, should mix with yours or be silenced, and how your application should respond when its audio is interrupted.

Users may plug in or unplug headsets, phone calls may arrive, and alarms may sound. Indeed, the audio environment on an iOS device is quite complex. iOS does the heavy lifting, while you employ audio session APIs to specify configuration and to respond gracefully to system requests, using very little code.

An Audio Session Encapsulates a Set of Behaviors

An audio session is the intermediary between your application and iOS for configuring audio behavior. Upon launch, your application automatically gets a singleton audio session. The behavior you specify should meet user expectations as described in “Using Sound” in iPhone Human Interface Guidelines.
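For example, a minimal sketch of obtaining the singleton session with the AVAudioSession class and activating it might look like this; error handling is abbreviated, and a real application would typically set a category first:

    #import <AVFoundation/AVFoundation.h>

    // Obtain the application's singleton audio session and activate it.
    // A minimal sketch; a real application would also set a category
    // before activation and handle the error appropriately.
    NSError *activationError = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    BOOL success = [session setActive:YES error:&activationError];
    if (!success) {
        NSLog(@"Could not activate audio session: %@", activationError);
    }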

Relevant Chapters: “Audio Session Basics” and “Configuring the Audio Session.”

Categories Express Audio Roles

The primary mechanism for expressing audio intentions is to set the audio session category. A category is a key that identifies a set of audio behaviors. By setting the category, you indicate whether your audio should continue when the screen locks, whether you want iPod audio to continue playing along with your audio, and so on.

Six audio session categories, along with a set of override and modifier switches, let you customize audio behavior according to your application’s personality or role. Various categories support playback, recording, playback along with recording, and offline audio processing. When the system knows your app’s audio role, it affords you appropriate access to hardware resources. The system also ensures that other audio on the device behaves in a way that works for your application; for example, if you need iPod audio to be silenced, it is.
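For example, the following sketch sets the playback category using AVAudioSession; the choice of category and the abbreviated error handling are illustrative:

    #import <AVFoundation/AVFoundation.h>

    // Set the playback category, which by default silences iPod audio and
    // allows your audio to continue playing when the screen locks.
    NSError *setCategoryError = nil;
    BOOL success = [[AVAudioSession sharedInstance]
                        setCategory:AVAudioSessionCategoryPlayback
                              error:&setCategoryError];
    if (!success) {
        NSLog(@"Could not set audio session category: %@", setCategoryError);
    }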

Relevant Chapters: “Configuring the Audio Session” and “Audio Session Categories.”

Delegates Support Interruption Handling

An audio interruption is the deactivation of your application’s audio session—which immediately stops your audio. Interruptions happen when a competing audio session from a built-in application activates and that session is not categorized by the system to mix with yours. After your session goes inactive, the system sends a “you were interrupted” message which you can respond to by saving state, updating the user interface, and so on.

To handle interruptions, implement Objective-C interruption delegate methods provided by the AV Foundation framework. Write your beginInterruption and endInterruption methods to ensure the minimum possible disruption, and the most graceful possible recovery, from the perspective of the user.
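The following sketch shows one shape such delegate methods might take, using the AVAudioSessionDelegate protocol. The MyAudioController class name and its resume logic are hypothetical placeholders for your own playback code:

    #import <AVFoundation/AVFoundation.h>

    // A sketch of an interruption delegate. MyAudioController is a
    // hypothetical class that manages playback elsewhere in the application.
    @interface MyAudioController : NSObject <AVAudioSessionDelegate> {
        BOOL playbackWasInterrupted;
    }
    @end

    @implementation MyAudioController

    - (void)beginInterruption {
        // The audio session has been deactivated and playback has stopped.
        // Save state and update the user interface here.
        playbackWasInterrupted = YES;
    }

    - (void)endInterruption {
        // The interruption is over. Reactivate the session and, if
        // appropriate, resume playback.
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
        if (playbackWasInterrupted) {
            playbackWasInterrupted = NO;
            // Resume playback here.
        }
    }

    @end

    // Elsewhere, assign the controller as the audio session's delegate:
    //     [AVAudioSession sharedInstance].delegate = myAudioController;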

Relevant Chapters: “Handling Audio Interruptions.”

Callbacks Support Audio Route Change Handling

Users have particular expectations when they initiate an audio route change by docking or undocking a device, or by plugging in or unplugging a headset. “Using Sound” in iPhone Human Interface Guidelines describes these expectations and provides guidelines on how to meet them. Handle route changes by writing a C callback function and registering it with your audio session.
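The following sketch shows one way such a callback might be written and registered using Audio Session Services; the response to an unplugged headset is illustrative:

    #include <AudioToolbox/AudioToolbox.h>

    // A sketch of a route-change callback for Audio Session Services. The
    // handling shown (checking for a now-unavailable device) is illustrative.
    static void MyAudioRouteChangeListener(void                    *inClientData,
                                           AudioSessionPropertyID  inID,
                                           UInt32                  inDataSize,
                                           const void              *inData) {
        if (inID != kAudioSessionProperty_AudioRouteChange) return;

        CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inData;
        CFNumberRef reasonValue = (CFNumberRef)CFDictionaryGetValue(
            routeChangeDictionary,
            CFSTR(kAudioSession_AudioRouteChangeKey_Reason));

        SInt32 reason = 0;
        CFNumberGetValue(reasonValue, kCFNumberSInt32Type, &reason);

        if (reason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
            // A headset was unplugged or the device was undocked;
            // pause playback here.
        }
    }

    // Register the callback once, typically at launch, after initializing
    // the audio session.
    static void MyRegisterForRouteChanges(void) {
        AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                        MyAudioRouteChangeListener,
                                        NULL);
    }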

Relevant Chapters: “Handling Audio Hardware Route Changes.”

Properties Support Advanced Features

You can fine-tune an audio session category in a variety of ways by applying the override and modifier switches it supports; for example, some categories let you allow iPod audio to continue playing along with your own.

You also use properties to optimize your application for device hardware at runtime. This lets your code adapt to the characteristics of the device it’s running on, as well as to changes in hardware capabilities initiated by the user (such as by plugging in a headset) as your application runs.

To use properties, employ the C-based Audio Session Services API from the Audio Toolbox framework.
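For example, the following sketch queries the current hardware sample rate with Audio Session Services so that processing code can match it; the fallback value on error is an assumption for illustration, and the audio session is assumed to have been initialized already:

    #include <AudioToolbox/AudioToolbox.h>

    // A sketch of querying device hardware with Audio Session Services:
    // read the current hardware sample rate so that audio processing code
    // can match it. Assumes the audio session has already been initialized.
    static Float64 MyCurrentHardwareSampleRate(void) {
        Float64 hardwareSampleRate = 0;
        UInt32 propertySize = sizeof(hardwareSampleRate);

        OSStatus status = AudioSessionGetProperty(
                              kAudioSessionProperty_CurrentHardwareSampleRate,
                              &propertySize,
                              &hardwareSampleRate);

        // On error, fall back to an assumed default rate.
        return (status == kAudioSessionNoError) ? hardwareSampleRate : 44100.0;
    }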

Relevant Chapters: “Configuring the Audio Session,” “Optimizing for Device Hardware,” and “Working with Movies and iPod Music.”

How to Use This Document

All iOS developers who use the AV Foundation framework, Audio Queue Services, OpenAL, or the I/O audio unit should read this document in its entirety. Other iOS developers should read at least this and the next chapter, “Audio Session Basics.” If you are already up-to-speed on audio session concepts, go straight to the “Audio Session Cookbook” chapter.

An appendix, “Audio Session Categories,” provides details on the behavior of each iOS audio session category.

Prerequisites

Before reading this document, you should be familiar with Cocoa Touch development as introduced in iOS Application Programming Guide and with the basics of Core Audio as described in that document and in Core Audio Overview. Because audio sessions bear on practical end-user scenarios involving the various switches, buttons, and connectors on a device, you should also be familiar with using iPhone.

See Also

You may find Core Audio Overview and the related Core Audio documentation helpful as you learn about using audio sessions.

Availability

The audio session APIs are specialized for the needs of iOS devices and are available only in iOS. Basic audio session support has been available since iOS 2.0. Some of the features described in this guide require iOS 3.1 or later.




Last updated: 2010-07-09
