Media Layer

In the Media layer are the graphics, audio, and video technologies geared toward creating the best multimedia experience available on a mobile device. More importantly, these technologies were designed to make it easy for you to build applications that look and sound great. The high-level frameworks in iOS make it easy to create advanced graphics and animations quickly, and the low-level frameworks provide you with access to the tools you need to do things exactly the way you want.

Graphics Technologies

High-quality graphics are an important part of all iOS applications. The simplest (and most efficient) way to create an application is to use prerendered images together with the standard views and controls of the UIKit framework and let the system do the drawing. However, there may be situations where you need to go beyond what is offered by UIKit and provide custom behaviors. In those situations, you can use the following technologies to manage your application’s graphical content:

For the most part, applications running on devices with high-resolution screens should work with little or no modification. The coordinate values you specify during drawing or when manipulating views are all mapped to a logical coordinate system, which is decoupled from the underlying screen resolution. Any content you draw is automatically scaled as needed to support high-resolution screens. For vector-based drawing code, the system frameworks automatically use any extra pixels to improve the crispness of your content. And if you use images in your application, UIKit provides support for loading high-resolution variants of your existing images automatically. For more information about what you need to do to support high-resolution screens, see “Supporting High-Resolution Screens” in iOS Application Programming Guide.

For information about the graphics-related frameworks, see the corresponding entries in “Media Layer Frameworks.”

Audio Technologies

The audio technologies available in iOS are designed to help you provide a rich audio experience for your users. This includes the ability to play back or record high-quality audio and the ability to trigger the vibration feature on devices that support those capabilities.

The system provides several ways to play back and record audio content depending on your needs. When choosing an audio technology, remember that the higher-level frameworks simplify the work you have to do to support audio playback and are generally preferred. The frameworks in the following list are ordered from highest to lowest level, with the Media Player framework offering the highest-level interfaces you can use.

The audio technologies in iOS support the following audio formats:

For information about each of the audio frameworks, see the corresponding entry in “Media Layer Frameworks.”

Video Technologies

Whether you are playing movie files from your application bundle or streamed content from the network, iOS provides several technologies to play those movies. On devices with the appropriate video hardware, you can also use these technologies to capture video and incorporate it into your application.

The system provides several ways to play and record video content depending on your needs. When choosing a video technology, remember that the higher-level frameworks simplify the work you have to do to support the features you need and are generally preferred. The frameworks in the following list are ordered from highest to lowest level, with the Media Player framework offering the highest-level interfaces you can use.

The video technologies in iOS support the playback of movie files with the .mov, .mp4, .m4v, and .3gp filename extensions and using the following compression standards:

For information about each of the video frameworks, see the corresponding entry in “Media Layer Frameworks.”

Media Layer Frameworks

The following sections describe the frameworks of the Media layer and the services they offer.

Assets Library Framework

Introduced in iOS 4.0, the Assets Library framework (AssetsLibrary.framework) provides a query-based interface for retrieving a user’s photos and videos. Using this framework, you can access the same assets that are normally managed by the Photos application, including items in the user’s saved photos album and any photos and videos that were imported onto the device. You can also save new photos and videos back to the user’s saved photos album.
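As a sketch of the query-based interface, the following code enumerates the assets in the user’s saved photos album. (The asynchronous enumeration blocks and the logging are illustrative; a real application would keep the library object alive until enumeration completes.)

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// Sketch: enumerate the assets in the user's saved photos album.
// The enumeration blocks are invoked asynchronously, so keep the
// library object alive (for example, in an instance variable)
// until enumeration finishes.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group == nil) {
        return; // a nil group signals the end of the enumeration
    }
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
        if (asset != nil) {
            NSLog(@"Found asset of type %@",
                  [asset valueForProperty:ALAssetPropertyType]);
        }
    }];
} failureBlock:^(NSError *error) {
    // The user can deny access to the library (for example, when
    // location data is restricted), so always handle failure.
    NSLog(@"Enumeration failed: %@", error);
}];
```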

For more information about the classes and methods of this framework, see Assets Library Framework Reference.

AV Foundation Framework

Introduced in iOS 2.2, the AV Foundation framework (AVFoundation.framework) contains Objective-C classes for playing audio content. You can use this support to play file- or memory-based sounds of any duration. You can play multiple sounds simultaneously and control various playback aspects of each sound. In iOS 3.0 and later, this framework also includes support for recording audio and managing audio session information.
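A minimal sketch of file-based playback with this framework’s AVAudioPlayer class follows; the sound file name (“chime.caf”) is a hypothetical resource assumed to be in the application bundle.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: play a sound file from the application bundle with AVAudioPlayer.
// "chime.caf" is a hypothetical resource name.
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"chime"
                                          withExtension:@"caf"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL
                                                               error:&error];
if (player != nil) {
    player.numberOfLoops = 0;   // play the sound once
    [player prepareToPlay];     // preload buffers to minimize playback latency
    [player play];              // asynchronous; retain the player while it plays
} else {
    NSLog(@"Could not create player: %@", error);
}
```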

In iOS 4.0 and later, the services offered by this framework were expanded significantly to include:

The AV Foundation framework is a single source for recording and playing back audio and video in iOS. This framework also provides much more sophisticated support for handling and managing media items.

For more information about the classes of the AV Foundation framework, see AV Foundation Framework Reference.

Core Audio

Native support for audio is provided by the Core Audio family of frameworks, which are listed in Table 3-1. Core Audio is a C-based interface that supports the manipulation of stereo-based audio. You can use Core Audio in iOS to generate, record, mix, and play audio in your applications. You can also use Core Audio to access the vibrate capability on devices that support it.

Table 3-1  Core Audio frameworks

CoreAudio.framework: Defines the audio data types used throughout Core Audio.

AudioToolbox.framework: Provides playback and recording services for audio files and streams. This framework also provides support for managing audio files, playing system alert sounds, and triggering the vibrate capability on some devices.

AudioUnit.framework: Provides services for using the built-in audio units, which are audio processing modules.

For more information about Core Audio, see Core Audio Overview. For information about how to use the Audio Toolbox framework to play sounds, see Audio Queue Services Programming Guide and Audio Toolbox Framework Reference.
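As a brief sketch of the Audio Toolbox services mentioned above, the following plays a short alert sound and triggers the vibrate capability. (The sound file name “tap.caf” is a hypothetical resource; on devices without vibration hardware, the vibrate call does nothing.)

```objc
#import <AudioToolbox/AudioToolbox.h>

// Sketch: play a short system alert sound from the application bundle.
// "tap.caf" is a hypothetical short sound file.
NSURL *tapURL = [[NSBundle mainBundle] URLForResource:@"tap"
                                        withExtension:@"caf"];
SystemSoundID tapSound = 0;
AudioServicesCreateSystemSoundID((CFURLRef)tapURL, &tapSound);
AudioServicesPlaySystemSound(tapSound);

// On devices that support it, this constant triggers the vibrate
// capability; on other devices, the call is ignored.
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
```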

Core Graphics Framework

The Core Graphics framework (CoreGraphics.framework) contains the interfaces for the Quartz 2D drawing API. Quartz is the same advanced, vector-based drawing engine that is used in Mac OS X. It provides support for path-based drawing, anti-aliased rendering, gradients, images, colors, coordinate-space transformations, and PDF document creation, display, and parsing. Although the API is C based, it uses object-based abstractions to represent fundamental drawing objects, making it easy to store and reuse your graphics content.
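The following sketch shows path-based drawing in the drawRect: method of a custom UIView subclass; the view class and colors are illustrative, not part of any framework.

```objc
#import <UIKit/UIKit.h>

// Sketch: custom Quartz 2D drawing in a hypothetical UIView subclass.
@interface HypotheticalBadgeView : UIView
@end

@implementation HypotheticalBadgeView

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGRect ellipseRect = CGRectInset(self.bounds, 4.0, 4.0);

    // Fill an elliptical path with a solid color.
    CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
    CGContextAddEllipseInRect(context, ellipseRect);
    CGContextFillPath(context);

    // Stroke the same ellipse with an anti-aliased outline.
    CGContextSetStrokeColorWithColor(context, [UIColor whiteColor].CGColor);
    CGContextSetLineWidth(context, 2.0);
    CGContextAddEllipseInRect(context, ellipseRect);
    CGContextStrokePath(context);
}

@end
```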

For more information on how to use Quartz to draw content, see Quartz 2D Programming Guide and Core Graphics Framework Reference.

Core Text Framework

Introduced in iOS 3.2, the Core Text framework (CoreText.framework) contains a set of simple, high-performance C-based interfaces for laying out text and handling fonts. The Core Text framework provides a complete text layout engine that you can use to manage the placement of text on the screen. The text you manage can also be styled with different fonts and rendering attributes.

This framework is intended for use by applications that require sophisticated text handling capabilities, such as word processing applications. If your application requires only simple text input and display, you should continue to use the existing classes of the UIKit framework.
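As a sketch of the C-based interfaces, the following lays out and draws a single styled line of text inside a view’s drawRect: method. (The coordinate-system flip is needed because Core Text, like Quartz, uses an origin at the lower left, unlike UIKit.)

```objc
#import <CoreText/CoreText.h>
#import <UIKit/UIKit.h>

// Sketch: draw one line of styled text with Core Text in drawRect:.
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Core Text draws in a flipped coordinate system relative to UIKit.
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextTranslateCTM(context, 0.0, self.bounds.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);

    CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 24.0, NULL);
    NSDictionary *attributes =
        [NSDictionary dictionaryWithObject:(id)font
                                    forKey:(id)kCTFontAttributeName];
    NSAttributedString *string =
        [[NSAttributedString alloc] initWithString:@"Hello, Core Text"
                                        attributes:attributes];

    CTLineRef line = CTLineCreateWithAttributedString((CFAttributedStringRef)string);
    CGContextSetTextPosition(context, 10.0, 10.0);
    CTLineDraw(line, context);

    CFRelease(line);
    [string release];
    CFRelease(font);
}
```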

For more information about using the Core Text interfaces, see Core Text Programming Guide and Core Text Reference Collection.

Core Video Framework

Introduced in iOS 4.0, the Core Video framework (CoreVideo.framework) provides buffer and buffer pool support for Core Media. Most applications should never need to use this framework directly.

Image I/O Framework

Introduced in iOS 4.0, the Image I/O framework (ImageIO.framework) provides interfaces for importing and exporting image data and image metadata. This framework is built on top of the Core Graphics data types and functions and supports all of the standard image types available in iOS.
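A minimal sketch of importing an image and its metadata follows; the file name (“photo.jpg”) is a hypothetical bundle resource.

```objc
#import <ImageIO/ImageIO.h>

// Sketch: read an image and its metadata with Image I/O.
// "photo.jpg" is a hypothetical file in the application bundle.
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"photo"
                                          withExtension:@"jpg"];
CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)imageURL, NULL);
if (source != NULL) {
    // Image properties (dimensions, EXIF, and so on) come back as a dictionary.
    CFDictionaryRef properties = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CGImageRef image = CGImageSourceCreateImageAtIndex(source, 0, NULL);

    // ... use the CGImage with Core Graphics or UIKit ...

    if (image) CGImageRelease(image);
    if (properties) CFRelease(properties);
    CFRelease(source);
}
```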

For more information about the functions and data types of this framework, see Image I/O Reference Collection.

Media Player Framework

The Media Player framework (MediaPlayer.framework) provides high-level support for playing audio and video content from your application. You can use this framework to play back video using a standard system interface. In iOS 3.0, support was added for accessing the user’s iTunes library. With this support, you can play music tracks and playlists, search for songs, and present a media picker interface to the user.

In iOS 3.2, changes were made to this framework to support the playback of video from a resizable view. (Previously, only full-screen support was available.) In addition, numerous interfaces were added to support configuring and managing movie playback.
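The resizable-view playback mentioned above can be sketched as follows; the movie file name and the presenting view controller are illustrative.

```objc
#import <MediaPlayer/MediaPlayer.h>

// Sketch: play a bundled movie with MPMoviePlayerController.
// "intro.mp4" and viewController are hypothetical.
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"intro"
                                          withExtension:@"mp4"];
MPMoviePlayerController *player =
    [[MPMoviePlayerController alloc] initWithContentURL:movieURL];

// In iOS 3.2 and later, the player's view can be embedded in your
// own view hierarchy and resized, instead of always playing full screen.
player.view.frame = viewController.view.bounds;
[viewController.view addSubview:player.view];
[player play];
```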

For information about the classes of the Media Player framework, see Media Player Framework Reference. For information on how to use these classes to access the user’s iTunes library, see iPod Library Access Programming Guide.

OpenAL Framework

In addition to Core Audio, iOS includes support for the Open Audio Library (OpenAL). The OpenAL interface is a cross-platform standard for delivering positional audio in applications. You can use it to implement high-performance, high-quality audio in games and other programs that require positional audio output. Because OpenAL is a cross-platform standard, the code modules you write using OpenAL on iOS can be ported to run on many other platforms.
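The positional model can be sketched as follows: a source is placed in 3D space relative to the listener, and OpenAL spatializes the audio accordingly. (Loading PCM data into a buffer is omitted here for brevity.)

```objc
#import <OpenAL/al.h>
#import <OpenAL/alc.h>

// Sketch: set up an OpenAL context and position a source in 3D space.
ALCdevice *device = alcOpenDevice(NULL);             // default output device
ALCcontext *context = alcCreateContext(device, NULL);
alcMakeContextCurrent(context);

ALuint source;
alGenSources(1, &source);
alSource3f(source, AL_POSITION, -2.0f, 0.0f, 1.0f);  // to the listener's left

// The listener sits at the origin by default; sounds played through the
// source are spatialized relative to it.
alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

// ... attach a buffer of audio data and call alSourcePlay(source) ...
```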

For information about OpenAL, including how to use it, see http://www.openal.org.

OpenGL ES Framework

The OpenGL ES framework (OpenGLES.framework) provides tools for drawing 2D and 3D content. It is a C-based framework that works closely with the device hardware to provide high frame rates for full-screen game-style applications.

You always use the OpenGL ES framework in conjunction with the EAGL interfaces. These interfaces are part of the OpenGL ES framework and provide the interface between your OpenGL ES drawing code and the native window objects of your application.

In iOS 3.0 and later, the OpenGL ES framework includes support for both the OpenGL ES 2.0 and the OpenGL ES 1.1 interface specifications. The 2.0 specification provides support for fragment and vertex shaders and is available only on specific iOS–based devices running iOS 3.0 and later. Support for OpenGL ES 1.1 is available on all iOS–based devices and in all versions of iOS.
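A common pattern that follows from the availability rules above is to request an OpenGL ES 2.0 context and fall back to 1.1 when the hardware does not support it:

```objc
#import <OpenGLES/EAGL.h>

// Sketch: create an OpenGL ES 2.0 rendering context, falling back to
// OpenGL ES 1.1 on devices that do not support the 2.0 specification.
EAGLContext *context =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (context == nil) {
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
}
[EAGLContext setCurrentContext:context];

// Drawable setup (renderbuffers attached to a CAEAGLLayer) follows.
```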

For information on how to use OpenGL ES in your applications, see OpenGL ES Programming Guide for iOS. For reference information, see OpenGL ES Framework Reference.

Quartz Core Framework

The Quartz Core framework (QuartzCore.framework) contains the Core Animation interfaces. Core Animation is an advanced animation and compositing technology that uses an optimized rendering path to implement complex animations and visual effects. It provides a high-level, Objective-C interface for configuring animations and effects that are then rendered in hardware for performance. Core Animation is integrated into many parts of iOS, including UIKit classes such as UIView, providing animations for many standard system behaviors. You can also use the Objective-C interface in this framework to create custom animations.
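Both the explicit and the UIKit-integrated styles can be sketched briefly; the view (myView) and the values used are illustrative.

```objc
#import <QuartzCore/QuartzCore.h>

// Sketch: an explicit Core Animation animation on a view's layer.
// myView is a hypothetical UIView instance.
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = [NSNumber numberWithFloat:1.0f];
fade.toValue   = [NSNumber numberWithFloat:0.0f];
fade.duration  = 0.5;   // seconds; rendered in hardware by Core Animation
[myView.layer addAnimation:fade forKey:@"fade"];

// UIKit exposes the same machinery through UIView animation blocks:
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationDuration:0.5];
myView.alpha = 0.0;
[UIView commitAnimations];
```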

For more information on how to use Core Animation in your applications, see Core Animation Programming Guide and Core Animation Reference Collection.




Last updated: 2010-07-08
