This article summarizes the developer-related features introduced in iOS 4.0. This version of the operating system runs on iPhone and iPod touch only and does not run on iPad. In addition to describing the new features, this article lists the documents that describe those features in more detail.
Note: iOS 4.0 does not support iPad. It runs only on iPhone and iPod touch devices. You must use a different version of the iPhone SDK to develop iPad applications.
For the latest updates and information, you should also see iOS 4.0 Release Notes. For a list of API differences between iOS 4.0 and earlier versions of iOS, see iOS 4.0 API Diffs.
Applications built using iPhone SDK 4.0 or later (and running in iOS 4.0 and later) are no longer terminated when the user presses the Home button; instead, they now shift to a background execution context. For many applications, this means that the application enters a suspended state of execution shortly after entering the background. Keeping the application in memory avoids the subsequent launch cycle and allows an application to simply reactivate itself, which improves the overall user experience. And suspending the application improves overall system performance by minimizing power usage and giving more execution time to the foreground application.
Although most applications are suspended shortly after moving to the background, applications that need to continue working in the background may do so using one of the following techniques:
An application can request a finite amount of time to complete some important task.
An application can declare itself as supporting specific services that require regular background execution time.
An application can use local notifications to generate user alerts at designated times, whether or not the application is running.
Regardless of whether your application is suspended or continues running in the background, supporting multitasking requires some additional work on your part. Background applications can still be terminated under certain conditions (such as when available memory is low), so applications must be ready to exit at any time. As a result, many of the tasks you used to perform at quit time must now be performed when your application moves to the background. Supporting these behaviors requires implementing some new methods in your application delegate to respond to application state transitions.
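For example, work that an application previously performed in applicationWillTerminate: now typically belongs in the delegate method that runs when the application enters the background. A minimal sketch of the two key transitions (the helper methods are hypothetical placeholders):

```objc
// In your UIApplicationDelegate implementation (iOS 4 and later).
- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Save user data and state here; the application may be
    // terminated at any time while it is in the background.
    [self saveApplicationState];      // hypothetical helper
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
    // Undo any changes made when entering the background.
    [self refreshUserInterface];      // hypothetical helper
}
```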
For more information on how to handle the new background state transitions, and for information on how to continue running in the background, see iOS Application Programming Guide.
The following sections describe the technologies you can use to enhance your application’s user experience.
Local notifications complement the existing push notifications by giving applications an avenue for generating the notifications locally instead of relying on an external server. Background applications can use local notifications as a way to get a user’s attention when important events happen. For example, a navigation application running in the background can use local notifications to alert the user when it is time to make a turn. Applications can also schedule the delivery of local notifications for a future date and time and have those notifications delivered even if the application is not running.
The advantage of local notifications is that they are independent of your application. Once a notification is scheduled, the system manages its delivery. Your application does not even have to be running when the notification is delivered.
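As a sketch, scheduling a local notification for delivery one minute in the future might look like this (the alert text is illustrative):

```objc
UILocalNotification *notification = [[UILocalNotification alloc] init];
notification.fireDate = [NSDate dateWithTimeIntervalSinceNow:60.0];
notification.alertBody = @"Time to make a turn.";
notification.soundName = UILocalNotificationDefaultSoundName;

// The system now owns delivery; the application need not be running.
[[UIApplication sharedApplication] scheduleLocalNotification:notification];
[notification release];
```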
For more information about using local notifications, see Local and Push Notification Programming Guide.
The Event Kit framework (EventKit.framework) provides an interface for accessing calendar events on a user’s device. You can use this framework to get existing events and add new events to the user’s calendar. Calendar events can include alarms that you can configure with rules for when they should be delivered. In addition to using Event Kit for creating new events, you can use the view controllers of the Event Kit UI framework (EventKitUI.framework) to present standard system interfaces for viewing and editing events.
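A minimal sketch of creating and saving an event (titles and times are illustrative):

```objc
EKEventStore *store = [[EKEventStore alloc] init];
EKEvent *event = [EKEvent eventWithEventStore:store];
event.title = @"Lunch";                                       // illustrative
event.startDate = [NSDate dateWithTimeIntervalSinceNow:3600.0];
event.endDate = [NSDate dateWithTimeIntervalSinceNow:5400.0];
event.calendar = store.defaultCalendarForNewEvents;

NSError *error = nil;
if (![store saveEvent:event span:EKSpanThisEvent error:&error]) {
    NSLog(@"Could not save event: %@", error);
}
[store release];
```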
For more information about the classes and methods of these frameworks, see Event Kit Framework Reference and Event Kit UI Framework Reference.
The Core Motion framework (CoreMotion.framework) provides a single set of interfaces for accessing all motion-based data available on a device. The framework supports accessing both raw and processed accelerometer data using a new set of block-based interfaces. For devices with a built-in gyroscope, you can retrieve the raw gyro data as well as processed data reflecting the attitude and rotation rates of the device. You can use both the accelerometer and gyro-based data for games or other applications that use motion as input or as a way to enhance the overall user experience.
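For example, a sketch of starting block-based gyroscope updates on a device that has a gyro:

```objc
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.gyroAvailable) {
    motionManager.gyroUpdateInterval = 1.0 / 60.0;   // 60 Hz
    [motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                               withHandler:^(CMGyroData *gyroData, NSError *error) {
        CMRotationRate rate = gyroData.rotationRate;
        // rate.x, rate.y, and rate.z are rotation rates in radians per second.
    }];
}
```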
For more information about the classes and methods of this framework, see “Motion Events” in Event Handling Guide for iOS.
Applications that work with sensitive user data can now take advantage of the built-in encryption available on some devices to protect that data. When your application designates a particular file as protected, the system stores that file on-disk in an encrypted format. While the device is locked, the contents of the file are inaccessible to both your application and to any potential intruders. However, when the device is unlocked by the user, a decryption key is created to allow your application to access the file.
Implementing data protection requires you to be deliberate about how you create and manage the data you want to protect. Applications must be designed to secure the data at creation time and to be prepared for changes in access to that data when the user locks and unlocks the device.
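For example, a file can be marked as protected by assigning the appropriate attribute when it is created (filePath and sensitiveData are hypothetical placeholders):

```objc
// filePath and sensitiveData are assumed to exist in the surrounding code.
NSDictionary *attributes =
    [NSDictionary dictionaryWithObject:NSFileProtectionComplete
                                forKey:NSFileProtectionKey];
BOOL created = [[NSFileManager defaultManager] createFileAtPath:filePath
                                                       contents:sensitiveData
                                                     attributes:attributes];
// While the device is locked, reads and writes to this file will fail.
```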
For more information about how to add data protection to the files of your application, see “Implementing Standard Application Behaviors” in iOS Application Programming Guide.
The Core Telephony framework (CoreTelephony.framework) provides interfaces for interacting with phone-based information on devices that have a cellular radio. Applications can use this framework to get information about a user’s cellular service provider. Applications interested in cellular call events can also be notified when those events occur.
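A brief sketch of retrieving carrier information:

```objc
CTTelephonyNetworkInfo *networkInfo = [[CTTelephonyNetworkInfo alloc] init];
CTCarrier *carrier = networkInfo.subscriberCellularProvider;
NSLog(@"Carrier: %@ (ISO country code: %@)",
      carrier.carrierName, carrier.isoCountryCode);
[networkInfo release];
```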
For more information about using the classes and methods of this framework, see Core Telephony Framework Reference.
You can use iAd (iAd.framework) to deliver banner-based advertisements from your application. Advertisements are incorporated into standard views that you integrate into your user interface and present when you want. The views themselves work with Apple’s ad service to automatically handle all the work associated with loading and presenting the ad content and responding to taps in those ads.
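As a minimal sketch, adding a banner view to a view controller’s interface might look like this (the delegate callbacks, declared in ADBannerViewDelegate, tell you when ad content loads or fails to load):

```objc
ADBannerView *banner = [[ADBannerView alloc] initWithFrame:CGRectZero];
banner.delegate = self;   // self is assumed to conform to ADBannerViewDelegate
[self.view addSubview:banner];
[banner release];
```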
For more information about using iAd in your applications, see iAd Framework Reference.
The following sections describe the new graphics and media-related technologies you can incorporate into your applications.
For the most part, applications running on devices with high-resolution screens should work with little or no modification. The coordinate values you specify during drawing or when manipulating views are all mapped to a logical coordinate system, which is decoupled from the underlying screen resolution. Any content you draw is automatically scaled as needed to support high-resolution screens. For vector-based drawing code, the system frameworks automatically use any extra pixels to improve the crispness of your content. And if you use images in your application, UIKit provides support for loading high-resolution variants of your existing images automatically.
For detailed information about how to support high-resolution screens, see “Supporting High-Resolution Screens” in iOS Application Programming Guide.
The Quick Look framework (QuickLook.framework) provides a direct interface for previewing the contents of files your application does not support directly. This framework is intended primarily for applications that download files from the network or that otherwise work with files from unknown sources. After obtaining the file, you use the view controller provided by this framework to display the contents of that file directly in your user interface.
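A sketch of presenting the preview controller from an existing view controller (self is assumed to implement the QLPreviewControllerDataSource methods that supply the files to preview):

```objc
QLPreviewController *previewController = [[QLPreviewController alloc] init];
previewController.dataSource = self;   // supplies the items to preview
[self presentModalViewController:previewController animated:YES];
[previewController release];
```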
For more information about the classes and methods of this framework, see Quick Look Framework Reference.
The AV Foundation framework (AVFoundation.framework) is for applications that need to go beyond the music and movie playback features found in the Media Player framework. Originally introduced in iOS 3.0, this framework has been expanded in iOS 4.0 to include significant new capabilities, substantially broadening its usage beyond basic audio playback and recording capabilities. Specifically, this framework now includes support for the following features:
Media asset management
Media editing
Movie capture
Movie playback
Track management
Metadata management for media items
Stereophonic panning
Precise synchronization between sounds
An Objective-C interface for determining details about sound files, such as the data format, sample rate, and number of channels
The AV Foundation framework is a single source for recording and playing back audio and video in iOS. This framework also provides much more sophisticated support for handling and managing media items.
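As a sketch of the new playback support, loading an asset and playing it might look like this (movieURL is a hypothetical NSURL pointing at a media file):

```objc
// movieURL is assumed to exist in the surrounding code.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:item];
[player play];
```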
For more information about the classes and methods of the AV Foundation framework, see AV Foundation Framework Reference.
The Assets Library framework (AssetsLibrary.framework) provides a query-based interface for retrieving a user’s photos and videos. Using this framework, you can access the same assets that are nominally managed by the Photos application, including items in the user’s saved photos album and any photos and videos that were imported onto the device. You can also save new photos and videos back to the user’s saved photos album.
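For example, a sketch of enumerating the assets in the saved photos album:

```objc
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group != nil) {   // group is nil when enumeration finishes
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset != nil) {
                // Work with the asset's representation or thumbnail.
            }
        }];
    }
} failureBlock:^(NSError *error) {
    NSLog(@"Could not access the photo library: %@", error);
}];
```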
For more information about the classes and methods of this framework, see Assets Library Framework Reference.
The Image I/O framework (ImageIO.framework) provides interfaces for importing and exporting image data and image metadata. This framework is built on top of the Core Graphics data types and functions and supports all of the standard image types available in iOS.
For more information about the functions and data types of this framework, see Image I/O Reference Collection.
The Core Media framework (CoreMedia.framework) provides the low-level media types used by AV Foundation. Most applications should never need to use this framework, but it is provided for those few developers who need more precise control over the creation and presentation of audio and video content.
For more information about the functions and data types of this framework, see Core Media Framework Reference.
The Core Video framework (CoreVideo.framework) provides buffer and buffer pool support for Core Media. Most applications should never need to use this framework directly.
The following sections describe the new lower-level technologies and features you can incorporate into your applications.
Block objects are a C-level language construct that you can incorporate into your C and Objective-C code. A block object is essentially an anonymous function and the data that goes with that function, something which in other languages is sometimes called a closure or lambda. Blocks are particularly useful as callbacks or in places where you need a way of easily combining both the code to be executed and the associated data.
In iOS, blocks are commonly used in the following scenarios:
As a replacement for delegates and delegate methods
As a replacement for callback functions
To implement completion handlers for one-time operations
To facilitate performing a task on all the items in a collection
Together with dispatch queues, to perform asynchronous tasks
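For example, a block that captures a local variable from its enclosing scope:

```objc
int multiplier = 7;
int (^myBlock)(int) = ^(int num) {
    // The block captures the value of multiplier from the enclosing scope.
    return num * multiplier;
};
NSLog(@"%d", myBlock(3));   // Prints 21.
```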
For an introduction to block objects and how you use them, see A Short Practical Guide to Blocks. For more information about blocks, see Blocks Programming Topics.
Grand Central Dispatch (GCD) is a BSD-level technology that you use to manage the execution of tasks in your application. GCD combines an asynchronous programming model with a highly optimized core to provide a convenient (and more efficient) alternative to threading. GCD also provides convenient alternatives for many types of low-level tasks, such as reading and writing file descriptors, implementing timers, monitoring signals and process events, and more.
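A common pattern combines GCD with blocks: perform work on a concurrent queue, then hop back to the main queue to touch the user interface (the two helper methods here are hypothetical placeholders):

```objc
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Perform expensive work off the main thread.
    NSData *result = [self fetchData];             // hypothetical helper
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit calls must happen on the main thread.
        [self updateViewsWithData:result];         // hypothetical helper
    });
});
```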
For more information about how to use GCD in your applications, see Concurrency Programming Guide. For information about specific GCD functions, see Grand Central Dispatch (GCD) Reference.
The Accelerate framework (Accelerate.framework) contains interfaces for performing math, big-number, and DSP calculations, among others. The advantage of using this framework over writing your own versions of these libraries is that it is optimized for the different hardware configurations present in iOS-based devices. Therefore, you can write your code once and be assured that it runs efficiently on all devices.
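For example, a sketch using the vDSP portion of the framework to multiply every element of a vector by a scalar:

```objc
#include <Accelerate/Accelerate.h>

float input[4]  = { 1.0f, 2.0f, 3.0f, 4.0f };
float scalar    = 2.0f;
float output[4];

// Multiply each element of input by scalar; strides of 1, four elements.
vDSP_vsmul(input, 1, &scalar, output, 1, 4);
// output is now { 2.0, 4.0, 6.0, 8.0 }
```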
For more information about the functions of the Accelerate framework, see Accelerate Framework Reference.
The following sections describe the improvements to the Xcode tools and the support for developing iOS applications.
Xcode 3.2.3 introduces automatic device and provisioning-profile management in the Organizer window. With automatic device provisioning enabled, you can install applications on your device for debugging and testing without having to log in to your team portal to register the device and download a provisioning profile.
Note: You still need to log in to your team portal to create provisioning profiles with specific application IDs for in-app purchase and push notifications. However, once created, those provisioning profiles will also be managed by Xcode if automatic device provisioning is enabled.
For more information about using Xcode, see iOS Development Guide.
The Instruments application now provides support for automating the testing of your iOS applications. The built-in Automation instrument works from scripts (written in JavaScript) that you provide to drive the simulation of events in your application. These synthetic events are generated with the help of the accessibility interfaces built into iOS and integrated into all existing UIKit views. You can use this instrument to improve your testing process and deliver more robust applications.
For information about how to use the Automation instrument, see Instruments User Guide. For information about the JavaScript objects and commands you use in your scripts, see UI Automation Reference Collection.
The following existing frameworks and technologies include additional incremental changes. For a complete list of new interfaces, see iOS 4.0 API Diffs.
The UIKit framework includes the following enhancements:
The UIApplication class and UIApplicationDelegate protocol include new methods for scheduling local notifications and for supporting multitasking.
Drawing to a graphics context in UIKit is now thread-safe. Specifically:
The routines used to access and manipulate the graphics context can now correctly handle contexts residing on different threads.
String and image drawing is now thread-safe.
Color and font objects can now safely be used from multiple threads.
The UIImagePickerController class includes methods for programmatically starting and stopping video capture. It also includes options for selecting which camera you want to use on a device and for enabling a built-in flash.
The UILocalNotification class supports the configuration of local notifications; see “Local Notifications.”
The UIView class includes new block-based methods for implementing animations.
The UIWindow class has a new rootViewController property that you can use to change the contents of the window.
Media applications can now receive events related to the controls on an attached set of headphones. You can use these events to control the playback of media-related items.
Several new accessibility interfaces help you make some UI elements more accessible and allow you to customize your application experience specifically for VoiceOver users:
The UIAccessibilityAction protocol makes it easy for VoiceOver users to adjust the value of UI elements, such as pickers and sliders.
The UIPickerViewAccessibilityDelegate protocol enables access to the individual components of a picker.
The UIAccessibilityFocus protocol allows you to find out when VoiceOver is focused on an element, so you can help users avoid making unnecessary taps.
The UIAccessibilityTraitStartsMediaSession trait allows you to prevent VoiceOver from speaking during a media session that should not be interrupted.
New interfaces in the UIAccessibility protocol allow you to specify the language in which labels and hints are spoken, and to provide announcements describing events that don’t update the application’s UI in a way that would be perceptible to VoiceOver users.
The UINib class provides a way to instantiate multiple sets of objects efficiently from the same nib file.
For information about the classes of the UIKit framework, see UIKit Framework Reference.
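As an illustration of the new block-based UIView animation methods, a fade-out followed by removal can be written as a single call (myView is a hypothetical existing view):

```objc
[UIView animateWithDuration:0.3
                 animations:^{
                     myView.alpha = 0.0;   // animated change
                 }
                 completion:^(BOOL finished) {
                     [myView removeFromSuperview];
                 }];
```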
The Foundation framework includes the following enhancements:
Most delegate methods are now declared in formal protocols instead of as categories on NSObject.
Block-based variants are now available for many types of operations.
There is new support for creating and formatting date information in NSDate and NSDateFormatter.
The NSDateComponents class adds support for specifying time zone and quarter information.
There is support for regular-expression matching using the NSRegularExpression, NSDataDetector, and NSTextCheckingResult classes.
The NSBlockOperation class allows you to add blocks to operation queues.
You can use the NSFileManager class to mark files as protected; see “Data Protection.”
The NSFileWrapper class allows you to work with package-based document types.
The NSOrthography class describes the linguistic content of a piece of text.
The NSCache class provides support for storing and managing temporary data.
The URL-related classes have been updated so that you can now pipeline URL requests and set request priorities.
For information about the classes of the Foundation framework, see Foundation Framework Reference.
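As a sketch of the new regular-expression support, counting the numeric runs in a string might look like this:

```objc
NSError *error = nil;
NSRegularExpression *regex =
    [NSRegularExpression regularExpressionWithPattern:@"\\d+"
                                              options:0
                                                error:&error];
NSString *string = @"Call 555-1234 before 5 PM";
NSUInteger count = [regex numberOfMatchesInString:string
                                          options:0
                                            range:NSMakeRange(0, [string length])];
// count is 3: the runs "555", "1234", and "5".
```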
The OpenGL ES framework includes the following enhancements:
The APPLE_framebuffer_multisample extension enables full-scene anti-aliasing.
The EXT_framebuffer_discard extension can be used to improve the performance of applications that use depth buffers or multisample framebuffers.
The APPLE_texture_max_level and EXT_shader_texture_lod extensions provide more control over texture sampling.
The OES_vertex_array_object extension (http://www.khronos.org/registry/gles/extensions/OES/OES_vertex_array_object.txt) allows caching of vertex array state, to decrease driver overhead.
The OES_depth_texture extension enables rendering real-time shadows using shadow maps.
The OES_texture_float (http://www.khronos.org/registry/gles/extensions/OES/OES_texture_float.txt) and OES_texture_half_float (http://www.khronos.org/registry/gles/extensions/OES/OES_texture_float.txt) extensions add texture formats with floating-point components to enable high dynamic range rendering.
The APPLE_rgb_422 extension (http://www.opengl.org/registry/specs/APPLE/rgb_422.txt) enables texturing from some common video formats.
Performance of texture creation and modification has been significantly improved.
Driver performance has been generally improved.
The Game Kit framework includes a beta implementation of a centralized service called Game Center. This service provides game developers with a standard way to implement the following features:
Aliases allow users to create their own online persona. Users log in to Game Center and interact with other players anonymously through their alias. Players can set status messages as well as mark specific people as their friends.
Leader boards allow your application to post scores to Game Center and retrieve them later.
Matchmaking allows players to connect with other players with Game Center accounts.
Important: Game Center is available to developers only in iOS 4.0. It is introduced as a developer-only feature so that you can provide feedback as you implement and test Game Center features in your applications. However, Game Center is not a user feature in iOS 4.0, and you should not deploy applications that use it to the App Store.
For information about the classes of the Game Kit framework, see Game Kit Framework Reference.
The Core Location framework now supports the following features:
A location monitoring service that tracks significant changes using only cellular information. This solution offers a lower-power alternative for determining the user’s location.
The ability to define arbitrary regions and detect boundary crossings into or out of those regions. This feature can be used for proximity detection regardless of whether the application is running.
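The two new services above can be sketched as follows (the coordinates and identifier are illustrative; self is assumed to implement the CLLocationManagerDelegate methods that receive updates and boundary-crossing events):

```objc
CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;

// Low-power, cell-based location updates:
[manager startMonitoringSignificantLocationChanges];

// Boundary-crossing detection for a circular region:
CLLocationCoordinate2D center = CLLocationCoordinate2DMake(37.3317, -122.0307);
CLRegion *region = [[CLRegion alloc] initCircularRegionWithCenter:center
                                                           radius:1000.0
                                                       identifier:@"Campus"];
[manager startMonitoringForRegion:region
                  desiredAccuracy:kCLLocationAccuracyHundredMeters];
[region release];
```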
For information about the classes of the Core Location framework, see Core Location Framework Reference.
The Map Kit framework includes the following enhancements:
Support for draggable map annotations
Support for map overlays
Draggable map annotations make it much easier to reposition those annotations after they have been added to a map. The Map Kit framework handles most of the touch events associated with initiating, tracking, and ending a drag operation. However, the annotation view must work in conjunction with the map view delegate to ensure that dragging of the annotation view is supported.
Map overlays provide a way to create more complex types of annotations. Instead of being pinned to a single point, an overlay can represent a path or shape that spans a wider region. You can use overlays to layer information such as bus routes, election maps, park boundaries, and weather maps on top of the map.
For information about the functions and types of the Map Kit framework, see Map Kit Framework Reference.
The Message UI framework includes a new MFMessageComposeViewController class for composing SMS messages. This class manages a standard system interface for composing and sending SMS messages. In contrast with sending SMS messages using a specially formatted URL, this class allows you to create and send the message entirely from within your application.
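A minimal sketch of presenting the composer from a view controller (the recipient and body text are illustrative):

```objc
if ([MFMessageComposeViewController canSendText]) {
    MFMessageComposeViewController *composer =
        [[MFMessageComposeViewController alloc] init];
    composer.messageComposeDelegate = self;   // MFMessageComposeViewControllerDelegate
    composer.recipients = [NSArray arrayWithObject:@"555-0123"];
    composer.body = @"Hello from my application!";
    [self presentModalViewController:composer animated:YES];
    [composer release];
}
```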
For more information about the classes of the Message UI framework, see Message UI Framework Reference.
The Core Graphics framework includes the following enhancements:
The ability to embed metadata into PDF files using the CGPDFContextAddDocumentMetadata function
Support for creating color spaces using an ICC profile
Graphics context support for font smoothing and fine-grained pixel manipulation
For information about the functions and types of the Core Graphics framework, see Core Graphics Framework Reference.
The International Components for Unicode (ICU) libraries were updated to version 4.4. ICU is an open-source project for Unicode support and software internationalization. The installed version of ICU includes only a subset of the header files that are part of the broader ICU library. Specifically, iOS includes only the headers used to support regular expressions.
For more information about using the functions of the ICU 4.4 library, see the documentation at http://site.icu-project.org/.
Although iOS 3.2 does not run on iPhone and iPod touch devices, many of the features introduced in that version of the operating system are also supported in iOS 4.0. Specifically, iOS 4.0 supports:
Custom input views
Connecting external displays
File-sharing support
Gesture recognizers
Core Text for text layout and rendering
Text input through integration with the keyboard
Custom fonts
ICU Regular Expressions
Document types
PDF generation
Xcode Tools changes
UIKit framework changes
Media Player framework changes
Core Animation changes
Foundation framework changes
The features that are not supported are the new controls and classes designed specifically for iPad. For more information about these features, see their descriptions in “iOS 3.2.”
Last updated: 2010-07-08