
The Mac OS X Accessibility Protocol

In Mac OS X version 10.2, Apple introduced the accessibility framework. This framework includes:

This chapter introduces the accessibility protocol. It describes:

If you’re an application developer, you should read this chapter to learn about the Mac OS X accessibility protocol. Then, if you’re ready to access-enable your application, you should read Accessibility Programming Guidelines for Cocoa or Accessibility Programming Guidelines for Carbon.

Important: If your application uses only standard, noncustom Carbon or Cocoa objects, most of your application is already accessible. There remain a few things you must do, however. This chapter provides fundamental information about the accessibility protocol that helps you understand the reasons for these things.

Note: Java developers should implement the Java Accessibility API (the javax.accessibility package) to ensure their applications are accessible (both Swing and AWT interfaces are accessible). For more information on this API, see the Java 1.4.2 API reference in the Java Reference Library.

If you are developing an assistive application, you should read this chapter to learn how accessible applications represent themselves in Mac OS X. You’ll find out what information your assistive application can expect to get from an accessible application.

The Accessibility Model

An assistive application helps a user interact with the applications on the user’s computer. To do this, an assistive application must be able to access everything in an application’s user interface and perform all the application’s functions. To be accessible, therefore, an application must provide information about its user interface and capabilities in a standard manner that any assistive application or technology can understand.

This is a challenge because each application type has its own native way of representing its user interface. Cocoa applications, for example, use the NSWindow and NSControl classes to display windows and controls. A modern Carbon application, on the other hand, uses HIObject and HIView objects to implement its user interface. Other types of applications use other native constructs.

Apple solved this problem in Mac OS X version 10.2 with the introduction of a generic object, called an accessibility object. In an accessible application, a user interface element, such as a window, a control, or even the application itself, is represented by an accessibility object. To learn more about the accessibility object and the information it provides, see “The Accessibility Object.”

Accessibility objects provide a uniform representation of an application’s user interface elements, regardless of the application framework on which the application depends. Figure 3-1 shows how an assistive application communicates with different types of applications using the accessibility objects the applications provide.

Figure 3-1  Communication between an assistive application and other applications


The Mac OS X accessibility model represents an application’s user interface as a hierarchy of accessibility objects. For the most part, the hierarchy is defined by the parent-child relationships among accessibility objects. For example, an application’s dialog window might contain several buttons. The accessibility object representing the dialog contains a list of child accessibility objects, each representing a button in the dialog. In turn, each accessibility object representing one of the buttons knows its parent is the accessibility object representing the dialog.

Of course, the accessibility objects representing the menu bar and windows in an application are children of the application-level accessibility object. Even the application-level accessibility object has a parent, which is the system-wide accessibility object. An application never needs to worry about its system-wide parent because it is out of the application’s scope. On the other hand, an assistive application might query the system-wide accessibility object to find out which running application has keyboard focus.
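For example, an assistive application built on the AXUIElement C API (declared in the HIServices framework) might ask the system-wide accessibility object for the focused application. The following is a minimal sketch; the helper function name is illustrative, and it assumes access for assistive devices is enabled:

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: ask the system-wide accessibility object which running
       application currently has keyboard focus, then print its title. */
    static void PrintFocusedApplicationTitle(void)
    {
        AXUIElementRef systemWide = AXUIElementCreateSystemWide();
        CFTypeRef focusedApp = NULL;

        if (AXUIElementCopyAttributeValue(systemWide,
                                          kAXFocusedApplicationAttribute,
                                          &focusedApp) == kAXErrorSuccess) {
            CFTypeRef title = NULL;
            if (AXUIElementCopyAttributeValue((AXUIElementRef)focusedApp,
                                              kAXTitleAttribute,
                                              &title) == kAXErrorSuccess) {
                CFShow(title);          /* for example, "Safari" */
                CFRelease(title);
            }
            CFRelease(focusedApp);
        }
        CFRelease(systemWide);
    }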

Figure 3-2 shows the hierarchy of accessibility objects in a simple application.

Figure 3-2  The accessibility object hierarchy


A strength of the accessibility object hierarchy is that it can leave out implementation-specific details that are irrelevant to an assistive application and, by extension, to the user. For example, in Cocoa, a button in a window is usually implemented as a button cell within a button control, which is in a content view within a window. A user has no interest in this detailed containment hierarchy; she only needs to know that there’s a button in a window. If the application’s accessibility hierarchy contains an accessibility object for each of these intermediate objects, however, an assistive application has no choice but to present them to the user. This results in a poor user experience because the user is forced to take several steps to get from the window to the button. Figure 3-3 shows how such a complete containment hierarchy might look.

Figure 3-3  Complete containment hierarchy of a button in a window


To exclude this unnecessary information, the accessibility protocol allows an application to specify some accessibility objects as ignored. Continuing the Cocoa button example, the application can designate as ignored the accessibility objects representing the button control and the content view. This allows an application to present to an assistive application only the significant objects in its accessibility hierarchy. Figure 3-4 shows how the same hierarchy shown in Figure 3-3 might be presented to an assistive application.

Figure 3-4  Appropriate accessibility hierarchy of a button in a window

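In a Carbon application, for example, an intermediate view can be marked as ignored with a single call; Cocoa performs equivalent pruning automatically for its standard views. The following sketch assumes an HIView-based Carbon application and Mac OS X v10.4 or later:

    #include <Carbon/Carbon.h>

    /* Sketch: remove an intermediate container view from the accessibility
       hierarchy reported to assistive applications. Its children are still
       reported; they simply appear as children of the next unignored
       ancestor. */
    static OSStatus IgnoreContainerView(HIViewRef containerView)
    {
        return HIObjectSetAccessibilityIgnored((HIObjectRef)containerView, true);
    }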

An assistive application helps a user perform tasks by telling an application’s accessibility objects to perform actions. For the most part, actions correspond to things a user can do with a single mouse click. Each accessibility object contains information about which actions it supports, if any. The accessibility object representing a button, for example, supports the press action. When a user wants to press a button, he communicates this to the assistive application. The assistive application determines that the button’s accessibility object supports the press action and sends a request to the button to perform it.

The Mac OS X accessibility protocol defines only a handful of actions that accessibility objects can support. At first, this might seem restrictive, because applications can perform huge numbers of tasks. It’s essential to remember, however, that an assistive application helps a user drive an application’s user interface; it does not simulate the application’s functions. Therefore, an assistive application has no interest in how your application responds to a button press, for example. Its job is to tell the user that the button exists and to tell the application when the user wants to press it. It’s up to your application to respond appropriately to that button press, just as it would if the user had clicked the button with the mouse.
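A sketch of this exchange from the assistive application’s side, using the AXUIElement C API (the helper name is illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: confirm that an accessibility object supports the press
       action, then ask it to perform that action. The application responds
       exactly as if the user had clicked the button with the mouse. */
    static Boolean PressIfPossible(AXUIElementRef buttonElement)
    {
        CFArrayRef actionNames = NULL;
        Boolean pressed = false;

        if (AXUIElementCopyActionNames(buttonElement, &actionNames) == kAXErrorSuccess) {
            if (CFArrayContainsValue(actionNames,
                                     CFRangeMake(0, CFArrayGetCount(actionNames)),
                                     kAXPressAction)) {
                pressed = (AXUIElementPerformAction(buttonElement,
                                                    kAXPressAction) == kAXErrorSuccess);
            }
            CFRelease(actionNames);
        }
        return pressed;
    }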

The Accessibility Object

An accessibility object provides assistive applications with information about the user interface object it represents. This information includes the object’s position in the accessibility hierarchy, its position on the display, details about what it is, and what actions it can perform. In addition, an accessibility object responds to messages sent by assistive applications and sends out notifications that describe changes in its state.

This section describes the information an accessibility object provides and the actions it can perform, and it outlines the communication between accessibility objects and assistive applications.

Note: Carbon and Cocoa implement the accessibility object in different native ways. This section describes the accessibility object in general terms that are not specific to either application framework. For framework-specific details about the implementation of the accessibility object and how to use it, see Accessibility Programming Guidelines for Cocoa and Accessibility Programming Guidelines for Carbon.

Attributes

An accessibility object has many attributes associated with it. The number and kind of attributes vary depending on the type of user interface object the accessibility object represents. A few attributes are required for every accessibility object, but most are optional.

Attributes have values that assistive applications use to find out about the user interface object. For example, an assistive application gets the value of an accessibility object’s role attribute to find out what type of user interface object it represents.

Some attribute values are settable by assistive applications. An example is the value attribute in an accessibility object that represents an editable text field. When a user types in the text field, an assistive application sets the value of the value attribute to the text the user enters.
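From the assistive application’s side, attributes are discovered and tested with a few HIServices calls. A minimal sketch (function name illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: list an accessibility object's attributes and find out
       whether its value attribute can be set by an assistive application. */
    static void InspectElement(AXUIElementRef element)
    {
        CFArrayRef attributeNames = NULL;
        if (AXUIElementCopyAttributeNames(element, &attributeNames) == kAXErrorSuccess) {
            CFShow(attributeNames);     /* for example: AXRole, AXValue, AXParent, ... */
            CFRelease(attributeNames);
        }

        Boolean settable = false;
        if (AXUIElementIsAttributeSettable(element, kAXValueAttribute,
                                           &settable) == kAXErrorSuccess && settable) {
            /* The element's value (for example, the text of an editable
               text field) can be changed by the assistive application. */
        }
    }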

If you use standard, noncustom Cocoa or Carbon objects, most of the attribute values are already in place. There are a few attribute values, however, that you must provide because they contain application-specific information, such as the description of a user interface object’s function.

The AXAttributeConstants.h file in the HIServices framework defines all the accessibility object attributes in the Mac OS X accessibility protocol. The following sections describe some of the most common required attributes, paying particular attention to the attributes for which you must provide values:

The Role and Role Description Attributes

An accessibility object’s most important attribute is its role attribute. This is because the accessibility object’s role determines which other attributes the object contains and tells an assistive application how to handle it. You can think of the role as the accessibility object’s class—it defines a standard set of behaviors and capabilities to which the object conforms. For more information on the attributes associated with specific roles, see “Roles and Associated Attributes.”

The value of the role attribute is a nonlocalized string. An assistive application can programmatically test the value of the role attribute to find out what type of user interface object the accessibility object represents.

In AXRoleConstants.h (in the HIServices framework), Mac OS X defines a set of standard roles that describe the vast majority of user interface object types. Although it may be tempting to define new roles for custom objects in your application, doing so is not recommended. An assistive application may not know how to handle an accessibility object with an arbitrary role, and additional roles add unnecessary complexity to your code. Instead, you should examine the behavior of your objects and choose the standard role that best represents them.

The role description attribute contains a human-intelligible, localized string that names the accessibility object’s role. An assistive application presents this string to the user (a screen reader application, for example, speaks the string). Mac OS X defines a role description string for each role in AXRoleConstants.h, so you do not have to provide strings such as “button” or “window”. In the very unlikely event that your application needs to define a new role, however, you are responsible for providing the value of the role description attribute.
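For example, an assistive application might test the role and fetch the localized role description like this (a sketch using the AXUIElement C API; the helper name is illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: test an accessibility object's role programmatically and
       fetch its localized, human-intelligible role description. */
    static Boolean ElementIsButton(AXUIElementRef element)
    {
        Boolean isButton = false;
        CFTypeRef role = NULL;

        if (AXUIElementCopyAttributeValue(element, kAXRoleAttribute,
                                          &role) == kAXErrorSuccess) {
            /* The role is a nonlocalized string such as "AXButton". */
            isButton = CFEqual(role, kAXButtonRole);
            CFRelease(role);
        }

        CFTypeRef roleDescription = NULL;
        if (AXUIElementCopyAttributeValue(element, kAXRoleDescriptionAttribute,
                                          &roleDescription) == kAXErrorSuccess) {
            CFShow(roleDescription);    /* localized, for example "button" */
            CFRelease(roleDescription);
        }
        return isButton;
    }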

The Description Attribute

The description attribute is almost as important as the role attribute. The value of the description attribute is a human-intelligible, localized string that describes what the object does. Because it describes the application-specific function of a user interface object, the accessibility protocol cannot supply the description value. Therefore, it is essential to provide a description for all accessibility objects in your application that do not already include a title attribute (described in “Title Attributes”).

To see why the description attribute is so important, suppose your application window contains a button that left-justifies text and displays a left-pointing arrow. An assistive application can accurately identify this control as a button because it is represented by an accessibility object with the role attribute “button”. However, unless you provide an appropriate description, the assistive application has no way to know that this button left-justifies text, and therefore no way to communicate this to the user.
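In a Carbon application, one way to supply such a description for a custom HIView is the auxiliary-attribute call introduced in Mac OS X v10.4; Cocoa offers an analogous override mechanism. The following is a sketch only, and the localized string shown is illustrative:

    #include <Carbon/Carbon.h>

    /* Sketch: attach a description attribute to a custom left-justify
       button without writing a custom accessibility event handler. */
    static OSStatus DescribeLeftJustifyButton(HIViewRef leftJustifyButton)
    {
        /* A human-intelligible, localized string; in a real application
           it would come from the application's strings files. */
        CFStringRef description =
            CFCopyLocalizedString(CFSTR("left justify"), CFSTR("button description"));

        OSStatus err = HIObjectSetAuxiliaryAccessibilityAttribute(
                           (HIObjectRef)leftJustifyButton,
                           0,                          /* 0 = the view itself, not a part */
                           kAXDescriptionAttribute,
                           description);
        CFRelease(description);
        return err;
    }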

Title Attributes

The title attribute is required for user interface objects that include a text title in their display. For example, the title of a button is the text that appears on the button, such as the text “OK” on an OK button, and the title of a window is the text that appears in its title bar.

A user cannot change the title of such an object directly, but the title might change programmatically if the state of the object changes. For example, a Connect button’s title might change to Disconnect after a connection is made, but not because a user chose to change the button’s title. An accessibility object that represents such an object must include the title attribute and the attribute’s value must be the title string.

Many applications display static text that serves as the title for a user interface object, but that is not contained in that object. An example is the word “Search” displayed below a search field or “Address:” displayed next to a set of editable text fields. To a sighted user, the proximity of the string to the object (or objects) it describes is usually enough to imply the relationship between them. To an assistive application, however, these strings are unrelated to the objects they describe (if they are visible to the assistive application at all).

Mac OS X version 10.4 introduced two related attributes that give assistive applications information about such titles. The TitleUIElement and ServesAsTitleForUIElements attributes allow you to define the relationship between a piece of static text and the object (or objects) it describes.

The TitleUIElement attribute belongs in the accessibility object representing the object being described. The value of this attribute is the accessibility object you create to represent the static text. The ServesAsTitleForUIElements attribute belongs in the static text accessibility object you’ve created and its value is an array containing an arbitrary number of accessibility objects. This allows you to link the static text title with any number of user interface objects. Although these attributes are not required, you should provide them if your user interface includes such static text titles.
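From the assistive application’s side, the TitleUIElement attribute is simply followed like any other reference. A sketch (helper name illustrative; the caller owns the returned string):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: follow a control's TitleUIElement attribute to the static
       text that titles it, then read that static text's value. */
    static CFStringRef CopyTitleTextForElement(AXUIElementRef element)
    {
        CFStringRef titleText = NULL;
        CFTypeRef titleElement = NULL;

        if (AXUIElementCopyAttributeValue(element, kAXTitleUIElementAttribute,
                                          &titleElement) == kAXErrorSuccess) {
            CFTypeRef value = NULL;
            /* A static text element stores its string in its value attribute. */
            if (AXUIElementCopyAttributeValue((AXUIElementRef)titleElement,
                                              kAXValueAttribute,
                                              &value) == kAXErrorSuccess) {
                titleText = (CFStringRef)value;     /* for example, "Search" */
            }
            CFRelease(titleElement);
        }
        return titleText;
    }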

Relationship Attributes

To participate in the accessibility hierarchy, an accessibility object must include links to its immediate ancestor and descendants (if any). This helps an assistive application traverse the hierarchy. In addition, an accessibility object can express other relationships, such as between views that affect each other, but that are not linked by containment.

All accessibility objects, with the exceptions of the application-level and system-wide accessibility objects, include a parent attribute. The value of this attribute is usually the accessibility object representing the closest accessible container of the user interface object.

If a user interface object contains other accessible user interface objects, the UIElement representing it must include the children attribute. The value of this attribute is an array containing the UIElements of the accessible descendants.
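Together, the parent and children attributes are all an assistive application needs to traverse the hierarchy. A sketch of a simple recursive walk (helper name illustrative):

    #include <stdio.h>
    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: walk the accessibility hierarchy by following each
       element's children attribute, printing an indented role for each. */
    static void WalkHierarchy(AXUIElementRef element, int depth)
    {
        CFTypeRef role = NULL;
        if (AXUIElementCopyAttributeValue(element, kAXRoleAttribute,
                                          &role) == kAXErrorSuccess) {
            printf("%*s", depth * 2, "");
            CFShow(role);
            CFRelease(role);
        }

        CFTypeRef children = NULL;
        if (AXUIElementCopyAttributeValue(element, kAXChildrenAttribute,
                                          &children) == kAXErrorSuccess) {
            CFIndex count = CFArrayGetCount((CFArrayRef)children);
            for (CFIndex i = 0; i < count; i++) {
                AXUIElementRef child =
                    (AXUIElementRef)CFArrayGetValueAtIndex((CFArrayRef)children, i);
                WalkHierarchy(child, depth + 1);
            }
            CFRelease(children);
        }
    }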

Some relationships are conceptual rather than containment-based. For example, it’s not unusual for an application to display two separate views of the same or related content. One example is the Mac OS X Mail application. The Mail application’s upper view can display the subject, sender, and date received of each message in the selected mailbox. In the lower view, Mail can display the body of a message selected in the upper view. To a sighted user, the relationship between the selected message in the upper view and the message content in the lower view is apparent. To an assistive application, on the other hand, this direct relationship doesn’t exist. If an assistive application can’t express such a relationship to a blind user, for example, the user can’t jump back and forth between the related elements the way a sighted user can. Instead, such a user might have to step through all intervening controls and views to move between the message description and its content.

Mac OS X version 10.4 introduced the LinkedUIElements attribute to allow you to define such relationships. As you would expect, the UIElement of each related object should contain this attribute. The value of the attribute is an array of UIElements so you can specify one-to-one and one-to-many relationships.

Value Attributes

The optional value attribute describes the accessibility object’s state. The value might be the state of a check box or the contents of an editable text field.

The value attribute is often settable. For example, an accessibility object that represents a user-modifiable object, such as an editable text field, has a settable value attribute. This allows an assistive application to set the value of the accessibility object’s value attribute to contain the user’s input. Optionally, accessibility objects may also include attributes that define a range or set of values the object can accept, such as minimum and maximum values for a slider.
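A sketch of how an assistive application might place dictated text into an editable text field by setting its value attribute (helper name illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: set an editable text field's value attribute, first
       confirming that the attribute is settable. */
    static Boolean SetTextFieldValue(AXUIElementRef textField, CFStringRef text)
    {
        Boolean settable = false;
        if (AXUIElementIsAttributeSettable(textField, kAXValueAttribute,
                                           &settable) != kAXErrorSuccess || !settable) {
            return false;       /* the field does not accept a new value */
        }
        return AXUIElementSetAttributeValue(textField, kAXValueAttribute,
                                            text) == kAXErrorSuccess;
    }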

Actions

Technically, an action is an attribute of an accessibility object. However, the accessibility protocol supports actions differently from the way it supports attributes, so this chapter describes them separately.

An accessibility object can support one or more actions. An action describes how a user interacts with the user interface object the accessibility object represents. It does not describe the application-defined function of the object. This is because an assistive application is concerned with driving the user interface and the results of an action are irrelevant to it. If your application displays a print button, for example, the button’s accessibility object supports a press action, not a print action.

Because actions are generic and refer to the capabilities of user interface objects, there are only a few of them. This means that the set of actions an assistive application has to understand is small and well-defined.

In AXActionConstants.h (located in the HIServices framework), Mac OS X defines seven actions an accessibility object can support:

When a user performs an action, an assistive application sends a message to the accessibility object, requesting it to perform the action. Your application should invoke the same code to carry out this action as it does when the request comes directly from your user interface.

Each action attribute has a description property. An assistive application may speak this description to tell the user what action is available for a specific object. The value of this description property is similar to the value of the role description attribute in that it is a human-intelligible, localized word or short phrase. Unlike the role description, however, the action description is not automatically supplied by the accessibility protocol. If your application creates its own accessibility objects which support actions, you must supply the appropriate action descriptions.
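A sketch of how an assistive application might enumerate an element’s actions and their localized descriptions, which a screen reader could then speak (helper name illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: list an accessibility object's actions and fetch the
       localized description of each one. */
    static void SpeakAvailableActions(AXUIElementRef element)
    {
        CFArrayRef actionNames = NULL;
        if (AXUIElementCopyActionNames(element, &actionNames) != kAXErrorSuccess)
            return;

        CFIndex count = CFArrayGetCount(actionNames);
        for (CFIndex i = 0; i < count; i++) {
            CFStringRef action = (CFStringRef)CFArrayGetValueAtIndex(actionNames, i);
            CFStringRef description = NULL;
            if (AXUIElementCopyActionDescription(element, action,
                                                 &description) == kAXErrorSuccess) {
                CFShow(description);    /* for example, "press" */
                CFRelease(description);
            }
        }
        CFRelease(actionNames);
    }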

Communication With Accessibility Objects

At the heart of accessibility is the communication between an assistive application and the accessibility objects that represent your application’s user interface. This communication can be divided into two categories:

If you use only standard, noncustom Cocoa or Carbon objects in your application, most of this communication is transparent to you. In some cases, you might have to create a custom response to a message, but this is unlikely. It is even less likely that you will have to handle notifications if you use only standard objects.

Messages

An assistive application communicates with your application by sending messages to accessibility objects. In Carbon, these messages are transmitted as Carbon events. In Cocoa, these messages result in calls to methods of the NSAccessibility protocol. For more details about the framework-specific implementation of messages, see Accessibility Programming Guidelines for Cocoa and Accessibility Programming Guidelines for Carbon.

In HIObject.h, Mac OS X defines a handful of messages. The following lists the types of messages an assistive application can send to an accessibility object:

Like the set of actions, the set of messages to which an accessibility object can respond is small. These messages give the assistive application a great deal of control, however. For example, by getting and setting attributes, an assistive application can do things like the following:
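One concrete illustration: by setting a window’s position attribute, an assistive application can move the window on screen. The sketch below assumes the AXUIElement C API and an existing window accessibility object; the helper name is illustrative:

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: move a window by setting its position attribute. The new
       origin is packaged as an AXValue wrapping a CGPoint. */
    static Boolean MoveWindow(AXUIElementRef window, CGPoint newOrigin)
    {
        AXValueRef position = AXValueCreate(kAXValueCGPointType, &newOrigin);
        if (position == NULL)
            return false;

        Boolean moved = (AXUIElementSetAttributeValue(window,
                                                      kAXPositionAttribute,
                                                      position) == kAXErrorSuccess);
        CFRelease(position);
        return moved;
    }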

Notifications

In addition to responding to messages from assistive applications, accessibility objects also broadcast any significant changes that occur in the user interface objects they represent. For example, if the keyboard focus changes to a new text field, a new window becomes active, or a control’s title changes, the accessibility objects for these objects send out notifications.

An assistive application chooses to register for the notifications it is interested in.

An accessibility object can send notifications to indicate any of the following status changes:

Unless you are creating accessibility objects to represent custom user interface objects, it is unlikely you will have to write code to send notifications. Both Carbon and Cocoa automatically broadcast the appropriate notifications for standard objects.
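From the assistive application’s side, notifications are received through an AXObserver attached to the run loop. A sketch, assuming you already have the target application’s process ID and application-level accessibility object (function names are illustrative):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: callback invoked whenever an observed notification arrives. */
    static void NotificationArrived(AXObserverRef observer, AXUIElementRef element,
                                    CFStringRef notification, void *refcon)
    {
        CFShow(notification);           /* for example, "AXWindowCreated" */
    }

    /* Sketch: register for window-created and value-changed notifications. */
    static AXObserverRef StartObserving(pid_t appPID, AXUIElementRef appElement)
    {
        AXObserverRef observer = NULL;
        if (AXObserverCreate(appPID, NotificationArrived, &observer) != kAXErrorSuccess)
            return NULL;

        AXObserverAddNotification(observer, appElement,
                                  kAXWindowCreatedNotification, NULL);
        AXObserverAddNotification(observer, appElement,
                                  kAXValueChangedNotification, NULL);

        /* Notifications are delivered through the current run loop. */
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer),
                           kCFRunLoopDefaultMode);
        return observer;
    }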

Hit-Testing and Keyboard Focus

To a sighted user, the location of the cursor is easy to discern. Similarly, a sighted user can usually tell which object in the user interface has keyboard focus. An assistive application, on the other hand, must query an application to determine which object has keyboard focus or is under the mouse pointer. An accessibility object provides answers to these queries by returning the values of various attributes.

Although you implement hit-testing differently depending on whether you’re using Carbon or Cocoa, the basic procedure is for an assistive application to ask the application to return the accessibility object under the cursor. The request is recursively passed down the application’s accessibility hierarchy until it reaches the deepest, unignored accessibility object that contains the mouse pointer.
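From the assistive application’s side, that whole recursive descent is wrapped in a single call. A sketch (helper name illustrative; the caller releases the result):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: ask an application for the deepest unignored accessibility
       object under a point in screen coordinates. */
    static AXUIElementRef CopyElementUnderPoint(AXUIElementRef appElement,
                                                float x, float y)
    {
        AXUIElementRef hitElement = NULL;
        if (AXUIElementCopyElementAtPosition(appElement, x, y,
                                             &hitElement) != kAXErrorSuccess) {
            return NULL;
        }
        return hitElement;
    }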

Accessibility objects also must support queries regarding keyboard focus. An accessibility object stores focus information in its focused attribute. The initial query from the assistive application is for the focused attribute of the application-level accessibility object. This query, too, is passed down the application’s accessibility hierarchy until it reaches the deepest, unignored accessibility object whose focused attribute is true. As with hit-testing, how the keyboard focus is actually determined from lower-level objects varies depending on whether you use Carbon or Cocoa.
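A corresponding sketch for keyboard focus, asking an application-level accessibility object for the element that currently has focus (helper name illustrative; the caller releases the result):

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: find the accessibility object that currently has keyboard
       focus within a given application. */
    static AXUIElementRef CopyFocusedElement(AXUIElementRef appElement)
    {
        CFTypeRef focused = NULL;
        if (AXUIElementCopyAttributeValue(appElement,
                                          kAXFocusedUIElementAttribute,
                                          &focused) != kAXErrorSuccess) {
            return NULL;
        }
        return (AXUIElementRef)focused;
    }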

An Example of Accessibility

This example gives a detailed description of how a fictitious screen reader with speech recognition and speech synthesis capability might communicate with your application (a code sketch of the first two steps follows the numbered list):

  1. The user says, “Open Preferences window.”

  2. The screen reader sends a message to the application accessibility object, asking for its menu bar attribute, which is a reference to the menu bar accessibility object. It then queries the menu bar for a list of its children, and queries each child for its title attribute until it finds the one whose title is the application’s name (that is, the application menu). A second iteration lets it find the Preferences menu item within the application menu. Finally, the screen reader tells the Preferences menu item to perform the press action.

  3. The application opens the Preferences window and then the window sends a notification broadcasting that a new window is now visible and active.

  4. The screen reader, assuming that it registered to be notified when a new window opens, queries the window for a list of its attributes. Assuming that the window accessibility object contains a children attribute, it then queries the accessibility object for the value of its children attribute.

  5. To each child of the window accessibility object, the screen reader sends a query asking for a list of its attributes. It then queries the child for the values of its role, role description, and (if it exists) children attributes.

  6. Among the responses, the screen reader learns that the pane contains several children (for example, three checkboxes).

  7. The screen reader queries each checkbox, asking for the values of the following attributes:

    • role (checkBox in this case)

    • role description (“checkbox”)

    • value (checked or unchecked)

    • children (none in this case)

  8. The screen reader, having learned what objects (controls in this case) are accessible in the window, reports this information to the user using speech synthesis.

  9. The user might then ask for more information about one of the checkboxes.

  10. The screen reader queries the specified checkbox, asking for the value of its help attribute (assuming it exists). It reports this string to the user using speech synthesis.

  11. The user then tells the screen reader to check the checkbox.

  12. The screen reader sends a message requesting that the checkbox’s value attribute be set to 1.

  13. The checkbox accessibility object broadcasts that the value of its value attribute has changed.
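The following sketch shows roughly what steps 1 and 2 look like when written against the AXUIElement C API. The helper names and the “Preferences” title matching are illustrative; a real screen reader would localize and generalize this logic:

    #include <ApplicationServices/ApplicationServices.h>

    /* Sketch: return the first child of an element whose title begins
       with the given prefix. The caller releases the result. */
    static AXUIElementRef CopyChildWithTitlePrefix(AXUIElementRef parent,
                                                   CFStringRef prefix)
    {
        AXUIElementRef match = NULL;
        CFTypeRef children = NULL;
        if (AXUIElementCopyAttributeValue(parent, kAXChildrenAttribute,
                                          &children) != kAXErrorSuccess)
            return NULL;

        CFIndex count = CFArrayGetCount((CFArrayRef)children);
        for (CFIndex i = 0; i < count && match == NULL; i++) {
            AXUIElementRef child =
                (AXUIElementRef)CFArrayGetValueAtIndex((CFArrayRef)children, i);
            CFTypeRef title = NULL;
            if (AXUIElementCopyAttributeValue(child, kAXTitleAttribute,
                                              &title) == kAXErrorSuccess) {
                if (CFStringHasPrefix((CFStringRef)title, prefix))
                    match = (AXUIElementRef)CFRetain(child);
                CFRelease(title);
            }
        }
        CFRelease(children);
        return match;
    }

    /* Sketch of steps 1 and 2: find the application menu in the menu bar,
       find its Preferences item, and tell that item to perform the press
       action. */
    static void OpenPreferences(AXUIElementRef appElement, CFStringRef appName)
    {
        CFTypeRef menuBar = NULL;
        if (AXUIElementCopyAttributeValue(appElement, kAXMenuBarAttribute,
                                          &menuBar) != kAXErrorSuccess)
            return;

        /* The application menu's menu bar item is titled with the
           application's name. */
        AXUIElementRef appMenuItem =
            CopyChildWithTitlePrefix((AXUIElementRef)menuBar, appName);
        CFRelease(menuBar);
        if (appMenuItem == NULL)
            return;

        /* The menu bar item's single child is the menu itself; search it
           for the Preferences item. */
        CFTypeRef menus = NULL;
        if (AXUIElementCopyAttributeValue(appMenuItem, kAXChildrenAttribute,
                                          &menus) == kAXErrorSuccess &&
            CFArrayGetCount((CFArrayRef)menus) > 0) {
            AXUIElementRef menu =
                (AXUIElementRef)CFArrayGetValueAtIndex((CFArrayRef)menus, 0);
            AXUIElementRef prefsItem =
                CopyChildWithTitlePrefix(menu, CFSTR("Preferences"));
            if (prefsItem != NULL) {
                AXUIElementPerformAction(prefsItem, kAXPressAction);
                CFRelease(prefsItem);
            }
        }
        if (menus != NULL)
            CFRelease(menus);
        CFRelease(appMenuItem);
    }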




Last updated: 2008-03-11
