Introduction


In caring for patients, clinicians must rapidly and accurately integrate large, complex, and variable sets of information, including patient-specific data, general medical knowledge and literature references, and management guidelines. Current changes in the health care system push responsibility for this decision making onto less expert providers, who must function ever more efficiently. We believe that augmented reality (AR) interfaces will allow the creation of new integrated information system applications that empower clinicians at all levels to better navigate, analyze, and manage this welter of information.

In contrast to occlusive virtual reality (VR) environments, augmented reality interfaces consist of registered computer-generated imagery superimposed on the user's real environment [1, 2]. Like VR, AR offers many of the compelling benefits of advanced three-dimensional interfaces, such as new forms of data visualization across environment "landscapes", user "presence", and multi-modal interactivity. Mobile AR interfaces are becoming possible through emerging technologies such as the Virtual Retinal Display (VRD) [3], distributed network software architectures, multi-modal input systems, and wireless communication methods. Users will be able to move through both their physical workspace and a context-appropriate informational workspace. For ambulatory disciplines such as medicine, on which "furniture" paradigms ("desktop", "windows") have imposed severe limits, these capabilities promise to revolutionize application development [4].

However, if these "wearable augmented reality medical" (WARM) applications are to be more than renderings of conventional 2D screen displays hung in 3D environments, new representations of medical information and their "control surfaces" for interaction will have to be designed. Careful validation and verification methodologies will also have to be established for the development of these systems, to the extent that they manage critical patient data and fall under FDA regulation.

We report here our first empirical work in WARM system development at the University of Washington's Human Interface Technology Lab. For this study, electrocardiographic (ECG) data were chosen for several reasons. ECGs are ubiquitous in all but the most sedate clinical environments. Clinicians of many types bear responsibility for interpreting this information, but for the less experienced, that responsibility is at best a source of discomfort and at worst a source of significant error. ECG rhythms require varying levels of attention, from background surveillance when abnormalities are absent to close examination when problems arise; displays must make such attention shifts easy, since the speed and accuracy with which a clinician makes a management decision can have grave clinical consequences. Finally, ECGs are real-time data and pose important design challenges beyond those of static data such as laboratory results.
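To make the contrast with static data concrete, the following is a minimal sketch, in Python, of the kind of fixed-capacity sample buffer that a streaming rhythm display implies: the renderer must re-read the most recent window of samples on every frame, rather than fetching a value once. All names here (EcgStream, snapshot, the 250 Hz rate) are illustrative assumptions, not part of the system reported in this paper.

    from collections import deque

    class EcgStream:
        """Fixed-capacity buffer of recent ECG samples (illustrative only).

        A static datum, such as a lab value, is fetched once; a rhythm
        strip must instead be re-read continuously as samples arrive.
        """

        def __init__(self, sampling_rate_hz=250, window_seconds=4):
            # Keep only the most recent few seconds; older samples fall off.
            self.sampling_rate_hz = sampling_rate_hz
            self.samples = deque(maxlen=sampling_rate_hz * window_seconds)

        def append(self, sample_mv):
            # Called by the acquisition side as each sample arrives.
            self.samples.append(sample_mv)

        def snapshot(self):
            # Called once per rendered frame to redraw the waveform.
            return list(self.samples)

A display running at, say, 30 frames per second would call snapshot() on every frame and redraw the waveform; it is this continuous redrawing under shifting attention, rather than a one-time presentation, that distinguishes the design problem from displaying a laboratory result.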

We presume that WARM interface technologies can provide better decision support for clinicians. More specifically, we hypothesize that new 3D displays of ECG information, and new presentation modes for organizing those displays, can produce faster and more accurate clinical decisions. To test these hypotheses, we have created a pair of ECG interface objects; a set of presentation mechanisms by which the interface objects can be placed in the 3D environment; and a test methodology, sketched below, that uses task loading through a simulated confounder clinical task to evaluate the speed and accuracy with which clinicians can make decisions under the different study conditions.
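The sketch below illustrates the shape of such a test methodology in Python. It is a hypothetical rendering, not the actual study software: the condition names (display_a, display_b, fixed, head_tracked), the full crossing of conditions, and the respond callback standing in for the subject's interaction are all assumptions made for illustration.

    import random
    import time

    # Hypothetical condition names; the actual interface objects and
    # presentation mechanisms are described later in this paper.
    DISPLAYS = ["display_a", "display_b"]
    PRESENTATIONS = ["fixed", "head_tracked"]

    def run_trial(display, presentation, rhythm, respond):
        """Present one ECG rhythm under one condition, while the subject
        also performs the confounder task; record speed and accuracy."""
        start = time.monotonic()
        answer = respond(display, presentation, rhythm)  # subject's decision
        elapsed = time.monotonic() - start
        return {
            "display": display,
            "presentation": presentation,
            "rt_seconds": elapsed,
            "correct": answer == rhythm["diagnosis"],
        }

    def run_session(rhythms, respond):
        """Cross each display object with each presentation mode,
        presenting the rhythms in a fresh random order per condition."""
        results = []
        for display in DISPLAYS:
            for presentation in PRESENTATIONS:
                for rhythm in random.sample(rhythms, len(rhythms)):
                    results.append(run_trial(display, presentation,
                                             rhythm, respond))
        return results

Per-condition means of rt_seconds and correct would then support exactly the speed and accuracy comparisons that the hypotheses call for.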



