An example of a poor operator/computer interface can be observed in modern fighter aircraft. In these systems the evolution of computing and display technologies has resulted in the so-called "glass cockpit". In an attempt to convey more information to the pilot, conventional "round dial" instruments have been replaced by cathode-ray tubes and/or other electronic display devices which present alphanumerics and occasionally some graphics to a highly trained human operator. Generally, these electronic cockpits can display more data to the operator than the operator is able to assimilate (i.e. more data but not necessarily more information)! Data or symbols presented on these displays are highly codified, requiring their meaning to be learned and responses trained at higher cognitive levels. In essence, these panel-mounted displays (and even narrow field-of-view head-up displays) provide only two dimensional "peep holes" into a three dimensional world. The failure of these present day attempts at information portrayal is especially evident during high stress conditions when pilots complain that their "brains tend to ooze out of their fingertips." The two main problems here are: one, the lack of a suitable portrayal mechanism for transferring information to the three dimensional world of the operator; and two, an inadequate means for facilitating a natural psychomotor control input to the machine.
One design of a virtual cockpit system concept is shown diagrammatically in Figure 1. As in conventional cockpits, information regarding the status, location and orientation of the ownship aircraft and of other outside aircraft and targets is decoded from a digital bus originating in the avionics, flight control, communications and weapon subsystems. From this information, electronic images,
instead of being presented on a panel display as in current systems, are generated on miniature cathode-ray tubes mounted on the helmet and projected through binocular optics into the pilot's visual field. A magnetic helmet orientation/position tracking subsystem tells the computer where to locate the visual information within the projected visual scene based upon where the pilot has directed his head/helmet line-of-sight at any given time. In this way a global stereographic electronic world can be presented which is registered with and can overlay the real world. Hence, information relevant to the states of the aircraft and environment is represented with spatially correct formats and in the proper locations (as they would appear in the real world), allowing the pilot to easily form a "gestalt" of this world that matches his normal interaction with it. Figure 2 shows one headgear design incorporating helmet-mounted optics, CRTs, a magnetic tracking sensor and chemical protective equipment. The transparent visor allows the pilot to view the external environment during normal day operations with the images from the CRTs superimposed over the world. During night operations the display scene will be synthesized from sensor and data base information.
A pictorial representation of a possible "virtual cockpit" scene is shown in Figure 3. Depicted here is the visual scene which a pilot might view when flying a low altitude mission at night. The scene is projected to the pilot in three dimensions and overlays the real world with one-to-one spatial registration. (Under daylight conditions the scene would be largely transparent, with only portions of the symbology superimposed over the real world.) The instantaneous size and location of the "virtual window" into the computer-generated world is equivalent to the field-of-view of the binocular optics and may be as great as 140 degrees horizontally by 60 degrees vertically. When the instantaneous display scene is driven by head/helmet movement, the entire 4 pi steradians field-of-regard is available to the operator.
In the "virtual cockpit", information from sensors, threat warning systems, terrain maps, and weapon delivery envelopes is organized and presented so that the spatial meaning of these artifacts is conveyed effectively. The pilot also interacts with the display spatially, pointing his head/helmet or hand/finger-mounted sensors at objects in the display and giving verbal commands. Functions can also be activated by merely looking at a displayed switch and saying "select", "on", "off", "go there" or "stop here". Hand orientation, once the hand is placed in predetermined locations within the cockpit, is sensed and used to command system functions. The visual display is augmented with both binaural auditory and tactile displays. The auditory display gives the pilot three dimensional sound which provides localization cues to the directions of different targets, aircraft, threats, etc. Warning signals are perceived as coming from a particular point in the cockpit. A synthesized speaker can even whisper in his ear information which cannot be ignored. Tactile displays give the operator "touch" feedback that a virtual switch has indeed been pushed.
The advantages of such a virtual cockpit are enormous! Indeed, the display medium truly offers a port through which the computer or avionics can communicate spatial awareness to the operator in all directions and in three dimensions. The pilot is able to interact using natural psychomotor skills, easily providing directional commands to the machine. The configuration of the cockpit is governed mainly by software rather than hardware restrictions, so that the cockpit can be dynamically reconfigured for a different mission, to suit pilot preference, or as upgrades to the avionics are installed. Another advantage of the virtual cockpit is that the pilot can be fully encapsulated for protection in a hazardous environment; likewise, the pilot need not be present in the actual vehicle which he is piloting, since with the appropriate data links a "remote" virtual cockpit would provide the visual, auditory and tactile "telepresence" cues as if he were located in the vehicle.
In 1977 AAMRL began a project to develop a fixed-base virtual space simulator to investigate the airborne virtual cockpit concepts described above and their associated technologies [4]. Designated the "Visually-Coupled Airborne Systems Simulator" or "VCASS", this revolutionary approach to simulation was achieved by significantly extending the performance of previously developed helmet-mounted tracking and
display systems [1,5,6].
The functional operation of the VCASS is shown in Figure 4. The heart of the system is a special headgear assembly which includes two miniature cathode-ray tube (CRT) image sources, virtual image projection optics and a helmet attitude/position sensor. Computer-generated video and line graphics are displayed on the two CRTs (one for each eye). The diameter of the phosphor quality area of the CRT faceplate is 19mm with an overall diameter of 27mm including the mu-metal shield. Each tube weighs 135 grams and has a length of 125mm. The CRT is magnetically deflected, producing an inscribed raster with a 4 x 3 aspect ratio and a nominal spot size of 20 microns (half power) at a beam current of 40 microamps. Raster line rates of up to 1225 lines at a 60 Hz field rate with two-to-one interlace can be delivered on the CRT. Line graphics can also be drawn on the CRT either independently or in conjunction with the raster graphics. The phosphor screen consists of a small grain size P53 YAG-type phosphor which produces luminances greater than 1500 foot-Lamberts through a fiber optics faceplate (at a 60 cps refresh rate and a 20,000 in/s writing rate) [7]. The fiber optics faceplate corrects for deflection defocusing in the off-axis spot and improves contrast, especially at high luminances. The external surface of the fiber optics faceplate is curved to eliminate residual field curvature in the relay lens of the infinity optical system.
The two CRT images are relayed to the operator's visual field by an infinity binocular optical system [8]. Each ocular magnifies, collimates and projects the real image from each CRT so that it appears to the operator to subtend a visual angle of 60 degrees vertical by 80 degrees horizontal. The optics form a 15mm diameter pupil at an eye relief of 39mm. Interpupillary distance between the two oculars can be adjusted between 62-72mm. The optical axis of each ocular is rotated inward so as to produce a binocular overlap region which is adjustable between 20 to 60 degrees. (Normally the display optics are set for an instantaneous field-of-view of 120 degrees horizontal by 60 degrees vertical with 40 degrees of overlap.) Within the overlap region information can be presented in three dimensions using artificially generated retinal disparity to produce a sensation of depth.
The optical system transmits a maximum of 0.86 percent of the luminance of the CRT to the eye of the operator when using an opaque beamsplitter. Alternative beamsplitter elements provide ambient transmissions of 1.1, 2.3 and 4.9 percent with corresponding CRT-to-eye transmittances of 0.75, 0.6 and 0.3 percent respectively. When operated with the CRT at full brightness and using the opaque beamsplitter element, approximately 16 foot-Lamberts reaches the eye. The minimum pixel size for full CRT luminance with a line width of 20 microns and 20 percent response subtends an angle of about 3.75 minutes-of-arc at the eye.
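As a quick consistency check (the arithmetic below is ours, implied by the figures quoted above, not a calculation from the article): roughly 16 foot-Lamberts at the eye through a 0.86 percent path implies a CRT faceplate luminance of about 1860 foot-Lamberts, in line with the "greater than 1500 foot-Lamberts" specification of the image source.

```python
# Implied CRT faceplate luminance from the quoted display figures:
# ~16 foot-Lamberts at the eye through a 0.86% CRT-to-eye transmittance.
eye_luminance_fL = 16.0
transmittance = 0.0086
crt_luminance_fL = eye_luminance_fL / transmittance
print(round(crt_luminance_fL))  # ~1860, consistent with the >1500 fL spec
```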
The instantaneous orientation and position (i.e. rotation and translation) of the helmet and optics are measured using an electromagnetic helmet orientation/tracking system [9]. A radiator assembly located in the cockpit generates a pulsed 3-axis oscillating magnetic field with each axis being excited sequentially. The sensor assembly located on the helmet transduces these fields for processing by a DEC PDP 11/55 CPU with high speed bipolar memory. The computer outputs the azimuth, elevation, roll and x,y,z location of the helmet sensor with a 14 bit precision. The helmet orientation is measured with a precision of 1.5 arc minutes. The static line-of-sight accuracy is 0.18 degrees (50 percent circular error) over the limits of a 6.2 cubic foot motion box for orientation angles limited to +/- 60 degrees in azimuth/elevation and +/- 30 degrees in roll. Over the same motion box with maximum orientation angles increased to +/- 145 degrees in azimuth, +/- 80 degrees in elevation, and +/- 45 degrees in roll, a static line-of-sight accuracy of 0.22 degrees is realized. The maximum update rate of the system is approximately 90 Hz. All head orientation angles (i.e. +/- 180 degrees azimuth, +/- 90 degrees elevation and +/- 180 degrees in roll) can be measured within a head motion volume of 6.25 cubic feet. In addition to helmet orientation/position, a second (and perhaps third) transducer can be used to measure the location and orientation of the operator's hand or finger serving as a 3D mouse to interact with the display.
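A back-of-envelope check of the quoted tracker figures (this sketch is ours, not the article's): a 14-bit output spanning a full 360-degree range resolves about 1.32 arc minutes per count, consistent with the stated 1.5 arc minute orientation precision.

```python
# Angular resolution of a 14-bit tracker output over a 360-degree range.
counts = 2 ** 14                       # 16384 discrete output levels
deg_per_count = 360.0 / counts         # ~0.022 degrees per count
arcmin_per_count = deg_per_count * 60  # ~1.32 arc minutes per count
print(f"{arcmin_per_count:.2f} arc minutes per count")
```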
Figure 5 shows a pilot wearing the laboratory VCASS headgear. The combined weight of two CRTs, optics, magnetic sensor, HGU 26A/P helmet and 1 foot of unsupported cable is 5.86 pounds (94 ounces). The center
of gravity as referenced to the centerline of the helmet is +11.63cm in the X axis and -3.0 cm in the Y axis. The display is symmetrical in the Z axis. A negator spring assembly (not shown in the photograph) is used to counterbalance the rotational moment in the pitch axis produced on the head by the offset center of gravity.
The instantaneous orientation of the helmet and the attitude and location of the aircraft in three dimensional space are used by the graphics generator to create images on the CRT which appear to the operator to be stabilized in space. Currently the VCASS uses two Evans & Sutherland Picture System II's with dual DMA interfaces as line graphics generators (one for each eye) to feed the display electronics. Each picture system is hosted by a DEC PDP 11/55 CPU with high speed bipolar memory. The graphics processing system combines the instantaneous orientation of the helmet (and optics) with the location, altitude and attitude of the simulated vehicle to produce a virtual graphical world for the operator. Depending on the application, the individual graphics presented on the display may be stabilized in one of four ways:
1) head stabilized: for head aiming weapons or selecting functions from cockpit stabilized switches;
2) cockpit stabilized: virtual rendering of cockpit instruments or switching panels, weapons stores, etc.;
3) earth stabilized: features which are superimposed and stable relative to earth coordinates such as navigation waypoints, target locations, etc.;
4) space stabilized: other aircraft, in-flight missiles or compass heading information.
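The four stabilization modes amount to choosing which reference frame a symbol is defined in before transforming it into head (display) coordinates. The sketch below illustrates the idea with yaw-only rotation matrices; the axis conventions, function names and angles are illustrative assumptions, not details of the VCASS implementation.

```python
import math

def rot_z(deg):
    """Rotation matrix about the vertical (yaw) axis."""
    a = math.radians(deg)
    return [[math.cos(a), -math.sin(a), 0.0],
            [math.sin(a),  math.cos(a), 0.0],
            [0.0, 0.0, 1.0]]

def apply(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(R):
    """Inverse of a rotation matrix."""
    return [[R[j][i] for j in range(3)] for i in range(3)]

# Hypothetical yaw-only state: vehicle heading 90 deg, head turned 30 deg.
vehicle_to_earth = rot_z(90.0)  # maps vehicle-frame vectors to earth frame
head_to_cockpit = rot_z(30.0)   # maps head-frame vectors to cockpit frame

def to_display(p, frame):
    """Express a symbol position p in head (display) coordinates,
    given the reference frame in which the symbol is defined."""
    if frame == "head":      # head stabilized: e.g. an aiming reticle
        return p
    if frame == "cockpit":   # cockpit stabilized: virtual instrument panels
        return apply(transpose(head_to_cockpit), p)
    if frame == "earth":     # earth stabilized: waypoints, target locations
        p_cockpit = apply(transpose(vehicle_to_earth), p)
        return apply(transpose(head_to_cockpit), p_cockpit)
    raise ValueError(frame)
```

Space-stabilized symbols would use yet another fixed frame in the same pattern; a real implementation would of course compose full three-axis rotations (e.g. quaternions) rather than yaw only.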
A total of eight processors are used in parallel to support the real-time operation of the VCASS. In addition to the three PDP 11/55 CPUs mentioned above, the following CPUs are used:
PDP 11/785 -- master control and data acquisition
PDP 11/750 -- vehicle plant dynamics
PDP 11/750 -- weapons plant dynamics
PDP 11/750 -- threat dynamics
PDP 11/34 -- cockpit support
Each processor communicates via a custom designed shared memory system with eight ports. In the current configuration the system update rate is 15 Hz. This update rate is constrained primarily by the speed at which the graphics processors can generate the line graphics terrain scene and cockpit display presentations.
The total effect of the equipment above is to produce a panoramic portrayal of graphics information which "mimes" the world. The entire sphere of information is then available to the operator by moving the instantaneous window subtended by the display optics.
Interface Modalities
In addition to the generation of the display presentation described above, the VCASS also provides several control interfaces which are manipulated relative to the virtual scenes.
1) head-aimed control -- pilot positions head-stabilized reticle over virtual switch box and presses enabling switch to activate function.
2) voice actuated control -- pilot speaks control command into microphone to activate system function, or alternatively, pilot places head-stabilized reticle over switch or target and gives a specific selection from a menu of verbal enabling commands (e.g. select, mark, zoom, lock-on etc.).
3) touch sensitive panel -- pilot places finger on touch panel area to call up special virtual switch panels. Depending upon selection, touch panel regions are redefined for different additional panels which may be called up and windowed within the display.
4) virtual hand controller -- in this mode the pilot moves his hand in three-dimensions. The magnetic tracker senses hand position and orientation with six degrees of freedom. When the hand is put into a pre-determined volume or region within the cockpit, a three dimensional virtual control panel is windowed into the visual display. The pilot then activates functions or makes vernier adjustments by moving his hand or placing a finger over the virtual switch. Auditory, visual and/or tactile (see below) feedback is given to indicate completed control action.
5) eye control system -- Although currently not installed in the VCASS, an eye position tracking system will be incorporated into the headgear which measures the instantaneous orientation of the eye relative to the helmet. When summed with the helmet orientation (obtained by the magnetic tracking system described above), eye fixation angles relative to the virtual display are known. With this system it becomes possible for the pilot to "look" at virtual or real switches in order to activate them, or to designate targets or features in the outside world for system input.
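In its simplest form, the summation described above adds eye-in-head angles to helmet angles to yield the gaze direction in cockpit coordinates. The sketch below uses a small-angle approximation we introduce purely for illustration; a full implementation would compose the two rotations.

```python
# Gaze direction in cockpit coordinates: helmet angles (from the magnetic
# tracker) plus eye-in-head angles (from the eye tracker). Small-angle
# approximation for illustration only.
def gaze_in_cockpit(helmet_az, helmet_el, eye_az, eye_el):
    """All angles in degrees; returns (azimuth, elevation)."""
    return helmet_az + eye_az, helmet_el + eye_el

az, el = gaze_in_cockpit(helmet_az=20.0, helmet_el=-5.0, eye_az=3.0, eye_el=2.0)
print(az, el)  # 23.0 -3.0
```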
In addition to the control and display interfaces described above, two other display systems are being developed for incorporation into the VCASS. The first is a tactile display system which provides to the operator a stimulus to his hand and/or fingers which signifies that he has touched or activated a virtual control panel. The second display system provides a three dimensional auditory surround to the panoramic visual display. Using simulated auditory localization techniques, the sound display will convey to the operator directional information regarding events which are taking place outside the simulated vehicle (e.g. threat warning) or within the vehicle (e.g. aircraft health monitoring). An auditory "ambience" display will also provide a background surround of vehicle state (velocity, ground rush, wind rush, etc.) to convey subliminally various states of the aircraft.
algorithms and switching modalities. Ground and airborne threats and weapon dynamics were programmed using a family of software modules and structured programming techniques. Preliminary operator interface investigations were made leading to procedures for alignment of the display oculars and operator adjustment of stereo viewing distance. Following these preliminary investigations, several piloted experiments were conducted to assess the configuration and utility of virtual cockpit principles.
From the investigations which have been performed on the VCASS, the following observations can be made:
1) There was universal acceptance and endorsement of the simulated virtual cockpit designs by the crew members who participated in piloted simulation experiments. Generally, the virtual interfaces were viewed as being "intuitive" and easy to learn. In many cases, subjects were flying complex missions with only one hour of training on the system.
2) When given the opportunity to control cockpit functions with head position versus voice switching modalities, pilots generally preferred the voice switching approach. Shorter response times were observed, however, for those head switching functions which did not require the pilot to aim at a virtual switch located more than 10 degrees from his current tracking task.
3) When instantaneous horizontal field-of-view was manipulated in one experiment (i.e. 40 degrees monocular, 40 degrees binocular, 90 degrees binocular and 120 degrees binocular), pilots preferred the 120 degree display but indicated little difference between the 120 and 90 degree conditions. Pilots also rated the binocular 40-degree display at over a two-to-one advantage relative to the 40-degree monocular display. (Note: In each display condition the minimum angular resolution remained constant.)
4) The display update rate of 12 to 15 Hz using a line graphics presentation is usable but not desirable. An update rate of at least 20 Hz, and preferably 30 Hz, is desired to eliminate the apparent lag in the display image. Sensing head acceleration and using predictive algorithms to predetermine image position prior to update may help. The addition of a raster graphics scene background system to relieve the load on the line graphics system will significantly increase the update rate.
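One minimal form of the predictive approach suggested above is a constant-velocity extrapolation of each head angle one update interval ahead (a hypothetical sketch; the article does not specify an algorithm).

```python
# Constant-velocity predictor: extrapolate a head angle one update
# interval ahead to hide apparent display lag. Hypothetical sketch.
def predict(angle_now, angle_prev, dt, lead_time):
    """Angles in degrees, times in seconds."""
    velocity = (angle_now - angle_prev) / dt
    return angle_now + velocity * lead_time

# Head azimuth moving at 60 deg/s, sampled at the 15 Hz update rate;
# predict one frame (~67 ms) ahead.
dt = 1.0 / 15.0
predicted = predict(angle_now=10.0, angle_prev=6.0, dt=dt, lead_time=dt)
print(predicted)  # 14.0
```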
5) At least 14-bit precision is needed to provide adequate spatial stabilization and smoothly perceived movement of the display image in virtual space for a total field-of-regard of 4 pi steradians.
6) An exit pupil of 15 mm diameter is marginal for an ocular field-of-view of 80 degrees. Under these conditions, the eye translates within the pupil to view the extremes of the visual field. In order to see the total instantaneous FOV, helmet to head fit and adjustment are therefore critical, allowing for little slippage of the helmet on the head. Based upon these considerations an exit pupil of at least 19mm is needed for a field-of-view of 80 degrees.
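The geometry behind this finding can be sketched as follows, assuming a typical ~13 mm distance from the eye's center of rotation to its entrance pupil (our assumption, not a figure from the article): rotating the eye 40 degrees to fixate the field edge translates its pupil about 8.4 mm laterally, which exceeds the 7.5 mm half-width of a 15 mm exit pupil but fits within the 9.5 mm of a 19 mm pupil.

```python
import math

# Rough geometry behind the exit-pupil finding. The 13 mm radius from
# the eye's center of rotation to its pupil is an assumed typical value.
eye_rotation_radius_mm = 13.0
half_field_deg = 40.0  # half of the 80-degree ocular field

# Lateral travel of the eye's pupil when fixating the field edge:
travel_mm = eye_rotation_radius_mm * math.sin(math.radians(half_field_deg))
print(round(travel_mm, 1))  # ~8.4 mm, versus 7.5 mm half of a 15 mm pupil
```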
7) The eye relief of 39mm was found sufficient to accommodate all users with spectacles.
8) A "luning" effect was observed at the boundary of the binocular visual field. This can be reduced by thinning the opaque areas of the combiner housing. The correspondence of see-through and display field-of-view will also be improved with this modification.
9) Generally the resolution, luminance and overall field-of-view were perceived as being adequate for all simulations conducted.
10) The use of line graphics to simulate terrain features is undesirable due to the possible confounding of the cockpit displays with the outside terrain scene. This problem was reduced somewhat by displaying the cockpit scene at an apparent viewing distance different from the terrain using stereo disparity between the two oculars. It is recommended that a raster scene background system be used instead to generate the terrain display, while stroke graphics are retained to depict cockpit information.
11) The helmet weight was not determined to be a problem in ground- based simulation; however, the rotational moment of inertia of the headgear was noticed by most subjects. Although this was an artifact in the piloted experiments, subjects commented that large head movements were usually not necessary due to the wide instantaneous field-of-view presentation.
Many areas are yet to be explored while simulating in virtual space. Although described briefly above, the binaural auditory surround, eye position sensing, 3D virtual hand controller and tactile displays are yet to be incorporated into the VCASS. Further improvements in the resolution and luminance of the image source including the addition of color and a full raster image generation system (to enhance the line graphics system currently used) are considered to be high priority. Future experimental investigations will continue to explore the effectiveness of different interactive modalities including eye control, voice control, touch panel and virtual hand controller options. Eventually a design guide for information portrayal in three-dimensional virtual space will be developed.
Airborne Virtual Panoramic Display
Currently under development by the Laboratory is an airborne version of the VCASS. This development, termed the "Virtual Panoramic Display", makes possible the flight testing of a virtual cockpit. In this development, each of the VCASS components is being re-engineered for flight operation including the development of a new lightweight helmet, optics and associated components. The design goal for the total helmet weight is less than 3.5 pounds for a rotary-wing application [10]. A smaller field-of-view helmet-mounted head-up display weighing less than 2.5 pounds is being developed for fixed-wing fighter applications.
Virtual Terminal
In a separate effort, a portable or "micro" version of the VCASS is being developed. It is envisioned that this system will be packaged into a "briefcase" and eventually serve as a portable 3D virtual terminal for computer-aided design, instructional and command, control, communications applications. This terminal would provide a fully interactive "virtual computer space" wherein the human and the computer I/O live together. This space can be thought of as being like a visual (and auditory) "living room" in which computer-generated objects or symbols would appear (as 3D virtual images) to the operator as if they were physical realities. The operator can look around the room (e.g. the furniture, etc.) and reach out and "touch" these virtual objects. He can change this world by giving verbal commands while simultaneously touching or looking at the objects within the synthesized world. Eventually the operator will also be able to translate within the computer-generated world by physically walking around in the "virtual room".
Perhaps the main application of the virtual terminal technology would be its role in computer-aided design where existing two-dimensional design terminals are lacking. Using this system, the operator becomes part of a "designer's world" created in virtual space. Within this world he touches, changes, interacts and even operates computer-portrayed designs such as an automobile, aircraft cockpit or part to be machined before the design is committed to automatic drafting. In a see-through display presentation the virtual terminal could superimpose computer-generated design information which exactly overlays the space where a building or other object is being constructed or maintained. Another virtual terminal application would be as a portable command post or air-traffic control display which provides to the controller a 3D surround showing all air traffic in properly scaled spatial relationships. For other remote control applications it would be possible for the operator to "wear" his control station, with a wireless data link to the base station.
The virtual terminal also provides a means for rapidly communicating spatial information between the operator and an expert system or knowledge base. The knowledge bases or actions could be represented as spatial metaphors or icons which surround the operator, placing the full intellect of the intelligent machine literally at his "finger-tips".
virtual television. Regardless, the further development of these virtual interface technologies is key to harnessing the ultimate power of the combined intellects of the human and the computer.
2. Virtual cockpit's panoramic displays afford advanced mission capabilities, Aviation Week and Space Technology, January 14, 1985, 143-152.
3. Mills, R.B., CRTS give new look to cockpit of the future, Machine Design, June 6, 1985, 34-40.
4. Kocian, D.F., VCASS: an approach to visual simulation, Proceedings of the 1977 IMAGE Conference, Air Force Human Resources Laboratory, Williams AFB, AZ, May 1977.
5. Birt, J.A., and Task, H.L., Proceedings of a Symposium on Visually-Coupled Systems: Development and Application. Aerospace Medical Division Technical Report: AMD-TR-73-1, Brooks AFB, Texas, September 1973.
6. Birt, J.A., and Furness, T.A., Visually-coupled systems, Air University Review, 20 (3), 1974, 28-40.
7. Sanford, E., and Seats, P., Miniature CRT improvement study, Air Force Aerospace Medical Research Laboratory Technical Report: AFAMRL-TR-83-075, Wright-Patterson AFB, Ohio, September 1983.
8. Buchroeder, R., An optical analysis of the Farrand VCASS helmet- mounted display, Air Force Aerospace Medical Research Laboratory Technical Report: AFAMRL-TR-83-072, Wright-Patterson AFB, Ohio, October 1983.
9. Raab, F.H., Blood, E.B., Steiner, T.O., and Jones, H.R., Magnetic position and orientation tracking system, IEEE Transactions on Aerospace and Electronic Systems, 15 (5), September 1979, 709-718.
10. Furness, T.A., Virtual panoramic display for the LHX, Army Aviation, June 30, 1985, 63-66.