So much innovative technology has been developed for the Brain Opera
that it is impossible to mention all of it here. Brain Opera technology is a natural extension of the
Hyperinstruments project, started at the MIT Media Lab in 1986 by Tod Machover
and Joe Chung, and joined by Neil Gershenfeld in 1991 and myself in 1993. At
first designed to enhance the virtuosity of some of the world's
greatest performers, from Yo-Yo Ma to Prince, hyperinstruments
started evolving in 1991 towards the development of expressive
tools for non-professional musicians. The Brain Opera is the
culmination to date of this work, and points the way to the further
development of expressive objects (furniture, remote controls,
clothing, etc.) and responsive environments (including living rooms, concert
halls, and department stores).
Among the more significant new hardware developments for the
Brain Opera are the Harmonic Driving system, the Melody
Easel, the Rhythm Tree, the Gesture Wall, the Digital Baton, the Singing and Speaking Trees, and the Sensor Carpet. Among the project's
numerous software innovations are the Singing Trees (analysis of
every nuance and "feeling" of vocal quality); Harmonic Driving
(parametric algorithms that allow a piece of music to be shaped and
"personalized" while it is playing); the Rhythm Tree (which analyzes
multiple-person behavior to create a complex systemic reaction); the
Performance Hyperinstruments (which forge an array of continuous
gesture and discrete positional information into intuitive, natural
controls); and the entire Brain Opera system, which is itself a
complex networked environment capable of integrating new
elements into an existing structure automatically or in human-assisted fashion.
Below is a technical discussion of the developments behind each of the individual Brain Opera experiences:
In Harmonic Driving, the presence of a seated participant is detected when a light beam
pointed at the chair is interrupted. The user controls the experience
with a novel joystick made from a large, bendable spring. Two-axis
bending angles are measured using capacitive sensing to
detect the relative displacement between the spring's coils at its
midpoint. Twist is also measured with a potentiometer that rotates through
the relative angle between the top and bottom of the spring.
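As a rough illustration of how the raw joystick readings might be turned into control values, here is a minimal C++ sketch. The 10-bit ADC range, the mid-scale rest position, and all names are assumptions for demonstration, not the actual Brain Opera code.

```cpp
// Minimal sketch (not the original Brain Opera code): converting raw
// spring-joystick sensor readings into bend and twist control values.
// ADC ranges and scale factors here are illustrative assumptions.
#include <cstdio>

struct SpringJoystick {
    int capX;   // raw capacitive reading, X-axis coil displacement (0-1023)
    int capY;   // raw capacitive reading, Y-axis coil displacement (0-1023)
    int twist;  // raw potentiometer reading for twist (0-1023)
};

struct Gesture {
    float bendX;  // normalized bend angle, -1.0 .. +1.0
    float bendY;
    float twist;  // normalized twist, -1.0 .. +1.0
};

// Map raw 10-bit readings to signed, normalized control values,
// treating mid-scale (512) as the spring's rest position.
Gesture interpret(const SpringJoystick& raw) {
    auto norm = [](int v) { return (v - 512) / 512.0f; };
    return { norm(raw.capX), norm(raw.capY), norm(raw.twist) };
}

int main() {
    SpringJoystick sample{700, 400, 530};   // hypothetical sensor frame
    Gesture g = interpret(sample);
    std::printf("bendX=%.2f bendY=%.2f twist=%.2f\n", g.bendX, g.bendY, g.twist);
    return 0;
}
```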
For the Melody Easel, we use a pressure-sensitive IntelliTouch Screen from ELO,
based on ultrasound propagation through the screen surface.
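To illustrate the kind of mapping such a touch surface enables, here is a small hypothetical sketch that converts a touch's position and pressure into MIDI-style control values. The ranges and parameter assignments are invented for demonstration and are not the Melody Easel's actual mapping.

```cpp
// Minimal sketch, not the touchscreen driver or the Melody Easel mapping:
// a touch event's position and pressure become illustrative musical
// parameters. Ranges and assignments are assumptions.
#include <cstdio>

struct Touch { float x, y, pressure; };  // normalized 0.0 .. 1.0

// Map a touch to pitch-, brightness-, and loudness-like control values.
void touchToControls(const Touch& t, int& pitchClass, int& brightness, int& velocity) {
    pitchClass = static_cast<int>(t.x * 12) % 12;      // horizontal position -> pitch class
    brightness = static_cast<int>(t.y * 127);          // vertical position -> timbre control
    velocity   = static_cast<int>(t.pressure * 127);   // touch pressure -> loudness
}

int main() {
    Touch t{0.62f, 0.80f, 0.45f};                      // hypothetical touch sample
    int pc, br, vel;
    touchToControls(t, pc, br, vel);
    std::printf("pitchClass=%d brightness=%d velocity=%d\n", pc, br, vel);
    return 0;
}
```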
For the Rhythm Tree, a simple microprocessor on each pad analyzes the signal coming from a
piezoelectric strip, which picks up the strike. Parameters describing
this signal (which reflect the way in which the pad was hit, allowing
dexterous and expressive control going beyond simple strike velocity) are
sent across a shared serial network to a host processor, which formats the data
into MIDI and passes it to the main computer running ROGUS. Up to 32
pads can be daisy-chained (like a string of Christmas lights) onto a
single host and bus line. We will have 10 such strings running in the
Brain Opera Lobby. Each pad also houses a bright LED, which can be
illuminated with a dynamically variable intensity.
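A minimal sketch of what the host's formatting step might look like, assuming a hypothetical packet layout for a pad strike; the note assignments and the controller number used for LED brightness are illustrative, not the actual Rhythm Tree protocol.

```cpp
// Minimal sketch (assumed packet layout, not the real Rhythm Tree protocol):
// a host-side routine that turns a pad's strike descriptor into a MIDI
// note-on plus a control change that could drive the pad's LED brightness.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PadStrike {
    uint8_t padId;      // 0..31, position on the daisy-chained string
    uint8_t peak;       // peak amplitude of the piezo signal (0..127)
    uint8_t decay;      // how quickly the signal rang down (0..127)
    uint8_t mode;       // e.g. 0 = tap, 1 = slap, inferred from waveform shape
};

// Note number keyed to the pad, velocity from the peak amplitude,
// and an arbitrary controller (CC 20) standing in for LED level.
std::vector<uint8_t> strikeToMidi(const PadStrike& s, uint8_t channel) {
    uint8_t note     = 36 + s.padId;      // map pads onto a note range
    uint8_t velocity = s.peak;
    std::vector<uint8_t> msg = {
        static_cast<uint8_t>(0x90 | channel), note, velocity,   // note-on
        static_cast<uint8_t>(0xB0 | channel), 20, s.peak        // CC 20 -> LED level (assumed)
    };
    return msg;
}

int main() {
    PadStrike strike{7, 100, 40, 1};
    for (uint8_t b : strikeToMidi(strike, 0)) std::printf("%02X ", b);
    std::printf("\n");
    return 0;
}
```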
For the Gesture Wall, a performer steps onto a plate that has a harmless low-frequency (50 kHz),
low-voltage (10 V) RF signal applied to it. This signal then
couples through the performer's shoes and is broadcast through their body
to a set of four pickup antennas located on goosenecks around the
perimeter of the screen. These signals change with the distance of the
performer from the respective sensors (an LED mounted in each
sensor glows with increasing intensity as the performer's body
approaches). The sensor data is transferred to a PC
running ROGUS, where it is analyzed for gestural characteristics.
Before starting the experience, the user must calibrate out the coupling
strength of their shoes and body mass, which vary considerably from
person to person. This is accomplished by touching a reference pickup
electrode, which adjusts the transmitted signal such that every
participant radiates equally.
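The sketch below illustrates one plausible way to turn the four calibrated antenna amplitudes into a rough position estimate; the antenna geometry, calibration scheme, and weighting are assumptions for demonstration, not the actual Gesture Wall analysis.

```cpp
// Minimal sketch of the kind of processing described above, not the actual
// Gesture Wall code: each pickup amplitude is normalized by a per-user
// calibration gain, then the four values are combined into a coarse
// estimate of where the performer sits relative to the screen.
#include <cstdio>

struct Frame { float amp[4]; };          // amplitudes from the four antennas

// Calibration: while the user touches the reference electrode, record the
// received level and derive a gain so every participant "radiates equally".
float calibrationGain(float referenceLevel, float nominalLevel = 1.0f) {
    return nominalLevel / referenceLevel;
}

// Antennas assumed at the four corners of a unit square around the screen.
const float kAntX[4] = {0.f, 1.f, 0.f, 1.f};
const float kAntY[4] = {0.f, 0.f, 1.f, 1.f};

// Amplitude-weighted centroid: stronger coupling to an antenna pulls the
// estimate toward it, giving a rough (x, y) plus an overall proximity value.
void estimate(const Frame& f, float gain, float& x, float& y, float& proximity) {
    float sum = 0.f; x = y = 0.f;
    for (int i = 0; i < 4; ++i) {
        float a = f.amp[i] * gain;
        x += a * kAntX[i];
        y += a * kAntY[i];
        sum += a;
    }
    if (sum > 0.f) { x /= sum; y /= sum; }
    proximity = sum / 4.f;
}

int main() {
    float gain = calibrationGain(0.8f);    // hypothetical calibration reading
    Frame f{{0.2f, 0.6f, 0.1f, 0.5f}};     // hypothetical antenna amplitudes
    float x, y, prox;
    estimate(f, gain, x, y, prox);
    std::printf("x=%.2f y=%.2f proximity=%.2f\n", x, y, prox);
    return 0;
}
```

In practice the amplitude-to-distance relationship is nonlinear, so a real system would apply a calibration curve rather than the simple linear weighting used here.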
A small microprocessor in the Digital Baton samples signals from 5
pressure-sensitive resistors potted into the baton skin (to measure finger
and hand pressure) and 3 orthogonal accelerometers in the baton base (to
measure sweeping gestures and beats). These signals are sent through a
wire to the host computer running ROGUS. A camera housing a
position-sensitive photodiode looks at an infrared LED mounted at the
baton tip. This camera is only sensitive to the 20 kHz signal emitted from
the LED; all other light sources are ignored. The photodiode in the
camera directly produces a signal that determines the horizontal and
vertical coordinates of the baton tip; no video processing is required.
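As an illustration of the kind of gesture processing these signals allow, here is a minimal sketch of a beat detector driven by the three accelerometer channels; the threshold and refractory period are arbitrary assumptions, not the baton's actual algorithm.

```cpp
// Minimal illustrative sketch, not the baton's real gesture code: detect a
// "beat" from the three accelerometer channels by thresholding the
// acceleration magnitude, with a refractory period to suppress retriggers.
#include <cmath>
#include <cstdio>

class BeatDetector {
public:
    // accel values in g; dt in seconds since the previous sample
    bool update(float ax, float ay, float az, float dt) {
        timeSinceBeat_ += dt;
        float mag = std::sqrt(ax * ax + ay * ay + az * az);
        if (mag > kThreshold && timeSinceBeat_ > kRefractory) {
            timeSinceBeat_ = 0.f;
            return true;                 // report one beat per sharp motion
        }
        return false;
    }
private:
    static constexpr float kThreshold  = 2.5f;   // g, assumed
    static constexpr float kRefractory = 0.15f;  // s, assumed debounce window
    float timeSinceBeat_ = 1.f;
};

int main() {
    BeatDetector det;
    // Hypothetical stream: quiet motion, then a sharp downbeat.
    float samples[][3] = {{0.1f, 0.2f, 1.0f}, {0.3f, 0.1f, 1.1f}, {1.8f, 2.2f, 1.5f}};
    for (auto& s : samples)
        if (det.update(s[0], s[1], s[2], 0.01f)) std::printf("beat!\n");
    return 0;
}
```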
The Sensor Carpet is a sensor floor composed of a mat surface atop a matrix of 64
pressure-sensitive piezoelectric wires; it measures the position and
intensity of footsteps. Upper-body motion is sensed in this region
with a system based on Doppler radar and/or ranging sonar.
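A minimal sketch of how footsteps might be located on such a wire matrix, assuming (for illustration only) that the 64 wires are split into 32 per axis and that a pressure-weighted centroid is an adequate position estimate.

```cpp
// Minimal sketch of footstep extraction from a wire matrix, not the actual
// Sensor Carpet code. It assumes the 64 wires run 32 along each axis and
// reports the pressure-weighted centroid and total intensity of a footstep.
#include <cstdio>

struct Footstep { float x, y, intensity; };

// rowAmp/colAmp: rectified piezo amplitudes on the wires along each axis.
Footstep locateFootstep(const float rowAmp[32], const float colAmp[32]) {
    float sx = 0.f, sy = 0.f, wx = 0.f, wy = 0.f;
    for (int i = 0; i < 32; ++i) {
        sx += colAmp[i] * i;  wx += colAmp[i];
        sy += rowAmp[i] * i;  wy += rowAmp[i];
    }
    Footstep f{0.f, 0.f, wx + wy};
    if (wx > 0.f) f.x = sx / wx / 31.f;   // normalized 0..1 across the mat
    if (wy > 0.f) f.y = sy / wy / 31.f;
    return f;
}

int main() {
    float rows[32] = {}, cols[32] = {};
    rows[10] = 0.8f; rows[11] = 0.6f;     // hypothetical step near wires (11, 10)
    cols[11] = 0.9f; cols[12] = 0.5f;
    Footstep f = locateFootstep(rows, cols);
    std::printf("x=%.2f y=%.2f intensity=%.2f\n", f.x, f.y, f.intensity);
    return 0;
}
```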
At the Speaking Trees, a floormat switch detects the user's presence and starts the
experience. The user then navigates through the interactive database
using a hand-held piezoresistive mouse that detects the center of
pressure of the thumb and moves the pointer accordingly.
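To show roughly how such a center-of-pressure reading can drive a pointer, here is a small hypothetical sketch that models the device as a 3-by-3 grid of pressure readings; the grid size, deadband, and velocity mapping are assumptions, not the actual device firmware.

```cpp
// Minimal sketch, not the device's firmware: derive a pointer velocity from
// the thumb's center of pressure on a small piezoresistive pad, modeled
// here (as an assumption) as a 3x3 grid of pressure readings.
#include <cstdio>

// Returns center of pressure in pad coordinates (-1..+1 on each axis),
// or false if the thumb is not pressing hard enough.
bool centerOfPressure(const float p[3][3], float& cx, float& cy) {
    float sum = 0.f, sx = 0.f, sy = 0.f;
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c) {
            sum += p[r][c];
            sx  += p[r][c] * (c - 1);    // column offset from pad center
            sy  += p[r][c] * (r - 1);    // row offset from pad center
        }
    if (sum < 0.05f) return false;       // assumed deadband: no real press
    cx = sx / sum;
    cy = sy / sum;
    return true;
}

int main() {
    float pad[3][3] = {{0.0f, 0.1f, 0.3f},
                       {0.0f, 0.2f, 0.4f},
                       {0.0f, 0.0f, 0.1f}};
    float cx, cy;
    if (centerOfPressure(pad, cx, cy))
        std::printf("pointer velocity ~ (%.2f, %.2f)\n", cx, cy);  // scale to cursor speed
    return 0;
}
```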