FEATURE

PERFORMANCE ANIMATION

D'Cuckoo and RiGBy On Stage

Multimedia Musical Ensemble Ventures into Live Performance Animation

by Linda Jacobson
Virtual Reality Evangelist, Silicon Graphics

"The 'trick' in ventriloquism," according to Ventriloquism Made Easy, "is learning to produce all of the sounds which the mouth modifies by using the lips in such a manner that is not discernable to the audience--which requires control over your larynx and other speech organs, proper breathing, and development of stage presence and timing."

I'm a "virtual ventriloquist." My alter ego, RiGBy, performs with the San Francisco-based musical multimedia ensemble, D'Cuckoo. If you think about it, you realize that ventriloquism is fundamentally a communications tool. So is the computer. Combine the two and you can have realtime, interactive character animation, or "digital puppetry." RiGBy is a digital puppet.

[RiGBy movie]

With a face that resembles a colorful African mask, RiGBy lives inside a Silicon Graphics workstation. At showtime, RiGBy appears on a screen above the band, bantering with the players between songs and joshing the audience.

Hidden from the audience, I provide RiGBy's voice. Her movements--arching eyebrows, blinking eyelids, and spinning head--are controlled live by digital puppeteer and RiGBy co-inventor Ron Fischer. I speak into an effects-laden microphone; its output goes into the Silicon Graphics workstation, which reads the amplitude of my voice to open and shut RiGBy's mouth. Ron controls the position and orientation of RiGBy's features and head by manipulating a mouse and a Spaceball 3D input controller. Nothing in this setup is canned. Ron and I perform RiGBy spontaneously, and what RiGBy does and says depends on our mood, the absurdity of current events, and who's in the audience.
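For readers curious about the mechanics, here is a minimal sketch, in Python, of the kind of amplitude-to-mouth mapping described above. The noise floor, full-open level, and smoothing constant are illustrative assumptions, not details of the actual RiGBy software, which ran on the SGI workstation itself.

    import math

    # Illustrative only: map a buffer of audio samples to a 0..1 jaw opening.
    # The thresholds and smoothing constant are assumptions, not the values
    # used in the actual RiGBy setup.
    NOISE_FLOOR = 0.02   # amplitudes below this keep the mouth closed
    FULL_OPEN = 0.30     # amplitudes at or above this open the mouth fully
    SMOOTHING = 0.6      # simple low-pass so the jaw doesn't flutter

    def rms(samples):
        """Root-mean-square amplitude of one audio buffer (floats in -1..1)."""
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def mouth_open(amplitude, previous):
        """Scale an amplitude into a smoothed 0..1 jaw opening."""
        raw = (amplitude - NOISE_FLOOR) / (FULL_OPEN - NOISE_FLOOR)
        raw = max(0.0, min(1.0, raw))
        return SMOOTHING * previous + (1.0 - SMOOTHING) * raw

    # A quiet buffer keeps the mouth closed; a loud one opens it partway.
    jaw = 0.0
    for buf in ([0.01] * 256, [0.25] * 256):
        jaw = mouth_open(rms(buf), jaw)
        print(round(jaw, 2))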

RiGBy exists because D'Cuckoo always seeks to create new participatory, interactive experiences for the audience. We were inspired by the work of SimGraphics Engineering, whose products we couldn't afford in 1992; the band operated on a shoestring and SimGraphics' early performance animation system was priced for champagne budgets. SimGraphics pioneered realtime, computer-generated character animation. The company formed in 1989 to develop virtual reality applications. In 1992 it teamed with the Hollywood special effects firm The Character Shop to create the VActor ("virtual actor") performance animation system.

[SimGraphics' VActor character, Sayori]
VActor lets puppeteers control computer-generated, animated characters in a live performance. The character's movements are controlled by the puppeteer's movements, special input devices, and motion and position sensors. Besides an SGI computer, the main hardware is a facial armature, a contraption that positions sensors on the puppeteer's forehead, chin, lips, and jaw to detect muscle movements, which are then translated into the puppet's facial expressions. The puppeteer also uses a 3D mouse to move the puppet's head.
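A rough sketch of the sensor-to-expression step a system like this performs might look like the following, again in Python. The channel names, calibration values, and readings are invented for illustration; they are not drawn from the actual VActor product.

    # Illustrative only: normalize raw facial-sensor readings into expression
    # weights. Channel names, calibration values, and readings are invented
    # for the example; they are not VActor's actual data.
    def weight(reading, rest, extreme):
        """Map a raw sensor reading onto a 0..1 expression weight."""
        span = extreme - rest
        if span == 0:
            return 0.0
        return max(0.0, min(1.0, (reading - rest) / span))

    # Calibration: (rest reading, full-extension reading) per armature sensor.
    calibration = {
        "jaw": (0.10, 0.80),
        "brow": (0.20, 0.55),
        "lip_left": (0.15, 0.60),
    }

    raw_frame = {"jaw": 0.45, "brow": 0.50, "lip_left": 0.20}

    expression = {
        name: weight(raw_frame[name], rest, extreme)
        for name, (rest, extreme) in calibration.items()
    }
    print(expression)  # roughly {'jaw': 0.5, 'brow': 0.86, 'lip_left': 0.11}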

Another type of live-performance animation involves attaching wired sensors (position-tracking technology) to the puppeteer's limbs; wired up with all those sensors, the puppeteer resembles a giant insect. Colossal Pictures uses that approach to give life to "Moxy," the animated dog host of the Cartoon Network. Moxy performs in real time; his act is captured on videotape, then broadcast later on the cartoon show.

In the past year, systems that employ optical technologies have hit the film and video marketplace. They capture facial movements through an optical scanner worn on the head, or by tracking reflective dots placed on the face. A new system, Eptron Optical BioStudio, does character animation with a conventional video camera and special artificial-vision software. Other types of optical "motion capture" systems are used widely in the production of video games and animated cartoons, but they don't operate in real time; instead they capture motion frame by frame for later editing and playback off CD or cartridge.

Today, digital puppets are used in broadcast television, at trade shows and megabuck entertainment events, and, as mentioned, for non-realtime character animation in CD-ROM productions. When the price dips, we'll see realtime characters used for more personal applications, such as the "proof-of-concept" project that spawned Eggwardo.

Two years ago at Loma Linda University Medical Center in southern California, Eggwardo--a mustachioed talking egg--appeared in a hospital room on a TV screen. While watching him on the screen, the patient--a terminally ill child--could hear his voice and talk to him on the phone. A video camera, unobtrusively mounted on the TV, picked up the child's image and transmitted it to another room, where a psychologist and puppeteer communicated with the child through Eggwardo. It seems that sick kids are much more willing to discuss their aches and worries with cartoon characters than they are with grown-ups.

As the price of the necessary hardware falls with the development of systems like the Indigo2 Maximum Impact, and the input/output devices evolve into friendlier interfaces, health centers are increasingly able to hire Eggwardos and resident puppeteers. And it won't be long before we can all transform a bunch of bits into a face on our computer screens -- and experience, as VR pioneer Brenda Laurel wrote in 1991, how "one discovers through the character a new version of oneself."

Updated February 1996. Copyright © 1995, 1996 Silicon Graphics, Inc. -- All Rights Reserved