Virtual Reality - What it is and How it Works
Imagine being able to point into the sky and fly. Or
perhaps walk through space and connect molecules together.
These are some of the dreams that have come with the
invention of virtual reality. With the introduction of
computers, numerous applications have been enhanced or
created. The newest technology that is being tapped is that
of artificial reality, or "virtual reality" (VR). When
Morton Heilig first got a patent for his "Sensorama
Simulator" in 1962, he had no idea that 30 years later
people would still be trying to simulate reality and that
they would be doing it so effectively. Jaron Lanier first
coined the phrase "virtual reality" around 1989, and it has
stuck ever since. Unfortunately, this catchy name has
caused people to dream up incredible uses for this
technology including using it as a sort of drug. This became
evident when, among other people, Timothy Leary became
interested in VR. This has also worried some of the
researchers who are trying to create serious applications
for medicine, space, physics, chemistry, and entertainment,
among other fields.
In order to create this alternate reality, however, you
need to find ways to create the illusion of reality with a
piece of machinery known as the computer. This is done with
several computer-user interfaces used to simulate the
senses. Among these are stereoscopic glasses to make the
simulated world look real, a 3D auditory display to give
depth to sound, sensor-lined gloves to simulate tactile
feedback, and head-trackers to follow the orientation of the
head. Since the technology is fairly young, these
interfaces have not been perfected, making for a somewhat
cartoonish simulated reality.
Stereoscopic vision is probably the most important
feature of VR because in real life, people rely mainly on
vision to get places and do things. The eyes are
approximately 6.5 centimeters apart, and allow you to have a
full-colour, three-dimensional view of the world.
Stereoscopy, in itself, is not a very new idea, but the new
twist is trying to generate completely new images in real-
time. In 1833, Sir Charles Wheatstone invented the first
stereoscope with the same basic principle being used in
today's head-mounted displays. Presenting different views
to each eye gives the illusion of three dimensions. The
glasses that are used today work by using what is called an
"electronic shutter". The display interleaves the left-eye
and right-eye views every thirtieth of a second, and the
shutters in the lenses block and admit the view of the
screen in sync with this interleaving, so that the proper
view reaches each eye. The problem with this method, though,
is that you have to wear special glasses.
Most VR researchers use complicated headsets, but it is
possible to create stereoscopic three-dimensional images
without them. One such way is through the use of lenticular
lenses. These lenses, known since Herbert Ives experimented
with them in 1930, allow one to take two images, cut them
into thin vertical slices, interleave the slices in precise
order (a process also called multiplexing), and place
cylinder-shaped lenses in front of them so that, when the
screen is viewed directly, each interleaved image reaches the
corresponding eye. This illusion of depth is based on what
is called binocular parallax. Another problem that is solved
is the one that occurs when the viewer turns their head:
nearby objects appear to move more than distant objects.
This is called motion parallax. Lenticular screens can show
users the proper stereo images as they move their heads,
provided a head-motion sensor is used to adjust the effect.
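To make the multiplexing step concrete, here is a minimal
sketch (an illustration only, not the driver for any
particular lenticular display) that interleaves the columns
of a left-eye and a right-eye image; the one-pixel slice
width is an assumption.

    # Minimal sketch of lenticular multiplexing: interleave vertical slices
    # of a left-eye and a right-eye image so that a lenticular lens sheet
    # can steer alternate columns to the appropriate eye.
    import numpy as np

    def multiplex(left, right):
        """Interleave the columns of two equally sized images (H x W x 3)."""
        if left.shape != right.shape:
            raise ValueError("left and right views must be the same size")
        out = np.empty_like(left)
        out[:, 0::2] = left[:, 0::2]    # even columns carry the left-eye view
        out[:, 1::2] = right[:, 1::2]   # odd columns carry the right-eye view
        return out

    # Example: two solid-colour 4x4 test images.
    left = np.full((4, 4, 3), 255, dtype=np.uint8)    # white "left" view
    right = np.zeros((4, 4, 3), dtype=np.uint8)       # black "right" view
    print(multiplex(left, right)[:, :, 0])            # alternating 255/0 columns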
Sound is another important part of daily life, and thus
must be simulated well in order to create artificial
reality. Many scientists, including Dr. Elizabeth Wenzel, a
researcher at NASA, are convinced that 3D audio will be
useful for scientific visualization and space applications
in ways where 3D video is somewhat limited. She has come
up with an interesting use for virtual sound that would
allow an astronaut to hear the state of their oxygen, or
have an acoustical beacon that directs one to a trouble spot
on a satellite. The "Convolvotron" is one such device that
simulates the location of up to four audio channels with a
sort of imaginary sphere surrounding the listener. This
device takes into account that each person has specialized
auditory signal processing, and personalizes what each
person hears.
Using a position sensor from Polhemus, another VR
research company, it is possible to move the position of
sound by simply moving a small cube around in your hand.
The key to the Convolvotron is something called the "Head-
Related Transfer Function (HRTF)", which is a set of
mathematically modelable responses that our ears impose on
the signals they get from the air. In order to develop the
HRTF, researchers had to sit people in an anechoic room
surrounded with 144 different speakers to measure the
effects of hearing precise sounds from every direction by
using tiny microphone probes placed near the eardrums of the
listener. The way in which those microphones distorted the
sound from all directions was a specific model of the way
that person's ears impose a complex signal on incoming sound
waves in order to encode it in their spatial environment.
The map of the results is then converted to numbers, and a
computer performing about 300 million operations per second
builds a numerical model based on the HRTF, making it
possible to reconfigure any sound source so that it appears
to come from any number of different points within the
acoustic sphere.
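As a rough, simplified illustration of the kind of
processing the Convolvotron performs, the sketch below
convolves a mono signal with a left-ear and a right-ear
impulse response to place it in a single direction; the
impulse responses here are invented placeholders rather than
measured HRTF data, and the code is not the device's actual
software.

    import numpy as np

    def spatialize(mono, hrir_left, hrir_right):
        """Convolve a mono signal with left/right head-related impulse
        responses (HRIRs) to produce a two-channel spatialized signal."""
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        return np.stack([left, right], axis=1)

    # Placeholder HRIRs: in a real system these would come from the anechoic
    # measurements described above, one pair per direction on the sphere.
    rng = np.random.default_rng(0)
    hrir_l = rng.normal(size=128) * np.exp(-np.arange(128) / 16.0)
    hrir_r = np.roll(hrir_l, 8)            # crude interaural time delay
    tone = np.sin(2 * np.pi * 440 * np.arange(4410) / 44100.0)  # 0.1 s tone
    stereo = spatialize(tone, hrir_l, hrir_r)
    print(stereo.shape)                    # (4537, 2): samples by channels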
This portion of a VR system can really enhance the visual
and tactile responses. Imagine hearing the sound of
footsteps behind you in a dark alley late at night. That is
how important 3D sound really is.
The third important sense that we use in everyday life is
that of touch. There is no way of avoiding the feeling of
touch, and thus this is one of the technologies being
researched most feverishly. The two main types of feedback
being investigated are force-reflection feedback and
tactile feedback. Force feedback
devices exert a force against the user when they try to push
something in a virtual world that is 'heavy'. Tactile
feedback is the sensation of feeling an object such as the
texture of sandpaper. Both are equally important in the
development of VR.
Currently, the most successful development in force-
reflective feedback is that of the Argonne Remote
Manipulator (ARM). It consists of a group of articulated
joints wrapped in long bunches of electrical cables. The
ARM allows for six degrees of freedom (position and
orientation) to give a true feel of movement. Suspended
from the ceiling and connected by a wire to the computer,
this machine grants a user the power to reach out and
manipulate 3D objects that are not real. As is the case at
the University of North Carolina, it is possible to "dock
molecules" using VR. Simulating molecular forces and
translating them into physical forces allows the ARM to push
back at the user if he tries to dock the molecules
incorrectly.
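A toy sketch of translating simulated molecular forces into
a force command is shown below; the inverse-square repulsion
and the send_force routine are hypothetical stand-ins, not
the actual UNC docking code or the ARM's real interface.

    import numpy as np

    def repulsive_force(probe_pos, atom_pos, strength=1.0):
        """Simple inverse-square repulsion pushing the probe away from an atom."""
        offset = probe_pos - atom_pos
        dist = np.linalg.norm(offset)
        return strength * offset / dist**3    # unit direction / distance^2

    def send_force(force):
        """Placeholder for the driver call that would load the ARM's motors."""
        print("commanded force:", np.round(force, 3))

    probe = np.array([0.0, 0.0, 1.0])         # user's hand in molecule space
    atoms = [np.array([0.0, 0.0, 0.0]), np.array([0.5, 0.0, 0.5])]
    total = sum(repulsive_force(probe, a) for a in atoms)
    send_force(total)                         # the ARM pushes back on the user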
Tactile feedback is just as important as force feedback
in allowing the user to "feel" computer-generated objects.
There are several methods for providing tactile feedback.
Some of these include inflating air bladders in a glove,
arrays of tiny pins moved by shape memory wires, and even
fingertip piezoelectric vibrotactile actuators. The latter
method uses tiny crystals that vibrate when an electric
current stimulates them. This design has not really taken
off, however, and the other two methods are being researched
more actively. According to a report called "Tactile
Sensing in Humans and Robots," distortions inside the skin
cause mechanosensitive nerve terminals to respond with
electrical impulses. Each impulse is approximately 50 to
100 mV in magnitude and 1 ms in duration. However, the
frequency of the impulses (up to a maximum of 500/s) depends
on the intensity of the combined stresses in the area near
the responding receptor. In other words, the pressure
sensors in the skin are all basically the same, but they can
fire repeatedly to convey the feeling of pressure.
Therefore, any tactile response system must be able to
operate at a frequency of about 500 Hz in order to match the
tactile acuity of the human.
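A minimal sketch of that relationship is given below, under
the simplifying and invented assumption that the impulse
rate grows linearly with contact pressure until it reaches
the 500-per-second ceiling.

    # Map a contact pressure to a mechanoreceptor-style impulse rate, capped
    # at 500 impulses per second as described above. The linear scale factor
    # is an assumption made only for illustration.
    MAX_RATE_HZ = 500.0
    RATE_PER_KPA = 50.0      # hypothetical: 50 impulses/s per kPa of pressure

    def impulse_rate(pressure_kpa):
        """Simulated firing rate (impulses/s) for a given pressure."""
        return min(MAX_RATE_HZ, max(0.0, pressure_kpa) * RATE_PER_KPA)

    def actuator_period_ms(pressure_kpa):
        """Period between actuator pulses needed to mimic that firing rate."""
        rate = impulse_rate(pressure_kpa)
        return float("inf") if rate == 0 else 1000.0 / rate

    for p in (0.0, 2.0, 20.0):
        print(p, "kPa ->", impulse_rate(p), "impulses/s,",
              actuator_period_ms(p), "ms between pulses")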
Right now, however, the gloves in use serve mainly as
input devices. One such device is the DataGlove. This
well-fitting glove has bundles of optic fibers attached at
the knuckles and joints. Light is passed into these fibers
at one end of the glove. When a finger is bent, the fibers
bend as well, and the amount of light that gets through each
fiber can be measured to determine how the user's hand is
positioned. The goal is a glove that can be used as both an
input and an
output device. Jim Hennequin has worked on an "Air Muscle"
that inflates and deflates parts of a glove to allow the
feeling of various kinds of pressure. Unfortunately at this
time, the feel it creates is somewhat crude. The company
TiNi is exploring the possibility of using "shape memory
alloys" to create tactile response devices. TiNi uses an
alloy called nitinol as the basis for a small grid of what
look like ballpoint-pen tips. Nitinol can take the shape of
whatever it is cast in, and can be reshaped. Then when it
is electrically stimulated, the alloy returns to its
original cast shape. The hope is that in the future some of
these techniques will be used to form a complete body suit
that can simulate tactile sensation.
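The sketch below shows, in a much simplified form, how a
DataGlove-style light reading might be converted into a bend
angle; the linear calibration values are invented for the
example, and a real glove would be calibrated per user.

    # Convert the normalized light level from a bend-sensing optic fiber into
    # an estimated joint angle. Full brightness is assumed to mean a straight
    # finger and 40% brightness a fully bent (90 degree) one.
    STRAIGHT_LEVEL = 1.00
    BENT_LEVEL = 0.40

    def bend_angle_deg(light_level):
        """Estimate joint bend in degrees from the normalized light reading."""
        level = min(STRAIGHT_LEVEL, max(BENT_LEVEL, light_level))
        fraction_bent = (STRAIGHT_LEVEL - level) / (STRAIGHT_LEVEL - BENT_LEVEL)
        return 90.0 * fraction_bent

    readings = {"index_knuckle": 0.95, "index_middle": 0.55, "thumb": 0.40}
    for joint, level in readings.items():
        print(joint, "->", round(bend_angle_deg(level), 1), "degrees")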
Determining where you are in the virtual world means you
need orientation and position trackers to follow the
movements of the head and the other parts of the body that
interface with the computer. Many companies, including
Polhemus and Shooting Star Technology, have developed
successful methods of tracking six degrees of freedom. Six
degrees of freedom refers to the combination of a Cartesian
coordinate system for position and an orientation system
with rotation angles called roll, pitch, and yaw. The ADL-1
from
Shooting Star is a sophisticated and inexpensive (relative
to other trackers) 6D tracking system which is mounted on
the head, and converts position and orientation information
into a readable form for the computer. The machine
calculates head/object position by the use of a lightweight,
multiply-jointed arm. Sensors mounted on this arm measure
the angles of the joints. The computer-based control unit
uses these angles to compute position-orientation
information so that the user can manipulate a virtual world.
The joint angle transducers use conductive plastic
potentiometers and ball bearings, so the machine is
heavy-duty. Time lag is eliminated by the direct-reading
transducers and a high-speed microprocessor, allowing a
maximum update rate of approximately 300 measurements per
second.
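To illustrate the principle of computing a pose from
measured joint angles, here is a simplified planar sketch
for a hypothetical two-segment arm; the real ADL-1 has more
joints, works in three dimensions, and uses its own
calibrated geometry.

    import math

    # Planar forward kinematics for a hypothetical two-segment tracking arm:
    # given the two joint angles read from the arm's sensors, compute the
    # position of the far end (the head mount) and its facing direction.
    L1, L2 = 0.40, 0.35        # assumed segment lengths in metres

    def head_pose(theta1_deg, theta2_deg):
        t1 = math.radians(theta1_deg)
        t2 = math.radians(theta2_deg)
        elbow_x = L1 * math.cos(t1)
        elbow_y = L1 * math.sin(t1)
        head_x = elbow_x + L2 * math.cos(t1 + t2)
        head_y = elbow_y + L2 * math.sin(t1 + t2)
        yaw_deg = theta1_deg + theta2_deg    # orientation of the last segment
        return head_x, head_y, yaw_deg

    x, y, yaw = head_pose(30.0, 45.0)
    print("head at (%.3f, %.3f) m, facing %.0f degrees" % (x, y, yaw))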
Another system developed by Ascension Technology does
basically the same thing as the ADL-1, but the sensor is in
the form of a small cube which can fit in the user's hand or
in a computer mouse specially developed to encase it. The
Ascension Bird is the first system that generates and senses
DC magnetic fields. The Ascension Bird first measures the
earth's magnetic field and then the steady magnetic field
generated by the transmitter. The earth's field is then
subtracted from the total, which yields true position and
orientation measurements. Existing electromagnetic systems
transmit a rapidly varying AC field. As this field varies,
eddy currents are induced in nearby metals, which causes
those metals to act as electromagnets that distort the
measurements. The Ascension Bird uses a steady DC magnetic
field, which does not induce eddy currents. The update rate
of the Bird is 100
measurements/second. However, the Bird has a small lag of
about 1/60th of a second which is noticeable.
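The subtraction itself is straightforward, as the toy sketch
below shows; the field values are invented placeholders, and
deriving position and orientation from the transmitter's
field is the harder step the Bird performs internally.

    import numpy as np

    # Toy illustration of the Ascension Bird's DC-field approach: record the
    # earth's steady field alone, then the combined field with the
    # transmitter on; subtracting the two isolates the transmitter's part.
    earth_field = np.array([18.0, -4.0, 43.0])    # microtesla, invented values
    combined = np.array([25.5, -1.0, 47.2])       # earth + transmitter

    transmitter_field = combined - earth_field
    print("transmitter contribution (uT):", transmitter_field)
    print("strength (uT):", round(float(np.linalg.norm(transmitter_field)), 2))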
Researchers have also thought about supporting the other
senses such as taste and smell, but have decided that it is
infeasible to do. Smell would be possible and would enhance
realism, but only a limited spectrum of smells could be
simulated. Taste is basically a disgusting premise
from most standpoints. It might be useful for entertainment
purposes, but has almost no purpose for researchers or
developers. For one thing, people would have to put some
kind of receptors in their mouths and it would be very
unsanitary. Thus, the main senses that are relied on in a
virtual reality are sight, touch, and hearing.
Applications of Virtual Reality
Virtual Reality has promise for nearly every industry
ranging from architecture and design to movies and
entertainment, but the field with the most to gain from this
technology is science in general. The money that can be
saved by testing the feasibility of experiments in an
artificial world before they are carried out could be great,
as could the savings in energy used to operate such things
as wind tunnels.
The best example of how VR can help science is that of
the "molecular docking" experiments being done in Chapel
Hill, North Carolina. Scientists at the University of North
Carolina have developed a system that simulates the bonding
of molecules. But instead of using complicated formulas to
determine bonding energy, or illegible stick drawings, the
potential chemist can don a high-tech head-mounted display,
attach themselves to an artificial arm from the ceiling and
actually push the molecules together to determine whether or
not they can be connected. The chemical bonding process
takes on a sort of puzzle-like quality, in which even
children could learn to form bonds using a trial and error
method.
Architectural designers have also found that VR can be
useful in visualizing what their buildings will look like
when they are put together. Often, using a 2D diagram to
represent a 3D home is confusing, and the people that fund
large projects would like to be able to see what they are
paying for before it is constructed. A fascinating example
would be designing an elementary school. Designers could
walk through the school from a child's perspective to gain
insight into how high that water fountain is, or how narrow
the halls are. Product designers could
also use VR in similar ways to test their products.
NASA and other aerospace facilities are concentrating
research on such things as human factors engineering,
virtual prototyping of buildings and military devices,
aerodynamic analysis, flight simulation, 3D data
visualization, satellite position fixing, and planetary
exploration simulations. Such things as virtual wind
tunnels have been in development for a couple of years and
could save money and energy for aerospace companies.
Medical researchers have been using VR techniques to
synthesize diagnostic images of a patient's body to do
"predictive" modeling of radiation treatment using images
created by ultrasound, magnetic resonance imaging, and X-
ray. A radiation therapist in a virtual world could view
and expose a tumour at any angle and then model specific
doses and configurations of radiation beams to aim at the
tumour more effectively. Since radiation destroys human
tissue easily, there is no allowance for error.
Also, doctors could use "virtual cadavers" to practice
rare operations which are tough to perform. This is an
excellent use because one could perform the operation over
and over without the worry of hurting any human life.
However, this sort of practice may have its limitations
because it is only a virtual world. As well, the
computer-user interfaces are not yet well enough developed,
and it is estimated that developing this technology will
take 5 to 10 years.
In Japan, a company called Matsushita Electric Works Ltd.
is using VR to sell their products. They employ a VPL
Research head-mounted display linked to a high-powered
computer to help prospective customers design their own
kitchens. Being able to see what your kitchen will look
like before you actually refurnish it could save you from
costly mistakes in the future.
The entertainment industry stands to gain a lot from VR.
With the video game revolution of bigger and better games
coming out all the time, this could be the biggest
breakthrough ever. It would be fantastic to have sword
fights which actually feel real. As well, virtual movies
(also called vroomies) are being developed which allow the
viewer to interact with the characters in the movie.
Universal Studios among others is developing a virtual
reality amusement park which will incorporate these games
and vroomies.
As it stands, almost every industry has something to gain
from VR, and in the years to come, it appears that the
possibilities are endless.
The Future of Virtual Reality
In the coming years, as more research is done, we are bound
to see VR become a mainstay in our homes and at work.
As the computers become faster, they will be able to create
more realistic graphic images to simulate reality better.
As well, new interfaces will be developed which will
simulate force and tactile feedback more effectively to
enhance artificial reality that much more. This is the
birth of a new technology and it will be interesting to see
how it develops in the years to come. However, it may take
longer than people think for it to come into the mainstream.
Millions of dollars' worth of research must be done, and
only select industries can afford to pay for it. Hopefully,
it will be sooner rather than later.
It is very possible that in the future we will be
communicating with virtual phones. Nippon Telephone and
Telegraph (NTT) in Japan is developing a system which will
allow one person to see a 3D image of the other using VR
techniques. In the future, it is conceivable that
businessmen may hold conferences in a virtual meeting hall
when they are actually at opposite ends of the world. NTT is
developing a new method of telephone transmission using
fiber optics which will allow for much larger amounts of
information to be passed through the phone lines. This
system is called the Integrated Services Digital Network
(ISDN) which will help allow VR to be used in conjunction
with other communication methods.
Right now, VR equipment is very expensive to purchase, with the
head-mounted display costing anywhere from about $20,000 to
$1,000,000 for NASA's Super Cockpit. In the future, VR will
be available to the end-user at home for under $1000 and
will be of better quality than that being developed today.
The support for it will be about as good as it is currently
for plain computers, and it is possible that VR could become
a very useful teaching tool.
Sources of Information
Books and Periodicals
Benningfield, Damond. "The Virtues of Virtual Reality."
Star Date, July/Aug. 1991, pp. 14-15.
Burrill, William. "Virtual Reality." Toronto Star, 13 July
1991, pp. J1-3.
Brill, Louis M. "Facing Interface Issues." Computer
Graphics World, April 1992, pp. 48-58.
Daviss, Bennett. "Grand Illusions." Discover, June 1990,
pp. 36-41.
Emmett, Arielle. "Down to Earth: Practical Applications of
Virtual Reality Find Commercial Uses."
Computer Graphics World, March 1992, pp. 46-54.
Peterson, Ivars. "Recipes for Artificial Realities."
Science News, 24 Nov. 1990, pp. 328-329.
Peterson, Ivars. "Looking-Glass Worlds." Science News,
4 Jan. 1992, pp. 8-15.
Porter, Stephen. "Virtual Reality." Computer Graphics
World, March 1992, pp. 42-43.
Rheingold, Howard. Virtual Reality. Toronto: Summit Books,
1991.
Tisdale, Sallie. "It's Been Real." Esquire, April 1991,
pp. 36-40.
Various. Virtual Reality Special Report. San Francisco:
Meckler Publishing, 1992.
Companies Contacted:
Ascension Technology Corp.
P.O. Box 527
Burlington, VT 05402
(802)655-7879
Polhemus Inc.
P.O. Box 560
Colchester, VT 05446
(802)655-3159
Shooting Star Technology
1921 Holdom Ave.
Burnaby, BC V5B 3W4
(604)298-8574
Virtual Technologies
P.O. Box 5984
Stanford, CA 94309
(415)599-2331
VPL Research Inc.
656 Bair Island Rd. Third Floor
Redwood City, CA 94063
(415)361-1710