The concept behind this installation is to create a multi-user, game-like interface, both educational and fun, in which players experience the world through the perceptual characteristics of the animals whose bodies they virtually inhabit.
Have you ever heard what whale calls might sound like to a whale? This is an example of what you might hear using this system. Hear a whale call sped up to the human range of hearing. Have you ever seen through a fly's eyes? Compound eyes would be hard for people to adapt to at first, but this system gives them the chance to try, and some might even thrive with them. The installation offers many adventures to be experienced, and as time goes on, 3D models and perceptual mappings for new animals can be added to the system and experimented with.
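To give a sense of the whale example: simply declaring a higher sample rate on playback shifts every frequency up by the same factor. A minimal sketch in Python (the tone frequency, speed-up factor, and file name are illustrative assumptions, not part of the installation's software):

    import numpy as np
    from scipy.io import wavfile

    rate = 44100
    t = np.linspace(0, 2.0, 2 * rate, endpoint=False)
    # A 15 Hz "call", below the human hearing floor of about 20 Hz
    call = np.sin(2 * np.pi * 15 * t).astype(np.float32)
    # Writing the same samples with an 8x sample rate plays the clip
    # 8x faster, shifting the 15 Hz tone up to an audible 120 Hz
    wavfile.write("call_sped_up.wav", rate * 8, call)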
The action will take place in a virtual wilderness where the user inhabits the body of an animal, moving in the ways that animal can move (i.e., walk, swim, or fly). The user can eat other animals in the wilderness, or be eaten. If eaten, the user will take control of the animal that did the eating; if that creature is already being controlled by another user, the eaten user will respawn as a random animal. Users will start out either as a random creature or as one of their choosing.
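The eaten-and-respawn rule amounts to a small hand-off of control between bodies. A minimal sketch in Python; the class, species list, and function names are hypothetical, not part of any actual implementation:

    import random

    SPECIES = ["whale", "chicken", "seal", "fly"]   # hypothetical starter set

    class Animal:
        def __init__(self, species, user=None):
            self.species = species
            self.user = user                        # None means AI-controlled

    def handle_eaten(prey, predator, world):
        """Apply the rule above; return the body the eaten user controls next."""
        user = prey.user
        prey.user = None                            # the old body is gone
        if predator.user is None:
            predator.user = user                    # take over the AI predator
            return predator
        # Predator already belongs to another user: respawn as a random animal
        new_body = Animal(random.choice(SPECIES), user=user)
        world.append(new_body)
        return new_body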
Users
will see and hear the virtual world the way their animals see and hear.
For example, chickens hear a much narrower frequency range than humans do, so a user who is a chicken in this world will hear only 125 Hz to 2000 Hz instead of the 20 Hz to 20 kHz that humans can hear. To accomplish
this, the audio sent to the user's headphones will be filtered appropriately.
Hear the difference between what a human
would hear and what a chicken
would hear. View a chart (obtained
from Busch Gardens) that gives a general idea of the hearing range of
several animals. An animal's hearing also changes with its environment: seals, for example, hear differently in air than in water. These differences will be accounted for in the audio playback engine.
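For illustration, the band-limiting described above could be implemented as a per-animal band-pass filter. A minimal sketch in Python using SciPy, with the 125-2000 Hz band taken from the chicken example (the sample rate and filter order are assumptions):

    import numpy as np
    from scipy.signal import butter, sosfilt

    def chicken_hearing(audio, sample_rate=44100):
        """Band-pass an audio buffer to a chicken's approximate hearing range."""
        # 4th-order Butterworth band-pass, in second-order sections for stability
        sos = butter(4, [125.0, 2000.0], btype="band", output="sos", fs=sample_rate)
        return sosfilt(sos, audio)

    # Example: one second of white noise, filtered as a chicken would hear it
    noise = np.random.randn(44100)
    chicken_noise = chicken_hearing(noise)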
Visual data will also be altered by per-animal visual transfer functions. For example, color will be mapped onto the range humans can see. Some animals can see ultraviolet or infrared light that humans cannot, while others cannot see certain colors in the spectrum. The colors will be mapped so as to give the human user the same type of color and the same color resolution as the animal. For example, the ultraviolet light that bees see would be mapped to red for humans to view; this works because bees cannot see the full range of red that humans can. For some animals, movement of objects in the visual field is critical to being able to see them at all, and the compound eyes of insects also shape the way they see the world. All of these visual effects will be accounted for in the graphics engine.
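As one concrete example of such a transfer function, the bee mapping above could be expressed as a per-frame channel remap. A minimal sketch in Python with NumPy, assuming the renderer can supply an ultraviolet intensity channel alongside RGB (all names are illustrative):

    import numpy as np

    def bee_color_map(rgb, uv):
        """Remap a bee-perspective frame into the human-visible gamut:
        ultraviolet drives the red channel, since bees see little of the
        red that humans can; green and blue pass through unchanged."""
        out = np.empty_like(rgb)
        out[..., 0] = uv                  # ultraviolet shown as red
        out[..., 1] = rgb[..., 1]         # green passes through
        out[..., 2] = rgb[..., 2]         # blue passes through
        return np.clip(out, 0.0, 1.0)

    # Example on a random 64x64 frame with a matching UV channel
    frame = bee_color_map(np.random.rand(64, 64, 3), np.random.rand(64, 64))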
The system
will consist of several identical cubicles spaced out in a ring with
the entrances
to the cubicles facing outward from the center. Each cubicle will be
designed to minimize acoustical transmission through the walls. Someone
who desires to interact with the system will walk into one of the cubicles.
In the cubicle the user will find a VR headset, headphones, two force-feedback
gloves with trackers, and pressure-sensitive pads on the floor.
When
a user enters the cubicle, a screen on an inside wall will present instructions
on how to put on all the equipment and then begin the simulation. The
user can also go through an optional tutorial on how to use the equipment and move around in the virtual environment using the gloves and the pressure pads on the floor.
On the outside, above each cubicle, there will be a large display screen that allows people outside to monitor what each user is experiencing visually.