Project Title:
Human-Machine Interface providing live human representation and interactivity in a Virtual Reality Environment
94-1 07.05 9797
Abstract:
It is desirable for virtual reality input devices to be minimally
invasive to the user while remaining cost effective in both hardware
and computational overhead. It is also desirable to know position
input precisely while faithfully representing the user's own hands
or body in the visual space. A traditional fiber-optic glove system
encumbers the user while providing relatively low position
resolution. Exoskeleton systems are more accurate, at the expense of
complexity and encumbrance. Rendering an accurate image for any such
system requires significant computational resources. The approach
proposed here, a combined laser-ranging and stereoscopic imaging
system, promises to optimize hardware cost, image quality,
computational resource requirements, and position input accuracy in
a non-encumbering system of modest cost.
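The abstract does not detail how depth is recovered, but stereoscopic imaging conventionally derives depth by triangulation from the disparity between rectified camera views. As a minimal sketch of that standard relation (the function name and the numeric values below are illustrative assumptions, not values from this proposal):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    focal_px     -- camera focal length in pixels
    baseline_m   -- separation between the two cameras, in meters
    disparity_px -- horizontal pixel offset of the feature between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 800 px focal length, 10 cm baseline, 40 px disparity -> 2.0 m
print(stereo_depth(800.0, 0.10, 40.0))
```

Because depth resolution degrades as disparity shrinks, a system of this kind could use laser ranging to anchor absolute distance while the stereo pair supplies the dense image of the user's hands or body.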
The anticipated direct application is a virtual reality input and
gesture-recognition device for broad virtual reality use. Specific
early direct applications include local and remote interactive
design, interactive training simulations, and video conferencing.
Indirect applications include computerized gesture recognition
applied to widely ranging problems, such as recognizing American
Sign Language and "keyboard-free" data input at Automated Teller
Machines.
Key Words
CyberSim Systems, Inc.
3334 Richmond
Suite 205
Houston, Texas 77098