Using commercially available position trackers (e.g., HTC Vive Trackers or OptiTrack), it is possible to include physical objects in the virtual world. By tracking real-world objects and creating their virtual counterparts, users can touch a virtual object in VR and simultaneously feel its physical counterpart. This greatly increases immersion in virtual environments.
Perfectly tracking objects and users' hands, however, is challenging. Imperfections in positional tracking can lead to offsets between a virtual and a physical object, even though they should be perfectly aligned. There exists a large body of work on improving tracking hardware, e.g., using magnetic or optical tracking. It is unclear, however, what the influence of such tracking errors is, and how accurate position tracking actually has to be for users to perceive it as error-free.
The goal of this project is to simulate tracking errors of different magnitudes, build a data set of tracking errors, and create a model of perceived error. The model should incorporate the relationship between geometry and perceived errors. As an example, perceived errors might be stronger at edges and with respect to texture, and less pronounced on flat surfaces. The experimental setup should be implemented in Unity; VR is provided through an HTC Vive Pro headset.
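To make the simulation of tracking errors concrete, here is a minimal sketch of how offsets of controlled magnitude could be sampled per trial. This is illustrative only: the function name and the use of NumPy are assumptions, and in the actual study the sampled offset would be applied to the virtual object's transform in Unity rather than computed in Python.

```python
import numpy as np

def apply_tracking_error(true_pos, magnitude_mm, rng):
    """Displace a tracked position by a random direction of fixed magnitude.

    true_pos: physical object's position (3-vector, millimeters).
    magnitude_mm: desired offset between virtual and physical object.
    rng: a numpy random Generator, for reproducible trial sequences.
    """
    direction = rng.normal(size=3)          # isotropic random direction
    direction /= np.linalg.norm(direction)  # normalize to unit length
    return true_pos + magnitude_mm * direction

rng = np.random.default_rng(seed=0)
physical_pos = np.zeros(3)
virtual_pos = apply_tracking_error(physical_pos, magnitude_mm=5.0, rng=rng)
offset = np.linalg.norm(virtual_pos - physical_pos)  # exactly 5.0 mm
```

Sampling the direction isotropically while fixing the magnitude lets the experiment vary error size as the independent variable without confounding it with error direction; direction-dependent conditions (e.g., offsets along vs. across an edge) could be added by constraining `direction`.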
Implement a VR environment in Unity to test the influence of tracking errors.
# Literature review on visuo-haptic perception and VR
# Implementation of a VR test environment in Unity
## Track hands using Leap Motion or similar
## Track physical objects using HTC Vive tracker or similar
## Manipulate alignment between virtual and physical object
# Performing visuo-haptic experiment
David Lindlbauer (email@example.com, https://ait.ethz.ch/people/lindlbauer/)
IDEA League Student Grant (IDL)
CLS Student Project (MPG ETH CLS)
ETH Organization's Labels (ETHZ)
Information, Computing and Communication Sciences