How Close is Close Enough? Investigating Spatial Mismatch for Haptics in VR


Starting Date: earliest start 2019-01-13, latest end 2019-09-30

Organization: Advanced Interactive Technologies

Involved Host(s): Lindlbauer David

Abstract: The goal of this project is to simulate tracking errors of different magnitudes in VR, build a data set of tracking errors, and create a model of perceived error.

Description: Using commercially available position trackers (e.g., HTC Vive trackers or OptiTrack), it is possible to include physical objects in the virtual world. By tracking real-world objects and creating their virtual counterparts, users can touch a virtual object in VR and at the same time touch the physical object. This greatly increases immersion in virtual environments. Perfectly tracking objects and users' hands, however, is challenging. Imperfections in positional tracking can lead to offsets between a virtual and a physical object, even though they should be perfectly aligned. There exists a large body of work on improving tracking hardware, e.g., using magnetic or optical tracking. It is unclear, however, what the influence of tracking errors is, and how accurate position tracking actually has to be for users to perceive it as errorless. The goal of this project is to simulate tracking errors of different magnitudes, build a data set of tracking errors, and create a model of perceived error. The model should incorporate the relationship between geometry and perceived errors. As an example, perceived errors might be stronger at edges and with respect to texture, and less pronounced on flat surfaces. The experimental setup should be implemented in Unity; VR is provided through an HTC Vive Pro headset.
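As a starting point, the offset manipulation could look roughly like the following Unity C# sketch. The component and field names are illustrative, and the physical object's pose is assumed to be exposed through an existing tracker-driven Transform (e.g., via SteamVR): the virtual counterpart follows the physical object but is displaced by a controlled error of a chosen magnitude and direction.

```csharp
using UnityEngine;

// Minimal sketch (illustrative names): the virtual counterpart of a tracked
// physical object is rendered with a controllable positional offset, so that a
// tracking error of a chosen magnitude can be simulated.
public class SimulatedTrackingError : MonoBehaviour
{
    // Transform driven by the real tracker (e.g., an HTC Vive tracker via SteamVR).
    public Transform trackedObject;

    // Transform of the virtual counterpart shown in the headset.
    public Transform virtualCounterpart;

    // Error magnitude in metres and direction in the tracked object's local frame.
    public float errorMagnitude = 0.005f;
    public Vector3 errorDirection = Vector3.right;

    void LateUpdate()
    {
        // Copy the tracked pose and displace the virtual object by the controlled error.
        Vector3 offset = trackedObject.TransformDirection(errorDirection.normalized) * errorMagnitude;
        virtualCounterpart.position = trackedObject.position + offset;
        virtualCounterpart.rotation = trackedObject.rotation;
    }
}
```

Applying the offset in LateUpdate keeps the displacement in sync with the most recent tracking data of each frame.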

Goal: Implement a VR environment to test the influence of tracking errors in Unity.

Work packages:
1. Literature review on visuo-haptic perception and VR
2. Implementation of a VR test environment in Unity
   - Track hands using a Leap Motion or similar
   - Track physical objects using an HTC Vive tracker or similar
   - Manipulate the alignment between the virtual and the physical object
3. Performing the visuo-haptic experiment (see the sketch below)
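For the experiment itself, the offset magnitude could, for instance, be driven by a standard adaptive staircase. The project description does not prescribe a particular psychophysical procedure, so the following two-down/one-up sketch (plain C#, illustrative names) is only one possible option: it lowers the simulated error after two consecutive detections and raises it after a single miss, converging towards the offset that is noticed in roughly 71% of trials.

```csharp
// Illustrative two-down/one-up staircase (an assumption; the project description
// does not prescribe a procedure). The offset shrinks after two consecutive
// detections of the mismatch and grows after a single miss.
public class OffsetStaircase
{
    public float Offset { get; private set; }   // current offset magnitude in metres
    private readonly float step;                 // step size in metres
    private int consecutiveDetections = 0;

    public OffsetStaircase(float startOffset, float step)
    {
        Offset = startOffset;
        this.step = step;
    }

    // Call once per trial with the participant's yes/no response.
    public void ReportTrial(bool mismatchNoticed)
    {
        if (mismatchNoticed)
        {
            if (++consecutiveDetections >= 2)
            {
                consecutiveDetections = 0;
                Offset = System.Math.Max(0f, Offset - step);  // harder: shrink the error
            }
        }
        else
        {
            consecutiveDetections = 0;
            Offset += step;                                   // easier: enlarge the error
        }
    }
}
```

Each trial would then read Offset to set the error applied by the tracking-error component and report the participant's response via ReportTrial.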

Contact Details: David Lindlbauer (david.lindlbauer@inf.ethz.ch, https://ait.ethz.ch/people/lindlbauer/)

Keywords: Virtual reality, haptics, user modeling, psychophysics


Labels: IDEA League Student Grant (IDL), Semester Project, Bachelor Thesis, Master Thesis, CLS Student Project (MPG ETH CLS), ETH Organization's Labels (ETHZ)
Topics: Information, Computing and Communication Sciences
Applicant Organizations: ETH Zurich