Human cognitive load (e.g., how difficult a task is) can be estimated by measuring how frequently the pupil constricts and dilates; this information is captured by the Index of Pupillary Activity (IPA). It is unclear, however, how well the IPA transfers to tasks performed in Virtual Reality (VR), and what its signal-to-noise ratio is.
The goal of this project is to re-implement the IPA (Duchowski et al., The Index of Pupillary Activity, CHI 2018) and to build a flexible framework for cognitive load experiments in VR. The reference source code for computing the IPA is available (approx. 50 lines of Python, including the wavelet decomposition) and should be ported to, or connected with, Unity. Several psychological tests (e.g., n-back, Stroop) and a logging mechanism should be implemented so that the framework can be used flexibly. In the final step of the project, a user study should be conducted to validate the IPA in VR. The experiments will be performed with an HTC Vive Pro headset equipped with a Pupil eye-tracker.
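To give a sense of what the ~50 lines of IPA code compute, here is a minimal, simplified sketch: a two-level wavelet decomposition of the pupil-diameter signal, hard thresholding of the detail coefficients, and counting the surviving modulus maxima per second. Note the simplifications: the paper uses a sym16 wavelet, whereas this sketch substitutes a Haar transform to stay dependency-free (NumPy only), so the function name and exact numbers are illustrative, not the authors' implementation.

```python
import numpy as np

def ipa_sketch(pupil_diameter, sample_rate_hz):
    """Rough IPA-style estimate: thresholded wavelet detail maxima per second.

    Simplified stand-in for Duchowski et al. (CHI 2018); uses a Haar
    transform instead of the paper's sym16 wavelet.
    """
    x = np.asarray(pupil_diameter, dtype=float)
    x = x[: (x.size // 4) * 4]  # trim so two halvings divide evenly
    # Level-1 approximation, then level-2 detail coefficients (Haar).
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)
    # Hard-threshold with the universal threshold sigma * sqrt(2 ln n).
    sigma = np.median(np.abs(detail)) / 0.6745  # robust noise estimate
    lam = sigma * np.sqrt(2.0 * np.log(detail.size))
    detail[np.abs(detail) <= lam] = 0.0
    # Count local modulus maxima among the surviving coefficients.
    mag = np.abs(detail)
    maxima = (mag[1:-1] > mag[:-2]) & (mag[1:-1] >= mag[2:]) & (mag[1:-1] > 0.0)
    duration_s = x.size / float(sample_rate_hz)
    return float(np.count_nonzero(maxima)) / duration_s
```

A Unity port would apply the same steps to the diameter stream delivered by the headset's eye tracker, with the count normalized by the recording duration so that sessions of different lengths remain comparable.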
The project will result in an implementation of the IPA in VR, a ready-to-use Unity plugin, and a cognitive load study in VR.
Reimplement the IPA in VR and perform a cognitive load experiment.
# Literature review on human cognitive load estimation and eye-tracking
# Implementation of the IPA in VR
## Input is the eye-tracking data collected with a gaze tracker included in the VR headset
## Implementation of a logging mechanism
## Implementation as Unity package
# Performing a cognitive load experiment
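As a sketch of how one of the psychological tests and the logging mechanism might fit together, the following generates an n-back stimulus sequence and writes a per-trial CSV that can later be joined with the IPA signal. It is written in Python for brevity (the framework itself would live in Unity/C#), and all names and parameters (`make_nback_sequence`, `match_rate`, the consonant alphabet) are assumptions for illustration.

```python
import csv
import random

def make_nback_sequence(length, n, match_rate=0.3,
                        alphabet="BCDFGHJKLMNPQRSTVWXZ", seed=None):
    """Generate a letter stream for an n-back task.

    Roughly `match_rate` of positions after the first n are targets,
    i.e. the letter repeats the one shown n steps earlier.
    """
    rng = random.Random(seed)
    seq = [rng.choice(alphabet) for _ in range(n)]
    is_target = [False] * n
    for i in range(n, length):
        if rng.random() < match_rate:
            seq.append(seq[i - n])  # target: repeat the letter n back
            is_target.append(True)
        else:
            non_matches = [c for c in alphabet if c != seq[i - n]]
            seq.append(rng.choice(non_matches))
            is_target.append(False)
    return seq, is_target

def log_trials(path, seq, is_target):
    """Write one row per stimulus so responses can be aligned with the IPA."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["trial", "stimulus", "is_target"])
        for i, (s, t) in enumerate(zip(seq, is_target)):
            writer.writerow([i, s, int(t)])
```

Varying `n` (e.g., 1-back vs. 3-back) gives the graded difficulty levels needed to test whether the IPA tracks cognitive load in the VR study.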
David Lindlbauer (firstname.lastname@example.org, https://ait.ethz.ch/people/lindlbauer/)
Cognitive load measurements
index of pupillary activity
IDEA League Student Grant (IDL)
CLS Student Project (MPG ETH CLS)
ETH Organization's Labels (ETHZ)
Information, Computing and Communication Sciences