
This is my first journal paper, completed as part of my PhD. It can be summarized as follows:

Experiences in virtual reality (VR) can easily break if the method used to evaluate subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which has recently become more standard in VR and is often used for content-dependent analyses. This research is an endeavor to utilize content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We induced mental load independently of the visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to capture the phenomenon of “staring into the distance” without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native-language and two foreign-language stimuli inside a virtual phone booth. The results show that the mean and variance of pupil size increased with increasing mental load. To a lesser extent, mental load also led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset, as if participants were looking through surfaces. These results are in agreement with previous studies. Overall, we encourage further research on content-independent eye metrics, in the hope that future hardware and algorithms will further improve tracking stability.

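To give an idea of what such a metric could look like, here is a minimal sketch in Python. It is only an illustration under simplifying assumptions and not the exact definition used in the paper: the gaze convergence point is taken as the closest approach of the two eye rays, and the distance to the attended surface is assumed to come from a ray cast against the scene geometry.

import numpy as np

def focus_offset(left_origin, left_dir, right_origin, right_dir, surface_depth):
    """Sketch of a focus offset: how far behind the attended surface the gaze converges.

    Simplifying assumptions (not the paper's exact definition):
    - the convergence point is the closest approach of the two gaze rays,
    - surface_depth is the distance to the first surface hit by the combined
      gaze ray, e.g. obtained via a ray cast in the VR engine.
    A positive value means the eyes converge behind the surface,
    as if "looking through" it.
    """
    left_dir = left_dir / np.linalg.norm(left_dir)
    right_dir = right_dir / np.linalg.norm(right_dir)

    # Closest-approach parameters of the two gaze rays.
    w0 = left_origin - right_origin
    a = np.dot(left_dir, left_dir)
    b = np.dot(left_dir, right_dir)
    c = np.dot(right_dir, right_dir)
    d = np.dot(left_dir, w0)
    e = np.dot(right_dir, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-9:  # gaze rays (nearly) parallel: no stable convergence point
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    convergence = 0.5 * ((left_origin + s * left_dir) + (right_origin + t * right_dir))

    # Vergence depth: distance from the midpoint between the eyes to the convergence point.
    cyclopean = 0.5 * (left_origin + right_origin)
    vergence_depth = np.linalg.norm(convergence - cyclopean)

    return vergence_depth - surface_depth

# Example (hypothetical values): eyes ~6.4 cm apart, gaze slightly converging,
# nearest surface 0.4 m away; a positive result means converging behind it.
offset = focus_offset(np.array([-0.032, 0.0, 0.0]), np.array([0.05, 0.0, 1.0]),
                      np.array([0.032, 0.0, 0.0]), np.array([-0.05, 0.0, 1.0]),
                      surface_depth=0.4)
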
The paper is open access and free to read at MDPI Sensors. You can also download the full text directly. A separate PDF document provides supplementary material. The full source code for the virtual reality experiment can be found here on GitHub. Also check out the Open Science Framework repository for data, code, and additional information.

Don’t forget to cite if you make use of this work! 🎁

@article{2023-Schirm-LanguageBoothVR,
  author = {Schirm, Johannes and Gómez-Vargas, Andrés Roberto and Perusquía-Hernández, Monica and Skarbez, Richard T. and Isoyama, Naoya and Uchiyama, Hideaki and Kiyokawa, Kiyoshi},
  title = {Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality},
  journal = {Sensors},
  year = {2023},
  volume = {23},
  number = {15},
  issn = {1424-8220},
  doi = {10.3390/s23156667},
  article-number = {6667},
}