Assessing Visual Attention in Gaze-Based VR Learning Through Eye-Tracking Measures

College

College of Sciences

Department

Computer Science

Graduate Level

Doctoral

Presentation Type

Poster Presentation

Abstract

Virtual and Augmented Reality (VR/AR) are becoming increasingly ubiquitous, with consumer-grade VR head-mounted displays (HMDs) making immersive experiences more accessible for everyday use. VR serves as an effective tool for learning and training by providing immersion, a sense of presence, and the ability to simulate risk-free environments that would otherwise be inaccessible or hazardous in real life. Gaze-based interaction in VR enhances user engagement by detecting and responding to the user’s visual attention within the virtual environment. A key aspect of such interaction is gaze-driven content rendering, which dynamically presents virtual objects or information within the user’s field of view (FOV) based on gaze direction. In VR learning and training environments, gaze-driven content rendering can improve the learning experience by adapting to the user’s attention, minimizing distractions, and keeping learners engaged with specific visual elements by displaying content precisely where they are looking.
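
The abstract does not specify an implementation, but the core placement step of gaze-driven rendering is straightforward: project a point a fixed distance along the gaze ray and anchor content there. A minimal Python sketch of this idea (all names and values below are illustrative, not from the study):

```python
import numpy as np

def place_panel_on_gaze(gaze_origin, gaze_direction, distance=1.5):
    """Position a content panel along the user's gaze ray.

    gaze_origin:    3D world-space position of the eyes/head.
    gaze_direction: gaze direction vector (normalized below).
    distance:       how far in front of the user to place the panel, in meters.
    """
    d = np.asarray(gaze_direction, dtype=float)
    d = d / np.linalg.norm(d)
    # Anchor the panel on the gaze ray so content appears exactly
    # where the user is looking.
    return np.asarray(gaze_origin, dtype=float) + distance * d

# Example: a user at standing eye height, glancing slightly down and to the right.
panel_position = place_panel_on_gaze([0.0, 1.6, 0.0], [0.1, -0.05, 1.0])
```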

Understanding users' focus and visual scanning behavior can help optimize their engagement in immersive environments. Eye-tracking is a widely used, non-invasive technique for measuring human visual attention, offering valuable insights into how individuals process visual information. In gaze-based VR learning environments, eye-tracking measures can help analyze how learners interact with the virtual environment, focus on content, and engage with learning materials. In this study, we present a framework for assessing visual attention in a gaze-based VR learning environment using eye-tracking measures. Our framework computes basic and advanced gaze metrics, including fixation duration, saccade amplitude, and the ambient/focal attention coefficient K, as indicators of visual attention in VR. To facilitate analysis, the generated gaze data are visualized in an advanced gaze analytics dashboard, allowing us to examine users' gaze behaviors and attention patterns during interactive VR learning tasks.
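
Of these metrics, the ambient/focal attention coefficient K, introduced by Krejtz et al. (2016), is the least self-explanatory: it contrasts the z-scored duration of each fixation with the z-scored amplitude of the saccade that follows it, so positive values indicate focal attention (long fixations, short saccades) and negative values indicate ambient scanning. A minimal sketch of the computation, assuming fixations and saccades have already been segmented (function and argument names are illustrative):

```python
import numpy as np

def coefficient_k(fixation_durations, saccade_amplitudes):
    """Ambient/focal attention coefficient K.

    fixation_durations: duration of each fixation i (e.g., in ms).
    saccade_amplitudes: amplitude of the saccade following fixation i
                        (in degrees); arrays are aligned so that
                        saccade_amplitudes[i] follows fixation_durations[i].
    """
    d = np.asarray(fixation_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    # z-score each measure, then contrast them per fixation-saccade pair.
    k_i = (d - d.mean()) / d.std() - (a - a.mean()) / a.std()
    # Mean over all pairs: K > 0 suggests focal attention,
    # K < 0 suggests ambient scanning.
    return k_i.mean()
```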

We conducted a pilot user study to evaluate the utility of the proposed framework. We designed a VR learning application with a gaze-driven content rendering feature to facilitate gaze-based interaction. The VR learning environment was developed for a consumer-grade Meta Quest Pro headset, with eye-tracking data captured through its built-in eye tracker. The proposed framework was then applied to generate gaze measures for analyzing users' visual attention during the gaze-based VR learning task. This study contributes a novel approach to integrating advanced eye-tracking technology into VR learning environments, specifically leveraging consumer-grade HMDs.
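
The abstract does not describe how raw gaze samples are segmented into the fixations and saccades the framework measures; a common first step is a velocity-threshold (I-VT) classifier. A minimal sketch, assuming timestamped unit gaze-direction vectors logged from the headset (the 30 deg/s cutoff is a conventional default, not a value taken from the study):

```python
import numpy as np

def ivt_labels(timestamps, gaze_dirs, velocity_threshold=30.0):
    """Label inter-sample transitions as fixation-like (True) or
    saccade-like (False) using a velocity-threshold (I-VT) rule.

    timestamps:         sample times in seconds, shape (n,).
    gaze_dirs:          unit gaze-direction vectors, shape (n, 3).
    velocity_threshold: angular-velocity cutoff in deg/s.
    """
    t = np.asarray(timestamps, dtype=float)
    g = np.asarray(gaze_dirs, dtype=float)
    # Angular distance between consecutive gaze directions, in degrees.
    cos_theta = np.clip(np.einsum("ij,ij->i", g[:-1], g[1:]), -1.0, 1.0)
    angles = np.degrees(np.arccos(cos_theta))
    # Transitions slower than the cutoff are treated as part of fixations.
    velocity = angles / np.diff(t)
    return velocity < velocity_threshold
```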

Keywords

Human-Computer Interaction, Eye Tracking, Visual Attention, Virtual Reality (VR), VR Learning, Gaze-Based Interaction, Gaze Measures, Meta Quest Pro
