2025 Webinar: Eye tracking for human performance improvements

Wednesday, June 11, 2025 @ 10am EST

Register here: https://learn.hfes.org/products/hfes-perception-and-performance-technical-group-webinar-eye-tracking-how-to-capture-and-interpret-users-point-of-view-and-operators-gaze-strategies

Abstract

Eye tracking technologies are becoming more accessible and capable, enabling deeper insights into human attention and information processing. This upcoming webinar, hosted by the Perception and Performance Technical Group (PPTG), explores applications of eye tracking technologies across different domains. Dr. Sampath Jayarathna will introduce Meta’s Project Aria, which advances AR and AI through egocentric data collection with smart glasses to enhance spatial awareness and real-world interactions. Ir. Rutger Stuut will discuss a study aimed at better understanding gaze strategies in bridge operators for improved error management, emphasizing the value of gaze behavior research for understanding and improving task performance.

Speaker 1:

Our first speaker is Dr. Sampath Jayarathna. He is an Associate Professor of Computer Science at Old Dominion University, where he directs the Neuro-Information Retrieval and Data Science Lab. Dr. Jayarathna earned a Ph.D. in Computer Science from Texas A&M University, College Station, in 2016. He is currently a Computer Research Scientist at NASA Langley (on sabbatical leave) and completed a summer fellowship at the Naval Surface Warfare Center Dahlgren Division, Dam Neck Activity, funded by the Office of Naval Research Faculty Fellowship program. His research interests include eye tracking, XR, drones, applied machine learning, information retrieval, and HCI.

Abstract: Egocentric systems are technologies designed to capture and interpret data from a first-person perspective, simulating the user’s point of view. These systems use wearable devices, such as smart glasses or head-mounted cameras, to collect visual and audio information reflecting what the user sees and hears, enabling a more immersive and contextually relevant experience. His talk will introduce Meta’s Project Aria, an ambitious initiative aimed at advancing AR and AI by leveraging egocentric data collection through smart glasses. These glasses are designed to capture and analyze data from the wearer’s perspective, enabling researchers to develop context-aware AR experiences and refine AI algorithms for spatial awareness, object recognition, and real-world interactions.

Speaker 2:

Our second speaker is Ir. Rutger Stuut. He is a senior human factors advisor at the executive agency of the Ministry of Infrastructure and Water Management (Rijkswaterstaat), with a background in Human-Technology Interaction (MSc) and aviation engineering (BEng, Honours). He is also an (external) PhD candidate at Utrecht University (experimental psychology), supported by Rijkswaterstaat. His PhD research examines (systematic) gaze strategies in remote nautical object (bridge/lock) control. 

Abstract: Gaining a better understanding of how professional operators view their closed-circuit television (CCTV) streams during remote operation is useful, because accident analyses have shown that crucial information is sometimes missed. However, eye gaze research in this domain is so far limited. In a previous qualitative study, he found that operators use different gaze patterns when operating locks, but these patterns were not always judged systematic in expert evaluations (Stuut et al., 2024, JCEDM). It is generally believed that a complete and systematic viewing approach mitigates the risk of observer errors, and it is therefore also part of operator training. However, the effectiveness of this approach has not been empirically validated. His current work quantitatively tests the effectiveness of systematic gaze for bridge control in a controlled simulation and examines how this transfers to task execution in real control rooms. His talk will focus specifically on the simulation study.
