Naila Ayala, Suzanne Kearns, Elizabeth Irving, Shi Cao, Ewa Niechwiej-Szwedo
{"title":"眼动追踪在复杂航空任务中的信息处理。","authors":"Naila Ayala, Suzanne Kearns, Elizabeth Irving, Shi Cao, Ewa Niechwiej-Szwedo","doi":"10.3791/66886","DOIUrl":null,"url":null,"abstract":"<p><p>Eye tracking has been used extensively as a proxy to gain insight into the cognitive, perceptual, and sensorimotor processes that underlie skill performance. Previous work has shown that traditional and advanced gaze metrics reliably demonstrate robust differences in pilot expertise, cognitive load, fatigue, and even situation awareness (SA). This study describes the methodology for using a wearable eye tracker and gaze mapping algorithm that captures naturalistic head and eye movements (i.e., gaze) in a high-fidelity flight motionless simulator. The method outlined in this paper describes the area of interest (AOI)-based gaze analyses, which provides more context related to where participants are looking, and dwell time duration, which indicates how efficiently they are processing the fixated information. The protocol illustrates the utility of a wearable eye tracker and computer vision algorithm to assess changes in gaze behavior in response to an unexpected in-flight emergency. Representative results demonstrated that gaze was significantly impacted when the emergency event was introduced. Specifically, attention allocation, gaze dispersion, and gaze sequence complexity significantly decreased and became highly concentrated on looking outside the front window and at the airspeed gauge during the emergency scenario (all p values < 0.05). The utility and limitations of employing a wearable eye tracker in a high-fidelity motionless flight simulation environment to understand the spatiotemporal characteristics of gaze behavior and its relation to information processing in the aviation domain are discussed.</p>","PeriodicalId":48787,"journal":{"name":"Jove-Journal of Visualized Experiments","volume":" 218","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2025-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Eye Tracking During A Complex Aviation Task For Insights Into Information Processing.\",\"authors\":\"Naila Ayala, Suzanne Kearns, Elizabeth Irving, Shi Cao, Ewa Niechwiej-Szwedo\",\"doi\":\"10.3791/66886\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Eye tracking has been used extensively as a proxy to gain insight into the cognitive, perceptual, and sensorimotor processes that underlie skill performance. Previous work has shown that traditional and advanced gaze metrics reliably demonstrate robust differences in pilot expertise, cognitive load, fatigue, and even situation awareness (SA). This study describes the methodology for using a wearable eye tracker and gaze mapping algorithm that captures naturalistic head and eye movements (i.e., gaze) in a high-fidelity flight motionless simulator. The method outlined in this paper describes the area of interest (AOI)-based gaze analyses, which provides more context related to where participants are looking, and dwell time duration, which indicates how efficiently they are processing the fixated information. The protocol illustrates the utility of a wearable eye tracker and computer vision algorithm to assess changes in gaze behavior in response to an unexpected in-flight emergency. Representative results demonstrated that gaze was significantly impacted when the emergency event was introduced. 
Specifically, attention allocation, gaze dispersion, and gaze sequence complexity significantly decreased and became highly concentrated on looking outside the front window and at the airspeed gauge during the emergency scenario (all p values < 0.05). The utility and limitations of employing a wearable eye tracker in a high-fidelity motionless flight simulation environment to understand the spatiotemporal characteristics of gaze behavior and its relation to information processing in the aviation domain are discussed.</p>\",\"PeriodicalId\":48787,\"journal\":{\"name\":\"Jove-Journal of Visualized Experiments\",\"volume\":\" 218\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2025-04-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Jove-Journal of Visualized Experiments\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.3791/66886\",\"RegionNum\":4,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Jove-Journal of Visualized Experiments","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.3791/66886","RegionNum":4,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Eye Tracking During A Complex Aviation Task For Insights Into Information Processing.
Eye tracking has been used extensively as a proxy to gain insight into the cognitive, perceptual, and sensorimotor processes that underlie skill performance. Previous work has shown that traditional and advanced gaze metrics reliably demonstrate robust differences in pilot expertise, cognitive load, fatigue, and even situation awareness (SA). This study describes the methodology for using a wearable eye tracker and a gaze mapping algorithm that captures naturalistic head and eye movements (i.e., gaze) in a high-fidelity, motionless flight simulator. The method outlined in this paper describes area of interest (AOI)-based gaze analyses, which provide more context about where participants are looking, and dwell time duration, which indicates how efficiently they process the fixated information. The protocol illustrates the utility of a wearable eye tracker and a computer vision algorithm for assessing changes in gaze behavior in response to an unexpected in-flight emergency. Representative results demonstrated that gaze was significantly affected when the emergency event was introduced. Specifically, gaze dispersion and gaze sequence complexity decreased significantly, and attention allocation became highly concentrated on the view outside the front window and on the airspeed gauge during the emergency scenario (all p values < 0.05). The utility and limitations of employing a wearable eye tracker in a high-fidelity, motionless flight simulation environment to understand the spatiotemporal characteristics of gaze behavior and its relation to information processing in the aviation domain are discussed.
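As a rough illustration of the kind of analysis the abstract describes (not the authors' published pipeline), the sketch below shows how per-AOI dwell time, attention allocation, and a simple transition-entropy measure of gaze sequence complexity could be computed from AOI-labelled fixations. The AOI names, the fixation record format, and the entropy formulation are all assumptions made for the example.

```python
# Minimal sketch (illustrative only, not the published method): per-AOI dwell
# time, attention allocation, and gaze-transition entropy from AOI-labelled
# fixations. AOI names and the data format are hypothetical.
from collections import Counter
import math

# Each fixation: (AOI label, fixation duration in seconds) -- made-up data
fixations = [
    ("front_window", 1.20),
    ("airspeed", 0.45),
    ("front_window", 0.90),
    ("altimeter", 0.30),
    ("front_window", 1.50),
    ("airspeed", 0.60),
]

def dwell_time_per_aoi(fixations):
    """Sum fixation durations within each AOI (total dwell time)."""
    dwell = Counter()
    for aoi, duration in fixations:
        dwell[aoi] += duration
    return dict(dwell)

def attention_allocation(fixations):
    """Proportion of total viewing time spent on each AOI."""
    dwell = dwell_time_per_aoi(fixations)
    total = sum(dwell.values())
    return {aoi: t / total for aoi, t in dwell.items()}

def transition_entropy(fixations):
    """Shannon entropy of AOI-to-AOI transitions: one simple way to quantify
    gaze sequence complexity (lower values = more stereotyped scanning)."""
    transitions = Counter(
        (a, b) for (a, _), (b, _) in zip(fixations, fixations[1:]) if a != b
    )
    n = sum(transitions.values())
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in transitions.values())

if __name__ == "__main__":
    print("Dwell time (s):", dwell_time_per_aoi(fixations))
    print("Attention allocation:", attention_allocation(fixations))
    print("Transition entropy (bits):", round(transition_entropy(fixations), 3))
```

Under this framing, the reported emergency-scenario result would appear as a larger share of attention allocation on the front window and airspeed AOIs and a lower transition entropy, consistent with gaze becoming more concentrated and less complex.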
Journal introduction:
JoVE, the Journal of Visualized Experiments, is the world's first peer-reviewed scientific video journal. Established in 2006, JoVE is devoted to publishing scientific research in a visual format to help researchers overcome two of the biggest challenges facing the scientific research community today: poor reproducibility and the time- and labor-intensive nature of learning new experimental techniques.