{"title":"Do you need help? Identifying and responding to pilots’ troubleshooting through eye-tracking and Large Language Model","authors":"Mengtao Lyu, Fan Li","doi":"10.1016/j.ijhcs.2025.103617","DOIUrl":null,"url":null,"abstract":"<div><div>In-time automation support is crucial for enhancing pilots’ performance and flight safety. While extensive research has been conducted on providing automation support to mitigate risks associated with the Out-of-the-Loop (OOTL) phenomenon, limited attention has been given to supporting pilots who are actively engaged, known as In-the-Loop (ITL) status. Despite their active engagement, ITL pilots face challenges in managing multiple tasks simultaneously without additional support. For instance, providing critical information through in-time automation support can significantly improve efficiency and flight safety when pilots need to visually troubleshoot unexpected incidents while monitoring the aircraft’s flying status. This study addresses the gap in ITL support by introducing a method that utilizes eye-tracking data tokenized into Visual Attention Matrices (VAMs), integrated with a Large Language Model (LLM) to identify and respond to troubleshooting activities of ITL pilots. We address two primary challenges: capturing the complex troubleshooting status of pilots, which blends with normal monitoring behaviors, and effectively processing non-semantic eye-tracking data using LLM. The proposed VAM approach provides a structured representation of visual attention that supports LLM reasoning, while empirical VAMs enhance the model’s ability to efficiently identify critical features. A case study involving 19 licensed pilots validates the efficacy of the proposed approach in identifying and responding to pilots’ troubleshooting activities. 
This research contributes significantly to adaptive Human–Computer Interaction (HCI) in aviation by improving support for ITL pilots, thereby laying a foundation for future advancements in human–AI collaboration within automated aviation systems.</div></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"205 ","pages":"Article 103617"},"PeriodicalIF":5.1000,"publicationDate":"2025-09-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581925001740","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0
Abstract
In-time automation support is crucial for enhancing pilots’ performance and flight safety. While extensive research has been conducted on providing automation support to mitigate risks associated with the Out-of-the-Loop (OOTL) phenomenon, limited attention has been given to supporting pilots who are actively engaged, a state known as In-the-Loop (ITL). Despite their active engagement, ITL pilots face challenges in managing multiple tasks simultaneously without additional support. For instance, providing critical information through in-time automation support can significantly improve efficiency and flight safety when pilots need to visually troubleshoot unexpected incidents while monitoring the aircraft’s flying status. This study addresses the gap in ITL support by introducing a method that tokenizes eye-tracking data into Visual Attention Matrices (VAMs), integrated with a Large Language Model (LLM) to identify and respond to the troubleshooting activities of ITL pilots. We address two primary challenges: capturing the complex troubleshooting status of pilots, which blends with normal monitoring behaviors, and effectively processing non-semantic eye-tracking data using an LLM. The proposed VAM approach provides a structured representation of visual attention that supports LLM reasoning, while empirical VAMs enhance the model’s ability to efficiently identify critical features. A case study involving 19 licensed pilots validates the efficacy of the proposed approach in identifying and responding to pilots’ troubleshooting activities. This research contributes significantly to adaptive Human–Computer Interaction (HCI) in aviation by improving support for ITL pilots, thereby laying a foundation for future advancements in human–AI collaboration within automated aviation systems.
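The abstract does not spell out how a Visual Attention Matrix is constructed or tokenized for the LLM, so the following is only an illustrative sketch of one common way to derive such a matrix from gaze data: counting fixation transitions between cockpit areas of interest (AOIs) and serializing the result as text. The AOI names, matrix layout, and token format here are assumptions, not the paper's actual method.

```python
# Illustrative sketch only: AOI names, the transition-count layout, and the
# token format are assumptions; the paper's exact VAM construction may differ.
import numpy as np

# Hypothetical cockpit areas of interest (AOIs).
AOIS = ["PFD", "ND", "ECAM", "MCP", "OUTSIDE"]
IDX = {a: i for i, a in enumerate(AOIS)}

def visual_attention_matrix(fixations):
    """Build a simple visual attention matrix: entry (i, j) counts gaze
    transitions from AOI i to AOI j; the diagonal counts re-fixations."""
    vam = np.zeros((len(AOIS), len(AOIS)), dtype=int)
    for src, dst in zip(fixations, fixations[1:]):
        vam[IDX[src], IDX[dst]] += 1
    return vam

def tokenize(vam):
    """Flatten the nonzero matrix entries into a text sequence that a
    text-only LLM could consume as structured, non-semantic input."""
    return " ".join(
        f"{a}->{b}:{vam[i, j]}"
        for i, a in enumerate(AOIS)
        for j, b in enumerate(AOIS)
        if vam[i, j] > 0
    )

gaze = ["PFD", "ND", "PFD", "ECAM", "ECAM", "PFD"]
print(tokenize(visual_attention_matrix(gaze)))
```

Serializing only the nonzero entries keeps the prompt short while preserving the transition structure that distinguishes troubleshooting scan patterns from routine monitoring.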
About the journal:
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...