{"title":"AF-Mix: A gaze-aware learning system with attention feedback in mixed reality","authors":"Shi Liu, Peyman Toreini, Alexander Maedche","doi":"10.1016/j.ijhcs.2025.103467","DOIUrl":null,"url":null,"abstract":"<div><div>Mixed Reality (MR) has demonstrated its potential in various learning contexts. MR-based learning environments empower users to actively explore learning content visualized in multiple formats, such as 3D models, videos, and images. Nonetheless, the sophisticated visualizations in MR learning environments may result in potential visual overload, posing a challenge for users in efficiently allocating their attention. In this paper, we present AF-Mix, a learning support system that leverages eye tracking sensors in Microsoft HoloLens 2 to offer attention feedback for learners. Aiming to design AF-Mix, we conducted a participatory design study and integrated the attention feedback into our system, following users’ needs and suggestions. Furthermore, we evaluated AF-Mix in an evaluation study (n <span><math><mo>=</mo></math></span> 22) following a quantitative analysis of users’ visual behavior, as well as a qualitative analysis of interview transcripts. Our findings show that providing feedback to support the learning process can be achieved effectively with eye tracking. In specific, attention feedback assists learners in retrieving previously missed information and encourages learners to reallocate their attention in the review process. Moreover, providing personalized feedback based on previous attention allocation is more effective in supporting users than a self-review approach without gaze-aware assistance in MR. Such feedback facilitates users in managing their limited attentional resources better and supports the reflection of their learning journey more effectively.</div></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"198 ","pages":"Article 103467"},"PeriodicalIF":5.3000,"publicationDate":"2025-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581925000242","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0
Abstract
Mixed Reality (MR) has demonstrated its potential in various learning contexts. MR-based learning environments empower users to actively explore learning content visualized in multiple formats, such as 3D models, videos, and images. However, the sophisticated visualizations in MR learning environments may cause visual overload, making it challenging for users to allocate their attention efficiently. In this paper, we present AF-Mix, a learning support system that leverages the eye tracking sensors of the Microsoft HoloLens 2 to offer attention feedback to learners. To design AF-Mix, we conducted a participatory design study and integrated attention feedback into our system based on users' needs and suggestions. We then evaluated AF-Mix in an evaluation study (n = 22) combining a quantitative analysis of users' visual behavior with a qualitative analysis of interview transcripts. Our findings show that eye tracking can be used effectively to provide feedback that supports the learning process. Specifically, attention feedback assists learners in retrieving previously missed information and encourages them to reallocate their attention during the review process. Moreover, personalized feedback based on previous attention allocation supports users more effectively than a self-review approach without gaze-aware assistance in MR. Such feedback helps users better manage their limited attentional resources and supports reflection on their learning journey more effectively.
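The abstract does not detail how the attention feedback is computed. As a rough illustration of one way such gaze-aware feedback could work, the sketch below accumulates gaze dwell time per area of interest (AOI) and flags learning objects that received little attention, which could then be surfaced during a review phase. This is an assumption-laden sketch, not the authors' AF-Mix implementation: the names GazeSample, AttentionTracker, and the 2-second threshold are hypothetical, and it assumes the MR runtime already maps eye-gaze rays to named learning objects.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Illustrative sketch only: NOT the authors' AF-Mix implementation.
# Assumes gaze samples are already mapped to named areas of interest (AOIs),
# e.g. via eye-gaze raycasts against 3D learning objects in the MR scene.

@dataclass
class GazeSample:
    timestamp: float        # seconds since session start
    aoi: Optional[str]      # learning object hit by the gaze ray, or None

@dataclass
class AttentionTracker:
    """Accumulates dwell time per AOI and flags under-attended content."""
    dwell: Dict[str, float] = field(default_factory=dict)
    _last_ts: Optional[float] = None
    _last_aoi: Optional[str] = None

    def update(self, sample: GazeSample) -> None:
        # Credit the interval since the previous sample to the AOI that was
        # being fixated during that interval.
        if self._last_ts is not None and self._last_aoi is not None:
            dt = sample.timestamp - self._last_ts
            self.dwell[self._last_aoi] = self.dwell.get(self._last_aoi, 0.0) + dt
        self._last_ts = sample.timestamp
        self._last_aoi = sample.aoi

    def under_attended(self, all_aois: List[str], min_seconds: float = 2.0) -> List[str]:
        """Return AOIs whose accumulated dwell time is below the threshold.

        These would be candidates for attention feedback in a review phase
        (the 2-second threshold is an arbitrary placeholder, not a value
        reported in the paper)."""
        return [a for a in all_aois if self.dwell.get(a, 0.0) < min_seconds]


# Example: three learning objects, one of which is never fixated.
tracker = AttentionTracker()
for t, aoi in [(0.0, "3d_model"), (0.5, "3d_model"), (1.0, "video"),
               (3.5, "video"), (4.0, None), (4.5, "3d_model")]:
    tracker.update(GazeSample(t, aoi))

print(tracker.under_attended(["3d_model", "video", "image_panel"]))
# -> ['3d_model', 'image_panel']  (both received < 2 s of accumulated dwell)
```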
Journal description:
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...