{"title":"在旅途中使用增强现实:了解移动性对用户性能的影响以及眼睛、头部和手部光线指向的主观工作量","authors":"Yonghwan Shin , Augusto Esteves , Ian Oakley","doi":"10.1016/j.ijhcs.2025.103597","DOIUrl":null,"url":null,"abstract":"<div><div>Augmented Reality (AR) HMDs are the latest iteration in wearable computing, and the lightweight and portable form factors currently emerging are particularly suited for mobile use — they offer the potential for seamless, discreet, and contextual information to users on the go. Despite this potential, studies of input on HMDs rarely consider mobility issues. This paper seeks to rectify this omission in the context of ray pointing, one of the most essential and general-purpose input modalities in this space. We present the first study (N=24) contrasting user performance on HMDs across <em>eye</em>, <em>head</em>, and <em>hand</em> ray pointing while standing and walking, for both dwell and pinch gesture activations. Our results indicate walking is highly disruptive to interactions with conventional HMD UIs — in general, success rates fall precipitously while selection times rise steeply while users walk. Variations in performance between modalities and activation techniques shed light on how input techniques that are more resilient to motion could be constructed. 
Building on these findings, we discuss design considerations for ray pointing and interface and interaction technique designs for HMDs that may be better suited to mobile scenarios.</div></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"204 ","pages":"Article 103597"},"PeriodicalIF":5.1000,"publicationDate":"2025-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Using Augmented Reality on the go: Understanding the effects of mobility on user performance and subjective workload across eye, head, and hand ray pointing\",\"authors\":\"Yonghwan Shin , Augusto Esteves , Ian Oakley\",\"doi\":\"10.1016/j.ijhcs.2025.103597\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Augmented Reality (AR) HMDs are the latest iteration in wearable computing, and the lightweight and portable form factors currently emerging are particularly suited for mobile use — they offer the potential for seamless, discreet, and contextual information to users on the go. Despite this potential, studies of input on HMDs rarely consider mobility issues. This paper seeks to rectify this omission in the context of ray pointing, one of the most essential and general-purpose input modalities in this space. We present the first study (N=24) contrasting user performance on HMDs across <em>eye</em>, <em>head</em>, and <em>hand</em> ray pointing while standing and walking, for both dwell and pinch gesture activations. Our results indicate walking is highly disruptive to interactions with conventional HMD UIs — in general, success rates fall precipitously while selection times rise steeply while users walk. Variations in performance between modalities and activation techniques shed light on how input techniques that are more resilient to motion could be constructed. 
Building on these findings, we discuss design considerations for ray pointing and interface and interaction technique designs for HMDs that may be better suited to mobile scenarios.</div></div>\",\"PeriodicalId\":54955,\"journal\":{\"name\":\"International Journal of Human-Computer Studies\",\"volume\":\"204 \",\"pages\":\"Article 103597\"},\"PeriodicalIF\":5.1000,\"publicationDate\":\"2025-08-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Human-Computer Studies\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1071581925001545\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, CYBERNETICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581925001545","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Using Augmented Reality on the go: Understanding the effects of mobility on user performance and subjective workload across eye, head, and hand ray pointing
Augmented Reality (AR) HMDs are the latest iteration in wearable computing, and the lightweight, portable form factors currently emerging are particularly suited for mobile use — they offer the potential for seamless, discreet, and contextual information delivery to users on the go. Despite this potential, studies of input on HMDs rarely consider mobility issues. This paper seeks to rectify this omission in the context of ray pointing, one of the most essential and general-purpose input modalities in this space. We present the first study (N=24) contrasting user performance on HMDs across eye, head, and hand ray pointing while standing and walking, for both dwell and pinch gesture activations. Our results indicate that walking is highly disruptive to interactions with conventional HMD UIs — in general, success rates fall precipitously and selection times rise steeply while users walk. Variations in performance between modalities and activation techniques shed light on how input techniques that are more resilient to motion could be constructed. Building on these findings, we discuss design considerations for ray pointing, as well as interface and interaction technique designs for HMDs that may be better suited to mobile scenarios.
Journal introduction:
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...