Exploration de la Physicalité des Widgets pour l'Interaction Basée sur des mouvements de la Tête : le Cas des Menus en Réalité Mixte — Exploring the Physicality of Widgets for Head-Based Interaction: the Case of Menu in Mixed Reality

C. Bailly, F. Leitner, Laurence Nigay
{"title":"Exploration de la Physicalité des Widgets pour l’Interaction Basée sur des mouvements de la Tête le Cas des Menus en Réalité Mixte: Exploring the Physicality of Widgets for Head-Based Interaction the Case of Menu in Mixed Reality","authors":"C. Bailly, F. Leitner, Laurence Nigay","doi":"10.1145/3450522.3451326","DOIUrl":null,"url":null,"abstract":"Mixed Reality with a Head-Mounted Display (HMD) offers unique perspectives for head-based interaction with virtual content and widgets. Besides virtual widgets, physical objects can be anchors (mixed widgets) or directly materialised widgets (physical widgets). The physicality (virtual-mixed-physical) of widgets defines a new dimension for Mixed Reality (MR) interaction that extends existing taxonomies of widgets in MR. As a first step to explore this new dimension, we focus on a commonly used widget a menu. We thus evaluate the performance and usability of head pointing to a virtual, a mixed and a physical menu. Results suggest that pointing to a physical menu was on average 2s faster than pointing to a mixed or a virtual menu and preferred by participants. Virtual and mixed menus led to similar performances, but 11 participants over 15 preferred mixed menus over virtual ones. Based on our findings, we provide recommendations (benefits/limitations) for virtual, mixed and physical menus in MR.","PeriodicalId":330419,"journal":{"name":"Proceedings of the 32nd Conference on l'Interaction Homme-Machine","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 32nd Conference on l'Interaction Homme-Machine","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3450522.3451326","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Mixed Reality with a Head-Mounted Display (HMD) offers unique perspectives for head-based interaction with virtual content and widgets. Besides virtual widgets, physical objects can serve as anchors (mixed widgets) or directly materialise widgets (physical widgets). The physicality (virtual-mixed-physical) of widgets defines a new dimension for Mixed Reality (MR) interaction that extends existing taxonomies of widgets in MR. As a first step to explore this new dimension, we focus on a commonly used widget: a menu. We thus evaluate the performance and usability of head pointing to a virtual, a mixed and a physical menu. Results suggest that pointing to a physical menu was on average 2 s faster than pointing to a mixed or a virtual menu, and was preferred by participants. Virtual and mixed menus led to similar performance, but 11 of the 15 participants preferred mixed menus over virtual ones. Based on our findings, we provide recommendations (benefits/limitations) for virtual, mixed and physical menus in MR.
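To make the head-pointing technique mentioned in the abstract concrete, here is a minimal illustrative sketch (not taken from the paper) that models head pointing as a ray cast from the HMD pose along the head's forward direction, tested against a rectangular menu item. The function name, the rectangle representation and all parameters are assumptions for illustration only.

```python
import numpy as np

def head_ray_hit(head_pos, head_forward, item_center, item_size, item_normal):
    """Return True if the head ray intersects a rectangular menu item.

    Illustrative assumption: a menu item is a rectangle of size (width, height)
    centred at item_center, lying in the plane with normal item_normal.
    """
    head_forward = head_forward / np.linalg.norm(head_forward)
    denom = np.dot(head_forward, item_normal)
    if abs(denom) < 1e-6:          # ray parallel to the menu plane: no hit
        return False
    t = np.dot(item_center - head_pos, item_normal) / denom
    if t <= 0:                     # menu plane is behind the user
        return False
    hit = head_pos + t * head_forward
    # In-plane axes of the rectangle (assumes the menu is not horizontal).
    right = np.cross([0.0, 1.0, 0.0], item_normal)
    right = right / np.linalg.norm(right)
    up = np.cross(item_normal, right)
    dx = abs(np.dot(hit - item_center, right))
    dy = abs(np.dot(hit - item_center, up))
    return dx <= item_size[0] / 2 and dy <= item_size[1] / 2

# Example: a 30 cm x 10 cm item 1.5 m in front of the user, facing them.
if __name__ == "__main__":
    print(head_ray_hit(
        head_pos=np.array([0.0, 1.6, 0.0]),
        head_forward=np.array([0.0, 0.0, 1.0]),
        item_center=np.array([0.0, 1.6, 1.5]),
        item_size=(0.30, 0.10),
        item_normal=np.array([0.0, 0.0, -1.0]),
    ))  # -> True
```

The same hit test applies whether the rectangle corresponds to a virtual, a mixed (anchored to a physical object) or a physical menu item; only the source of the item's pose differs.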