Modeling Pointing for 3D Target Selection in VR

Tor-Salve Dalsgaard, Jarrod Knibbe, Joanna Bergström
{"title":"Modeling Pointing for 3D Target Selection in VR","authors":"Tor-Salve Dalsgaard, Jarrod Knibbe, Joanna Bergström","doi":"10.1145/3489849.3489853","DOIUrl":null,"url":null,"abstract":"Virtual reality (VR) allows users to interact similarly to how they do in the physical world, such as touching, moving, and pointing at objects. To select objects at a distance, most VR techniques rely on casting a ray through one or two points located on the user’s body (e.g., on the head and a finger), and placing a cursor on that ray. However, previous studies show that such rays do not help users achieve optimal pointing accuracy nor correspond to how they would naturally point. We seek to find features, which would best describe natural pointing at distant targets. We collect motion data from seven locations on the hand, arm, and body, while participants point at 27 targets across a virtual room. We evaluate the features of pointing and analyse sets of those for predicting pointing targets. Our analysis shows an 87% classification accuracy between the 27 targets for the best feature set and a mean distance of 23.56 cm in predicting pointing targets across the room. The feature sets can inform the design of more natural and effective VR pointing techniques for distant object selection.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3489849.3489853","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

Virtual reality (VR) allows users to interact similarly to how they do in the physical world, such as touching, moving, and pointing at objects. To select objects at a distance, most VR techniques rely on casting a ray through one or two points located on the user’s body (e.g., on the head and a finger), and placing a cursor on that ray. However, previous studies show that such rays neither help users achieve optimal pointing accuracy nor correspond to how they would naturally point. We seek features that best describe natural pointing at distant targets. We collect motion data from seven locations on the hand, arm, and body while participants point at 27 targets across a virtual room. We evaluate these features of pointing and analyse which sets of them best predict pointing targets. Our analysis shows an 87% classification accuracy among the 27 targets for the best feature set and a mean distance of 23.56 cm in predicting pointing targets across the room. The feature sets can inform the design of more natural and effective VR pointing techniques for distant object selection.
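To make the two-point ray casting described above concrete, the sketch below shows one way such a selection could be computed: a ray is cast from the tracked head position through the fingertip, and the target whose centre lies closest to that ray is selected. This is an illustrative assumption, not the paper's implementation; the function name, the NumPy dependency, and the sample coordinates are hypothetical.

```python
import numpy as np

def select_target(head, fingertip, targets):
    """Two-point ray casting (hypothetical sketch, not the authors' method).

    head, fingertip: (3,) tracked positions in metres.
    targets: (N, 3) array of target centres.
    Returns the index of the target nearest to the head-to-fingertip ray
    and its perpendicular distance to that ray.
    """
    origin = np.asarray(head, dtype=float)
    direction = np.asarray(fingertip, dtype=float) - origin
    direction /= np.linalg.norm(direction)

    targets = np.asarray(targets, dtype=float)
    offsets = targets - origin
    along = offsets @ direction                      # projection onto the ray
    closest_points = origin + np.outer(along, direction)
    distances = np.linalg.norm(targets - closest_points, axis=1)

    # Ignore targets behind the user.
    distances = np.where(along < 0, np.inf, distances)
    return int(np.argmin(distances)), float(np.min(distances))

# Example with made-up coordinates:
idx, dist = select_target(
    head=[0.0, 1.7, 0.0],
    fingertip=[0.2, 1.5, 0.5],
    targets=[[1.0, 1.0, 2.5], [-0.5, 1.2, 3.0], [0.6, 0.9, 2.0]],
)
```

The same nearest-to-ray criterion would apply to any ray origin and direction, which is why the choice of body points (head, shoulder, finger) matters for how well the ray matches natural pointing.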