Low-level Voice and Hand-Tracking Interaction Actions: Explorations with Let's Go There

Jaisie Sin, Cosmin Munteanu
{"title":"Low-level Voice and Hand-Tracking Interaction Actions: Explorations with Let's Go There","authors":"Jaisie Sin, Cosmin Munteanu","doi":"10.1145/3447527.3474875","DOIUrl":null,"url":null,"abstract":"Hand-tracking allows users to engage with a virtual environment with their own hands, rather than the more traditional method of using accompanying controllers in order to operate the device they are using and interact with the virtual world. We seek to explore the range of low-level interaction actions and high-level interaction tasks and domains can be associated with the multimodal hand-tracking and voice input in VR. Thus, we created Let's Go There, which explores this joint-input method. So far, we have identified four low-level interaction actions which are exemplified by this demo: positioning oneself, positioning others, selection, and information assignment. We anticipate potential high-level interaction tasks and domains to include customer service training, social skills training, and cultural competency training (e.g. when interacting with older adults). Let's Go There, the system described in this paper, had been previously demonstrated at CUI 2020 and MobileHCI 2021. We have since updated our approach to its development to separate it into low- and high-level interactions. Thus, we believe there is value in bringing it to MobileHCI again to highlight these different types of interactions for further showcase and discussion.","PeriodicalId":281566,"journal":{"name":"Adjunct Publication of the 23rd International Conference on Mobile Human-Computer Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adjunct Publication of the 23rd International Conference on Mobile Human-Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3447527.3474875","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Hand-tracking allows users to engage with a virtual environment using their own hands, rather than the more traditional method of operating the device and interacting with the virtual world through accompanying controllers. We seek to explore the range of low-level interaction actions, as well as the high-level interaction tasks and domains, that can be associated with multimodal hand-tracking and voice input in VR. To this end, we created Let's Go There, which explores this joint-input method. So far, we have identified four low-level interaction actions exemplified by this demo: positioning oneself, positioning others, selection, and information assignment. We anticipate potential high-level interaction tasks and domains to include customer service training, social skills training, and cultural competency training (e.g., when interacting with older adults). Let's Go There, the system described in this paper, was previously demonstrated at CUI 2020 and MobileHCI 2021. We have since updated our approach to its development, separating it into low- and high-level interactions. We therefore believe there is value in bringing it to MobileHCI again to highlight these different types of interactions for further showcase and discussion.
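To make the joint-input idea more concrete, the sketch below illustrates one way a hand-tracking gesture and a concurrent voice command could be fused into the "selection" and "information assignment" actions named above. This is a minimal illustration only, not the authors' implementation: the event names, the 1.5-second fusion window, and the scene object are assumptions introduced for the example.

```python
# Illustrative sketch only: fusing hand-tracking and voice events into
# low-level actions (selection, information assignment). Event names,
# the fusion window, and object IDs are assumptions, not details of the
# actual Let's Go There system.
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class PinchEvent:
    target_id: str    # object the tracked hand is pinching / pointing at
    timestamp: float


@dataclass
class VoiceCommand:
    utterance: str    # e.g. "select" or "label upset customer"
    timestamp: float


@dataclass
class MultimodalFuser:
    """Pairs a pinch gesture with a voice command uttered close in time."""
    window_s: float = 1.5
    last_pinch: Optional[PinchEvent] = None

    def on_pinch(self, event: PinchEvent) -> None:
        self.last_pinch = event

    def on_voice(self, command: VoiceCommand) -> Optional[dict]:
        # Fuse only if the gesture and utterance fall inside the time window.
        if self.last_pinch is None:
            return None
        if command.timestamp - self.last_pinch.timestamp > self.window_s:
            return None
        if command.utterance.startswith("select"):
            return {"action": "selection",
                    "target": self.last_pinch.target_id}
        if command.utterance.startswith("label"):
            return {"action": "information_assignment",
                    "target": self.last_pinch.target_id,
                    "info": command.utterance.removeprefix("label ").strip()}
        return None


if __name__ == "__main__":
    fuser = MultimodalFuser()
    now = time.time()
    fuser.on_pinch(PinchEvent(target_id="virtual_customer_1", timestamp=now))
    print(fuser.on_voice(VoiceCommand(utterance="label upset customer",
                                      timestamp=now + 0.8)))
```

The same pattern would extend to the positioning actions, with the gesture supplying a spatial target and the utterance disambiguating whether the user or another character is being moved.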