Object manipulation based on the head manipulation space in VR

Impact Factor 5.3 · CAS Zone 2 (Computer Science) · Q1 COMPUTER SCIENCE, CYBERNETICS
Xiaolong Liu, Lili Wang, Wei Ke, Sio-Kei Im
{"title":"Object manipulation based on the head manipulation space in VR","authors":"Xiaolong Liu ,&nbsp;Lili Wang ,&nbsp;Wei Ke ,&nbsp;Sio-Kei Im","doi":"10.1016/j.ijhcs.2024.103346","DOIUrl":null,"url":null,"abstract":"<div><p>Object manipulation is fundamental in virtual and augmented reality, where efficiency and accuracy are crucial. However, repetitive object manipulation tasks using the hands can lead to arm fatigue, and in some scenarios, hands may not be feasible for object manipulation. In this paper, we propose a novel approach for object manipulation based on head movement. Firstly, we introduce the concept of head manipulation space and conduct an experiment to collect head manipulation space data to determine the manipulable space. Then, we propose a new method for object manipulation based on head speed and inter-frame viewpoint quality to enhance the efficiency and accuracy of head manipulation. Finally, we design two user studies to evaluate the performance of our head-based object manipulation method. The results show that our method is feasible in terms of task completion efficiency and accuracy compared to state-of-the-art methods and greatly reduces user fatigue and motion sickness. Moreover, our method significantly improves usability and reduces task load. Our method lays a foundation for head-based object manipulation in virtual and augmented reality and provides a new manipulation method for scenarios where hands are not suitable for object manipulation.</p></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"192 ","pages":"Article 103346"},"PeriodicalIF":5.3000,"publicationDate":"2024-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581924001290","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0

Abstract

Object manipulation is fundamental in virtual and augmented reality, where efficiency and accuracy are crucial. However, repetitive object manipulation tasks using the hands can lead to arm fatigue, and in some scenarios, hands may not be feasible for object manipulation. In this paper, we propose a novel approach for object manipulation based on head movement. Firstly, we introduce the concept of head manipulation space and conduct an experiment to collect head manipulation space data to determine the manipulable space. Then, we propose a new method for object manipulation based on head speed and inter-frame viewpoint quality to enhance the efficiency and accuracy of head manipulation. Finally, we design two user studies to evaluate the performance of our head-based object manipulation method. The results show that our method is feasible in terms of task completion efficiency and accuracy compared to state-of-the-art methods and greatly reduces user fatigue and motion sickness. Moreover, our method significantly improves usability and reduces task load. Our method lays a foundation for head-based object manipulation in virtual and augmented reality and provides a new manipulation method for scenarios where hands are not suitable for object manipulation.
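The abstract does not spell out the control law behind "head speed and inter-frame viewpoint quality," so the following is only a minimal illustrative sketch of one way a head-speed-dependent manipulation gain could work: slow head motion yields fine, low-gain object movement for accuracy, while fast motion yields coarse, high-gain movement for efficiency. The thresholds, gain range, and names (head_speed_gain, manipulate_step) are hypothetical and not taken from the paper.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's algorithm): map head angular
# speed to a control-display gain so that slow head motion gives fine,
# low-gain object movement (accuracy) and fast motion gives coarse,
# high-gain movement (efficiency). Thresholds and gains below are assumed.

SLOW, FAST = 5.0, 60.0      # head angular speed thresholds in deg/s (assumed)
G_MIN, G_MAX = 0.2, 2.0     # manipulation gain range (assumed)

def head_speed_gain(speed_deg_s: float) -> float:
    """Linearly interpolate the gain between G_MIN and G_MAX from head speed."""
    t = float(np.clip((speed_deg_s - SLOW) / (FAST - SLOW), 0.0, 1.0))
    return G_MIN + t * (G_MAX - G_MIN)

def manipulate_step(obj_pos, head_dir_prev, head_dir_curr, dt, reach):
    """One frame of head-driven translation: move the object along the change
    in (unit) view direction, scaled by the speed-dependent gain and reach."""
    cos_angle = np.clip(np.dot(head_dir_prev, head_dir_curr), -1.0, 1.0)
    speed = np.degrees(np.arccos(cos_angle)) / dt       # angular speed, deg/s
    delta = head_dir_curr - head_dir_prev               # inter-frame view change
    return obj_pos + head_speed_gain(speed) * reach * delta

# Example: one 90 Hz frame of head rotation; a faster turn yields a larger step.
prev = np.array([0.0, 0.0, 1.0])
curr = np.array([0.02, 0.0, 1.0]); curr /= np.linalg.norm(curr)
print(manipulate_step(np.zeros(3), prev, curr, dt=1 / 90, reach=1.5))
```

The paper additionally weights manipulation by inter-frame viewpoint quality, which this sketch omits.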


Source journal

International Journal of Human-Computer Studies (Engineering & Technology - Computer Science: Cybernetics)
CiteScore: 11.50
Self-citation rate: 5.60%
Articles per year: 108
Review time: 3 months
Journal description: The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities. Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented Reality
• Intelligent user interfaces
• Presence
...