Haptic Rendering of Virtual Hand Moving Objects

Wenzhen Yang, Wenhua Chen
DOI: 10.1109/CW.2011.34
Published in: 2011 International Conference on Cyberworlds, 4 October 2011
Citations: 2

Abstract

The paper proposes a method of haptic rendering for a virtual hand moving an object in virtual environments, which not only improves the immersion and authenticity of the virtual reality system but also helps operators predict virtual objects' natural behavior and direct their interaction with virtual environments. By tracking the position of the grasped object, changes in its posture are detected. An algorithm calculates the distribution of the external force imposed on the grasped object in an arbitrary posture, and the static grasp force of the virtual hand is regenerated according to a physically-based general force model of virtual hand grasping. Based on the theories of kinematics and dynamics, the resultant force that causes the change in motion of the grasped object is obtained. The grasp force of the virtual hand is then regenerated by combining the posture change with this resultant force. The solution has been experimentally implemented using an exoskeleton force-feedback glove. A series of experimental results shows that the static grasp force rendering approach for a virtual hand moving an object is computationally efficient while retaining a good level of realism.
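The abstract outlines a pipeline of tracking the grasped object's position, deriving the resultant force from kinematics and dynamics, and redistributing the required grasp force across the hand's contacts. The paper itself does not publish its equations here, so the following is only a minimal sketch of the standard physics this description implies: acceleration is estimated from tracked positions by finite differences, the resultant force follows from Newton's second law, and the hand's contribution is split across contact points. All function names and the even contact-force distribution are assumptions for illustration, not the authors' actual algorithm.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # gravitational acceleration, m/s^2


def resultant_force(positions, dt, mass):
    """Estimate the resultant force on a grasped object from three
    consecutive tracked positions sampled dt seconds apart, using a
    central-difference approximation of the acceleration."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in positions)
    accel = (p2 - 2.0 * p1 + p0) / dt**2  # central difference
    return mass * accel                    # Newton's second law: F = m a


def contact_forces(resultant, mass, n_contacts):
    """Split the force the hand must supply evenly across contact points,
    compensating gravity so the net force on the object equals `resultant`.
    (An even split is a simplifying assumption; the paper computes a
    posture-dependent distribution.)"""
    needed = resultant - mass * GRAVITY   # hand supplies F_net - m g
    return [needed / n_contacts] * n_contacts
```

For a stationary 2 kg object held at four contacts, the resultant force is zero and each contact supplies a quarter of the gravity compensation, i.e. about 4.9 N upward.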