Multi-sensor data fusion for hand tracking using Kinect and Leap Motion

B. Penelle, O. Debeir
Proceedings of the 2014 Virtual Reality International Conference, 2014-04-09. DOI: 10.1145/2617841.2620710. Citations: 22.

Abstract

Often presented as competing products on the market of low-cost 3D sensors, the Kinect™ and the Leap Motion™ (LM) can actually be complementary in some scenarios. In this paper, we promote the fusion of data acquired by both the LM and Kinect sensors to improve hand-tracking performance. The sensor fusion is applied to an existing augmented reality system targeting the treatment of phantom limb pain (PLP) in upper-limb amputees. With the Kinect we acquire 3D images of the patient in real time. These images are post-processed to apply a mirror effect along the sagittal plane of the body before being displayed back to the patient in 3D, giving him the illusion that he has two arms. The patient uses the virtually reconstructed arm to perform given tasks involving interactions with virtual objects. Thanks to the plasticity of the brain, the restored visual feedback of the missing arm allows, in some cases, a reduction in pain intensity. The Leap Motion brings to the system the ability to perform accurate motion tracking of the hand, including the fingers. By registering the position and orientation of the LM in the frame of reference of the Kinect, we make our system able to accurately detect interactions of the hand and fingers with virtual objects, which greatly improves the user experience. We also show that the sensor fusion nicely extends the tracking domain by supplying finger positions even when the Kinect sensor fails to acquire depth values for the hand.
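The two geometric operations the abstract describes, mirroring the point cloud across the sagittal plane and registering Leap Motion measurements into the Kinect's frame of reference, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the plane parameters and the 4x4 rigid transform `T_lm_to_kinect` are assumed to come from a prior calibration/registration step, and all names here are hypothetical.

```python
import numpy as np

def mirror_sagittal(points, plane_point, plane_normal):
    """Reflect Nx3 points across a plane given by a point on it and its normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = (points - plane_point) @ n           # signed distance of each point to the plane
    return points - 2.0 * d[:, None] * n     # move each point to its mirror image

def lm_to_kinect(points_lm, T_lm_to_kinect):
    """Map Nx3 Leap Motion points into the Kinect frame via a 4x4 rigid transform."""
    homog = np.hstack([points_lm, np.ones((len(points_lm), 1))])  # to homogeneous coords
    return (homog @ T_lm_to_kinect.T)[:, :3]

# Example: mirror a point across the plane x = 0 (a stand-in sagittal plane).
pts = np.array([[0.3, 0.1, 1.2]])
mirrored = mirror_sagittal(pts, np.zeros(3), np.array([1.0, 0.0, 0.0]))
# mirrored -> [[-0.3, 0.1, 1.2]]
```

With a calibrated `T_lm_to_kinect`, finger positions reported by the LM land in the same coordinate system as the Kinect point cloud, so hand-object interactions can be tested directly against the mirrored scene.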