Real-time neural network prediction for handling two-hands mutual occlusions

Dario Pavllo, Mathias Delahaye, Thibault Porssut, Bruno Herbelin, Ronan Boulic
{"title":"Real-time neural network prediction for handling two-hands mutual occlusions","authors":"Dario Pavllo,&nbsp;Mathias Delahaye,&nbsp;Thibault Porssut,&nbsp;Bruno Herbelin,&nbsp;Ronan Boulic","doi":"10.1016/j.cagx.2019.100011","DOIUrl":null,"url":null,"abstract":"<div><p>Hands deserve particular attention in virtual reality (VR) applications because they represent our primary means for interacting with the environment. Although marker-based motion capture works adequately for full body tracking, it is less reliable for small body parts such as hands and fingers which are often occluded when captured optically, thus leading VR professionals to rely on additional systems (e.g. inertial trackers). We present a machine learning pipeline to track hands and fingers using solely a motion capture system based on cameras and active markers. Our finger animation is performed by a predictive model based on neural networks trained on a movements dataset acquired from several subjects with a complementary capture system. We employ a two-stage pipeline that first resolves occlusions and then recovers all joint transformations. 
We show that our method compares favorably to inverse kinematics by inferring automatically the constraints from the data, provides a natural reconstruction of postures, and handles occlusions better than three proposed baselines.</p></div>","PeriodicalId":52283,"journal":{"name":"Computers and Graphics: X","volume":"2 ","pages":"Article 100011"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.cagx.2019.100011","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Graphics: X","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590148619300111","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Engineering","Score":null,"Total":0}
引用次数: 4

Abstract

Hands deserve particular attention in virtual reality (VR) applications because they represent our primary means of interacting with the environment. Although marker-based motion capture works adequately for full-body tracking, it is less reliable for small body parts such as hands and fingers, which are often occluded when captured optically; this leads VR professionals to rely on additional systems (e.g., inertial trackers). We present a machine learning pipeline to track hands and fingers using solely a motion capture system based on cameras and active markers. Our finger animation is performed by a predictive model based on neural networks trained on a movement dataset acquired from several subjects with a complementary capture system. We employ a two-stage pipeline that first resolves occlusions and then recovers all joint transformations. We show that our method compares favorably to inverse kinematics by automatically inferring the constraints from the data, provides a natural reconstruction of postures, and handles occlusions better than three proposed baselines.
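The two-stage pipeline described in the abstract can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the marker and joint counts, network sizes, and the use of plain NumPy MLPs with random (untrained) weights are all invented for illustration. Stage one imputes the positions of occluded markers from the visible ones plus a visibility mask; stage two regresses joint transformations (here, per-joint rotation parameters) from the completed marker set.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, weights):
    """Tiny MLP: ReLU hidden layers, linear output."""
    h = x
    for i, (W, b) in enumerate(weights):
        h = h @ W + b
        if i < len(weights) - 1:
            h = np.maximum(h, 0.0)  # ReLU on hidden layers only
    return h

def make_mlp(sizes):
    """Random (untrained) weights; a real system would train these."""
    return [(rng.standard_normal((a, b)) * 0.1, np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

N_MARKERS = 19   # hypothetical number of active markers
N_JOINTS = 20    # hypothetical number of finger joints

# Stage 1: occlusion resolver — masked marker positions + mask -> complete positions
stage1 = make_mlp([N_MARKERS * 3 + N_MARKERS, 64, N_MARKERS * 3])
# Stage 2: pose regressor — complete markers -> per-joint rotation parameters
stage2 = make_mlp([N_MARKERS * 3, 64, N_JOINTS * 3])

def predict_pose(markers, visible):
    """markers: (N_MARKERS, 3), entries for occluded markers may be NaN.
    visible: (N_MARKERS,) boolean mask. Returns (completed markers, joint params)."""
    filled = np.where(visible[:, None], markers, 0.0)  # zero out occluded slots
    x = np.concatenate([filled.ravel(), visible.astype(float)])
    completed = mlp_forward(x, stage1).reshape(N_MARKERS, 3)
    # Keep observed markers; trust the network only where markers were occluded.
    completed = np.where(visible[:, None], markers, completed)
    joints = mlp_forward(completed.ravel(), stage2).reshape(N_JOINTS, 3)
    return completed, joints
```

Separating the two stages means the occlusion resolver can be trained on marker data alone, while the pose regressor always sees a complete marker set, matching the abstract's description of first resolving occlusions and then recovering joint transformations.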


Source journal: Computers and Graphics: X (Engineering, all)
CiteScore: 3.30
Self-citation rate: 0.00%
Review time: 20 weeks