HuTrain: a Framework for Fast Creation of Real Human Pose Datasets

R. R. Barioni, W. Costa, J. A. C. Neto, L. Figueiredo, V. Teichrieb, J. Quintino, F. Q. Silva, André L. M. Santos, Helder Pinho
{"title":"HuTrain: a Framework for Fast Creation of Real Human Pose Datasets","authors":"R. R. Barioni, W. Costa, J. A. C. Neto, L. Figueiredo, V. Teichrieb, J. Quintino, F. Q. Silva, André L. M. Santos, Helder Pinho","doi":"10.1109/ISMAR-Adjunct51615.2020.00031","DOIUrl":null,"url":null,"abstract":"Image-based body tracking algorithms are useful in several scenarios, such as avatar animations and gesture interaction for VR applications. In the last few years, the best-ranked solutions presented on the state of the art of body tracking (according to the most popular datasets in the field) are intensively based on Convolutional Neural Networks (CNNs) algorithms and use large datasets for training and validation. Although these solutions achieve high precision scores while evaluated with some of these datasets, there are particular tracking challenges (for example, upside-down cases) that are not well-modeled and, therefore, not correctly tracked. Instead of lurking an all-in-one solution for all cases, we propose HuTrain, a framework for creating datasets quickly and easily. HuTrain comprises a series of steps, including automatic camera calibration, refined human pose estimation, and known dataset formats conversion. We show that, with our system, the user can generate human pose datasets, targeting specific tracking challenges for the desired application context, with no need to annotate human pose instances manually.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00031","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Image-based body tracking algorithms are useful in several scenarios, such as avatar animation and gesture interaction for VR applications. In the last few years, the best-ranked body-tracking solutions in the state of the art (according to the most popular datasets in the field) have been heavily based on Convolutional Neural Network (CNN) algorithms and use large datasets for training and validation. Although these solutions achieve high precision scores when evaluated on some of these datasets, particular tracking challenges (for example, upside-down poses) remain poorly modeled and are therefore not tracked correctly. Instead of pursuing an all-in-one solution for every case, we propose HuTrain, a framework for creating datasets quickly and easily. HuTrain comprises a series of steps, including automatic camera calibration, refined human pose estimation, and conversion to known dataset formats. We show that, with our system, the user can generate human pose datasets targeting specific tracking challenges for the desired application context, with no need to annotate human pose instances manually.
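As an illustration of the dataset-conversion step named in the abstract, the sketch below is a minimal example (not the authors' implementation; function names such as convert_to_coco and the input layout are hypothetical) that packs per-frame 2D keypoints from any pose estimator into a COCO-style keypoints JSON file, one of the widely used "known dataset formats" such a pipeline could target.

```python
# Hedged sketch: not the paper's code. Demonstrates converting per-frame 2D
# keypoints (one person per frame, 17 joints) into COCO keypoints JSON.
import json

# COCO's 17-keypoint ordering, used here only as an example target format.
COCO_KEYPOINT_NAMES = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

def convert_to_coco(frames):
    """frames: list of dicts like
    {"file_name": "f0001.png", "width": 1920, "height": 1080,
     "keypoints": [(x, y, visibility), ...]}  # 17 (x, y, v) triplets
    Returns a dict in COCO keypoints format."""
    images, annotations = [], []
    for idx, frame in enumerate(frames):
        image_id = idx + 1
        images.append({
            "id": image_id,
            "file_name": frame["file_name"],
            "width": frame["width"],
            "height": frame["height"],
        })
        # COCO stores keypoints as a flat [x1, y1, v1, x2, y2, v2, ...] list.
        flat = [v for kp in frame["keypoints"] for v in kp]
        xs = [x for x, _, v in frame["keypoints"] if v > 0]
        ys = [y for _, y, v in frame["keypoints"] if v > 0]
        if xs and ys:
            x0, y0 = min(xs), min(ys)
            w, h = max(xs) - x0, max(ys) - y0
        else:  # no visible joints in this frame
            x0 = y0 = w = h = 0.0
        annotations.append({
            "id": image_id,
            "image_id": image_id,
            "category_id": 1,
            "keypoints": flat,
            "num_keypoints": sum(1 for _, _, v in frame["keypoints"] if v > 0),
            "bbox": [x0, y0, w, h],
            "area": w * h,
            "iscrowd": 0,
        })
    return {
        "images": images,
        "annotations": annotations,
        "categories": [{
            "id": 1,
            "name": "person",
            "keypoints": COCO_KEYPOINT_NAMES,
            "skeleton": [],  # joint connectivity omitted in this sketch
        }],
    }

if __name__ == "__main__":
    # Dummy single-frame input standing in for real pose-estimator output.
    dummy = [{
        "file_name": "frame_0001.png", "width": 1920, "height": 1080,
        "keypoints": [(100.0 + 5 * i, 200.0 + 10 * i, 2) for i in range(17)],
    }]
    with open("hutrain_coco_sketch.json", "w") as f:
        json.dump(convert_to_coco(dummy), f, indent=2)
```

The calibration and pose-refinement stages mentioned in the abstract are omitted here because they depend on the capture setup; under the same assumptions, their output (refined 2D keypoints per frame) would simply be fed into convert_to_coco.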