Regress 3D human pose from 2D skeleton with kinematics knowledge

Impact Factor: 1.0 · CAS Tier 4 (Mathematics) · JCR Q1 (Mathematics)
Longkui Jiang, Yuru Wang, Weijia Li
{"title":"Regress 3D human pose from 2D skeleton with kinematics knowledge","authors":"Longkui Jiang, Yuru Wang, Weijia Li","doi":"10.3934/era.2023075","DOIUrl":null,"url":null,"abstract":"3D human pose estimation is a hot topic in the field of computer vision. It provides data support for tasks such as pose recognition, human tracking and action recognition. Therefore, it is widely applied in the fields of advanced human-computer interaction, intelligent monitoring and so on. Estimating 3D human pose from a single 2D image is an ill-posed problem and is likely to cause low prediction accuracy, due to the problems of self-occlusion and depth ambiguity. This paper developed two types of human kinematics to improve the estimation accuracy. First, taking the 2D human body skeleton sequence obtained by the 2D human body pose detector as input, a temporal convolutional network is proposed to develop the movement periodicity in temporal domain. Second, geometrical prior knowledge is introduced into the model to constrain the estimated pose to fit the general kinematics knowledge. The experiments are tested on Human3.6M and MPII (Max Planck Institut Informatik) Human Pose (MPI-INF-3DHP) datasets, and the proposed model shows better generalization ability compared with the baseline and the state-of-the-art models.","PeriodicalId":48554,"journal":{"name":"Electronic Research Archive","volume":null,"pages":null},"PeriodicalIF":1.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronic Research Archive","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.3934/era.2023075","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

3D human pose estimation is an active topic in computer vision. It supplies data for tasks such as pose recognition, human tracking and action recognition, and is therefore widely applied in advanced human-computer interaction, intelligent monitoring and related fields. Estimating 3D human pose from a single 2D image is an ill-posed problem and tends to yield low prediction accuracy owing to self-occlusion and depth ambiguity. This paper exploits two types of human kinematics knowledge to improve estimation accuracy. First, taking as input the 2D human skeleton sequence produced by a 2D human pose detector, a temporal convolutional network is proposed to exploit the periodicity of movement in the temporal domain. Second, geometric prior knowledge is introduced into the model to constrain the estimated pose to conform to general kinematics. Experiments on the Human3.6M and MPI-INF-3DHP (Max Planck Institut Informatik 3D Human Pose) datasets show that the proposed model has better generalization ability than the baseline and state-of-the-art models.
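The abstract does not specify how the geometric prior is formulated. One common way to encode such kinematics knowledge is a bone-length consistency penalty: predicted 3D bone lengths are pushed toward reference (e.g. subject-specific or population-average) lengths. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the joint indices, bone list and loss form are assumptions.

```python
import numpy as np

# Hypothetical 5-joint kinematic chain; each bone is a (parent, child)
# index pair into a (J, 3) pose array. Indices are illustrative only.
BONES = [(0, 1), (1, 2), (2, 3), (0, 4)]

def bone_lengths(pose3d):
    """Euclidean length of each bone in a (J, 3) 3D pose."""
    return np.array([np.linalg.norm(pose3d[c] - pose3d[p]) for p, c in BONES])

def kinematic_prior_loss(pred_pose, ref_lengths):
    """Mean squared deviation of predicted bone lengths from reference
    lengths -- one plausible form of the 'geometric prior' term that
    could be added to a regression loss during training."""
    return float(np.mean((bone_lengths(pred_pose) - ref_lengths) ** 2))
```

In such a scheme the total training loss would combine a joint-position error with this prior, e.g. `L = L_pos + lambda * kinematic_prior_loss(...)`, so that poses with anatomically implausible limb lengths are penalized even when their 2D projection fits well.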
Source journal: Electronic Research Archive · CiteScore: 1.30 · Self-citation rate: 12.50% · Articles per year: 170