A Multi-view 3D Human Pose Estimation Algorithm Based On Positional Attention

Dandan Sun, ChangAn Zhang
{"title":"A Multi-view 3D Human Pose Estimation Algorithm Based On Positional Attention","authors":"Dandan Sun, ChangAn Zhang","doi":"10.1109/ICSP54964.2022.9778615","DOIUrl":null,"url":null,"abstract":"With the development of CNNs, the human pose estimation research has made great progress, but there is still a problem: the relationships of the human each joint location are not well exploited in previous CNNs-based methods. Considering the order of global spatial information and human body location information, we propose a multi-view 3D human pose estimation algorithm based on position attention. In 2D detection stage, position coding is adopted to rebuild the image in the global space position relation. The attention mechanism can model the relationship between various channels and capture feature maps the dependencies between the horizontal and vertical direction, and the details are mined from the feature location relationship to generate high-quality feature maps. In the last stage of feature extraction, adjacent view features are used to enhance the spatial expression ability of feature images, so as to better solve occlusion and oblique view. Experiments on the Human3.6M data set show that when using Resnet-50 as the backbone network and 256×256 of the image size, the average joint error of our algorithm is reduced to 25.2mm, which reaching the competitive result.","PeriodicalId":363766,"journal":{"name":"2022 7th International Conference on Intelligent Computing and Signal Processing (ICSP)","volume":"231 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 7th International Conference on Intelligent Computing and Signal Processing (ICSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSP54964.2022.9778615","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

With the development of CNNs, research on human pose estimation has made great progress, but one problem remains: the relationships among the locations of the individual human joints are not well exploited by previous CNN-based methods. Taking into account the ordering of global spatial information and human body location information, we propose a multi-view 3D human pose estimation algorithm based on positional attention. In the 2D detection stage, positional encoding is used to reconstruct the global spatial position relationships of the image. The attention mechanism models the relationships between channels and captures the dependencies between the horizontal and vertical directions of the feature maps, and details are mined from these positional relationships to generate high-quality feature maps. In the last stage of feature extraction, features from adjacent views are used to enhance the spatial expressiveness of the feature maps, which better handles occlusion and oblique views. Experiments on the Human3.6M dataset show that with ResNet-50 as the backbone network and an image size of 256×256, the average joint error of our algorithm is reduced to 25.2 mm, a competitive result.
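The abstract does not give the exact formulation of the position-attention block, but its description (modeling cross-channel relationships while capturing dependencies along the horizontal and vertical directions of the feature maps) suggests a coordinate-attention-style module. The sketch below is an illustrative PyTorch reconstruction under that assumption only; the class name `PositionAttention`, the reduction ratio, and all layer sizes are our own choices, not the paper's.

```python
# Minimal sketch (not the authors' code) of a position-attention block:
# pool the feature map along height and width separately, model
# cross-channel relationships with a shared 1x1 conv, then re-weight the
# input with direction-wise attention maps. Layer sizes are illustrative.
import torch
import torch.nn as nn


class PositionAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        mid = max(8, channels // reduction)
        # Shared 1x1 conv models relationships between channels.
        self.conv_reduce = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        # Separate 1x1 convs produce attention along height and width.
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Pool along width -> per-row (vertical) statistics: (N, C, H, 1)
        pooled_h = x.mean(dim=3, keepdim=True)
        # Pool along height -> per-column (horizontal) statistics: (N, C, 1, W)
        pooled_w = x.mean(dim=2, keepdim=True)
        # Concatenate along the spatial axis and encode jointly: (N, C, H+W, 1)
        y = torch.cat([pooled_h, pooled_w.permute(0, 1, 3, 2)], dim=2)
        y = self.act(self.bn(self.conv_reduce(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        # Direction-wise attention maps in [0, 1].
        att_h = torch.sigmoid(self.conv_h(y_h))                      # (N, C, H, 1)
        att_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (N, C, 1, W)
        return x * att_h * att_w


if __name__ == "__main__":
    feat = torch.randn(2, 256, 64, 64)   # e.g. an intermediate ResNet-50 feature map
    out = PositionAttention(256)(feat)
    print(out.shape)  # torch.Size([2, 256, 64, 64])
```

The multi-view fusion step described last in the abstract (enhancing each view's features with those of adjacent cameras) is not shown here, since the abstract does not specify how the views are aligned or combined.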