LiPE: Lightweight human pose estimator for mobile applications towards automated pose analysis

Chengxiu Li, Ni Duan
Journal: Cognitive Robotics, Volume 5 (2025), Pages 26-36
DOI: 10.1016/j.cogr.2024.11.005
URL: https://www.sciencedirect.com/science/article/pii/S2667241324000193
Citations: 0

Abstract

Current human pose estimation models adopt heavy backbones and complex feature enhancement modules to pursue higher accuracy, but they ignore the need for model efficiency in real-world applications. In real-world scenarios such as sports teaching and automated sports analysis for better preservation of traditional folk sports, human pose estimation often needs to be performed on mobile devices with limited computing resources. In this paper, we propose a lightweight human pose estimator termed LiPE. LiPE adopts a lightweight MobileNetV2 backbone for feature extraction and lightweight depthwise separable deconvolution modules for upsampling, and makes predictions at high resolution with a lightweight prediction head. Compared with the baseline, our model reduces MACs by 93.2% and the number of parameters by 93.9%, while accuracy drops by only 3.2%. Based on LiPE, we develop a real-time human pose estimation and evaluation system for automated pose analysis. Experimental results show that LiPE achieves high computational efficiency and good accuracy for application on mobile devices.
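The depthwise separable deconvolution design is the main source of these savings. A back-of-the-envelope parameter count illustrates why: a standard transposed convolution holds one k×k filter per input-output channel pair, while the separable variant holds only one k×k filter per channel plus a 1×1 pointwise mixing layer. The sketch below uses hypothetical layer sizes (256 channels, 4×4 kernels, a common choice in heatmap-based pose heads); the abstract does not state LiPE's exact configuration.

```python
def standard_deconv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameters of a standard transposed conv: one k x k filter
    per (input channel, output channel) pair; biases omitted."""
    return c_in * c_out * k * k

def separable_deconv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameters of a depthwise separable transposed conv:
    a depthwise k x k deconv (one filter per channel) followed
    by a 1x1 pointwise conv that mixes channels; biases omitted."""
    return c_in * k * k + c_in * c_out

# Hypothetical sizes for illustration, not LiPE's published config.
c_in, c_out, k = 256, 256, 4
std = standard_deconv_params(c_in, c_out, k)
sep = separable_deconv_params(c_in, c_out, k)
print(f"standard: {std:,}  separable: {sep:,}  reduction: {1 - sep / std:.1%}")
# reduction comes out near 93%, the same order as the savings reported above
```

With these sizes the separable module needs 69,632 parameters versus 1,048,576 for the standard deconvolution, a roughly 93% reduction per layer, which is consistent in magnitude with the whole-model figures quoted in the abstract.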