Skeleton-based Human Action Recognition by the Integration of Euclidean distance

Yi Gao, Zhaokun Liu, Xinmeng Wu, Guangyuan Wu, Jiahui Zhao, Xiaofan Zhao
{"title":"Skeleton-based Human Action Recognition by the Integration of Euclidean distance","authors":"Yi Gao, Zhaokun Liu, Xinmeng Wu, Guangyuan Wu, Jiahui Zhao, Xiaofan Zhao","doi":"10.1145/3512576.3512585","DOIUrl":null,"url":null,"abstract":"With the growing popularity of somatosensory interaction devices, human action recognition becomes more and more attractive in many application scenarios. Nowadays, it takes a long time for some behavior recognition methods to train a model. To solve the problem, this paper proposes a human skeleton model for human action recognition. The joint coordinates are extracted using OpenPose and the thermodynamic diagram, and then we extract partial features and overall features, as well as Euclidean distances between joints. All the data collected above makes up the motion features of independent video frames. What's more, we leverage Multilayer Perceptron (MLP) classifier to classify all the motion features. On the KTH and ICPR datasets, we test the accuracy of the mode verified by changing several parameters. When the image resolution is 320*240, weight is 12 and stride is 1, the highest accuracy rate of single-person behavior recognition is 0.821. When weight is 10 and stride is 1, the highest accuracy rate of multi-person behavior recognition is 0.812, and the high running speed enables the mode to be real-time.","PeriodicalId":278114,"journal":{"name":"Proceedings of the 2021 9th International Conference on Information Technology: IoT and Smart City","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2021 9th International Conference on Information Technology: IoT and Smart City","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3512576.3512585","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

With the growing popularity of somatosensory interaction devices, human action recognition is becoming increasingly attractive in many application scenarios. However, some existing behavior recognition methods take a long time to train a model. To address this problem, this paper proposes a human skeleton model for human action recognition. Joint coordinates are extracted using OpenPose and its confidence heatmaps, and from them we extract partial features and overall features, as well as the Euclidean distances between joints. Together, these data make up the motion features of independent video frames. A Multilayer Perceptron (MLP) classifier is then used to classify the motion features. On the KTH and ICPR datasets, we evaluate the model's accuracy while varying several parameters. When the image resolution is 320×240, the weight is 12, and the stride is 1, the highest single-person recognition accuracy is 0.821; when the weight is 10 and the stride is 1, the highest multi-person recognition accuracy is 0.812, and the model runs fast enough for real-time use.
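
The pipeline in the abstract is: per-frame joint coordinates from OpenPose, pairwise Euclidean distances between joints as additional features, and an MLP over the concatenated feature vector. The following is a minimal sketch of that feature construction and classification step, assuming the OpenPose COCO 18-keypoint layout and scikit-learn's MLPClassifier; frame_features and the synthetic training data below are illustrative assumptions, not the authors' code.

# Minimal sketch, assuming joints have already been extracted by
# OpenPose into a (J, 2) array of (x, y) coordinates per frame
# (J = 18 for the OpenPose COCO layout). frame_features is a
# hypothetical helper, not taken from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier

def frame_features(joints: np.ndarray) -> np.ndarray:
    """Feature vector for one frame: raw coordinates plus all
    pairwise Euclidean distances (J*2 + J*(J-1)/2 values)."""
    # Pairwise differences via broadcasting: (J, J, 2)
    diff = joints[:, None, :] - joints[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)       # (J, J) distance matrix
    iu = np.triu_indices(len(joints), k=1)     # upper triangle, no diagonal
    return np.concatenate([joints.ravel(), dist[iu]])

# Synthetic stand-in data: one feature vector per video frame, with
# 6 action classes as in the KTH dataset.
rng = np.random.default_rng(0)
X = np.stack([frame_features(rng.random((18, 2))) for _ in range(200)])
y = rng.integers(0, 6, size=200)

clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))

Classifying each frame independently, as above, is what allows the method to run in real time: there is no temporal model to unroll, and a video-level label can be obtained by voting over per-frame predictions.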