A Simplified Skeleton Joints Based Approach For Human Action Recognition

N. Malik, S. Abu-Bakar, U. U. Sheikh
{"title":"A Simplified Skeleton Joints Based Approach For Human Action Recognition","authors":"N. Malik, S. Abu-Bakar, U. U. Sheikh","doi":"10.1109/ICSIPA52582.2021.9576770","DOIUrl":null,"url":null,"abstract":"The growing technological development in the field of computer vision in general, and human action recognition (HAR), in particular, have attracted increasing number of researchers from various disciplines. Amid the variety of challenges in the field of human action recognition, one of the major issues is complex modelling which requires multiple parameters leading to troublesome training which further requires heavy configuration machines for real-time recognition. Therefore, there is a need to develop a simplified method that could result in reduced complexity, without compromising the performance accuracy. In order to address the mentioned issue, this paper proposes an approach that extracts the mean, variance and median from the skeleton joint locations and directly uses them in the classification process. The system used MCAD dataset for extracting 2D skeleton features with the help of OpenPose technique, which is suitable for the extraction of skeleton features from the 2D image instead of 3D image or using a depth sensor. Henceforth, we avoid using either the RGB images or the skeleton images in the recognition process. The method shows a promising performance with an accuracy of 73.8% when tested with MCAD dataset.","PeriodicalId":326688,"journal":{"name":"2021 IEEE International Conference on Signal and Image Processing Applications (ICSIPA)","volume":"163 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Signal and Image Processing Applications (ICSIPA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSIPA52582.2021.9576770","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

The growing technological development in the field of computer vision in general, and human action recognition (HAR) in particular, has attracted an increasing number of researchers from various disciplines. Among the many challenges in HAR, one of the major issues is complex modelling: models with many parameters are troublesome to train and demand heavily configured machines for real-time recognition. There is therefore a need for a simplified method that reduces complexity without compromising recognition accuracy. To address this issue, this paper proposes an approach that extracts the mean, variance and median of the skeleton joint locations and uses them directly in the classification process. The system extracts 2D skeleton features from the MCAD dataset with OpenPose, which estimates skeletons from 2D images rather than requiring 3D images or a depth sensor. Consequently, neither the RGB images nor rendered skeleton images are used in the recognition process. The method shows promising performance, achieving an accuracy of 73.8% on the MCAD dataset.
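The abstract only outlines the feature-extraction step, so the following is a minimal sketch of the idea: per-joint mean, variance and median are computed over the frames of a clip of OpenPose 2D joint coordinates and concatenated into a fixed-length vector for a standard classifier. The 25-joint layout, the SVM classifier, the `skeleton_statistics` helper and the random stand-in data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def skeleton_statistics(joints):
    """Collapse a clip of 2D skeletons into a fixed-length feature vector.

    joints: array of shape (num_frames, num_joints, 2) holding the (x, y)
    coordinates produced by a pose estimator such as OpenPose.
    Returns the per-joint mean, variance and median over time, concatenated.
    """
    mean = joints.mean(axis=0)          # (num_joints, 2)
    var = joints.var(axis=0)            # (num_joints, 2)
    median = np.median(joints, axis=0)  # (num_joints, 2)
    return np.concatenate([mean, var, median], axis=0).ravel()

# Hypothetical training data: one clip per sample, 25 joints per frame
# (OpenPose BODY_25 layout assumed), random values standing in for real poses.
rng = np.random.default_rng(0)
clips = [rng.random((40, 25, 2)) for _ in range(20)]
labels = rng.integers(0, 4, size=20)  # stand-in action labels

features = np.stack([skeleton_statistics(c) for c in clips])
classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
classifier.fit(features, labels)
print(classifier.predict(features[:3]))
```

Because each clip is reduced to a single statistics vector regardless of its length, the classifier stays small and fast to train, which is the simplification the paper argues for.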