An Image-based Human Physical Activities Recognition in an Indoor Environment

F. Ullah, Asif Iqbal, Ajmal Khan, Rida Gul Khan, Laraib Malik, K. Kwak
{"title":"An Image-based Human Physical Activities Recognition in an Indoor Environment","authors":"F. Ullah, Asif Iqbal, Ajmal Khan, Rida Gul Khan, Laraib Malik, K. Kwak","doi":"10.1109/ICTC49870.2020.9289314","DOIUrl":null,"url":null,"abstract":"In this paper, we propose real-time image-based recognition of human activities from series of images considering different human actions performed in an indoor environment.The proposed image-based human activity recognition(IHAR)system can be utilized for assisting the life of disabled persons, surveillance and human tracking, human computer interaction,and efficient resource utilization. The proposed IHAR system consists of closed-circuit television (CCTV) camera based image acquisitioning, various filtering based image enhancement, principle component analysis(PCA) based features extraction, and various machine learning algorithms for recognition accuracy performance comparison. We collected dataset of 10 different activities such as walking, sitting down and standing up consists of 35,530 images. The dataset is divided into(90%,10%),(80%,20%), and(70%,30%)training and testing respectively and evaluated three classifier K-nearest neighbors (KNN), Random Forest (RF), and Decision Tree(DT). The experimental results show the accuracy of 95%, 97%, and 90% by KNN, RF, and DT respectively.","PeriodicalId":282243,"journal":{"name":"2020 International Conference on Information and Communication Technology Convergence (ICTC)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Information and Communication Technology Convergence (ICTC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTC49870.2020.9289314","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

In this paper, we propose real-time image-based recognition of human activities from a series of images, considering different human actions performed in an indoor environment. The proposed image-based human activity recognition (IHAR) system can be utilized for assisting the lives of disabled persons, surveillance and human tracking, human-computer interaction, and efficient resource utilization. The proposed IHAR system consists of closed-circuit television (CCTV) camera-based image acquisition, filtering-based image enhancement, principal component analysis (PCA)-based feature extraction, and several machine learning algorithms for comparison of recognition accuracy. We collected a dataset of 10 different activities, such as walking, sitting down, and standing up, consisting of 35,530 images. The dataset is divided into (90%, 10%), (80%, 20%), and (70%, 30%) training and testing splits, respectively, and three classifiers are evaluated: K-nearest neighbors (KNN), Random Forest (RF), and Decision Tree (DT). The experimental results show accuracies of 95%, 97%, and 90% for KNN, RF, and DT, respectively.
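The sketch below illustrates the kind of pipeline the abstract describes: flattened frames are reduced with PCA and then classified with KNN, Random Forest, and Decision Tree models on one of the reported train/test splits. The paper does not publish its code or dataset, so this is a minimal illustration using scikit-learn with randomly generated placeholder data; the number of PCA components and all classifier hyperparameters are assumptions, not values reported by the authors.

```python
# Minimal sketch of an IHAR-style pipeline: PCA feature extraction followed by
# KNN, RF, and DT classifiers. Random arrays stand in for the 35,530 CCTV
# frames, which are not publicly available; all parameters are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_frames, img_h, img_w, n_classes = 1000, 64, 64, 10

# Placeholder data: each row is one flattened grayscale frame, each label one
# of the 10 indoor activities (walking, sitting down, standing up, ...).
X = rng.random((n_frames, img_h * img_w))
y = rng.integers(0, n_classes, size=n_frames)

# PCA-based feature extraction (50 components is an assumed value).
X_reduced = PCA(n_components=50).fit_transform(X)

# One of the paper's splits: 70% training, 30% testing.
X_train, X_test, y_train, y_test = train_test_split(
    X_reduced, y, test_size=0.3, random_state=0)

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "DT": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(f"{name} test accuracy: {clf.score(X_test, y_test):.2f}")
```

On real frames, the random arrays would be replaced by vectorized images (after the filtering-based enhancement step), and the (90%, 10%) and (80%, 20%) splits would be evaluated by changing `test_size` accordingly.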