Comparison Study of Inertial Sensor Signal Combination for Human Activity Recognition based on Convolutional Neural Networks

Farhad Nazari, N. Mohajer, D. Nahavandi, A. Khosravi, S. Nahavandi
{"title":"Comparison Study of Inertial Sensor Signal Combination for Human Activity Recognition based on Convolutional Neural Networks","authors":"Farhad Nazari, N. Mohajer, D. Nahavandi, A. Khosravi, S. Nahavandi","doi":"10.1109/HSI55341.2022.9869457","DOIUrl":null,"url":null,"abstract":"Human Activity Recognition (HAR) is one of the essential building blocks of so many applications like security, monitoring, the internet of things and human-robot interaction. The research community has developed various methodologies to detect human activity based on various input types. However, most of the research in the field has been focused on applications other than human-in-the-centre applications. This paper focused on optimising the input signals to maximise the HAR performance from wearable sensors. A model based on Convolutional Neural Networks (CNN) has been proposed and trained on different signal combinations of three Inertial Measurement Units (IMU) that exhibit the movements of the dominant hand, leg and chest of the subject. The results demonstrate k-fold cross-validation accuracy between 99.77 and 99.98% for signals with the modality of 12 or higher. The performance of lower dimension signals, except signals containing information from both chest and ankle, was far inferior, showing between 73 and 85% accuracy.","PeriodicalId":282607,"journal":{"name":"2022 15th International Conference on Human System Interaction (HSI)","volume":"142 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 15th International Conference on Human System Interaction (HSI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HSI55341.2022.9869457","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Human Activity Recognition (HAR) is an essential building block of many applications, such as security, monitoring, the Internet of Things and human-robot interaction. The research community has developed various methodologies to detect human activity from different input types. However, most research in the field has focused on applications other than those with the human at the centre. This paper focuses on optimising the input signals to maximise HAR performance from wearable sensors. A model based on Convolutional Neural Networks (CNN) is proposed and trained on different signal combinations from three Inertial Measurement Units (IMUs) that capture the movements of the subject's dominant hand, leg and chest. The results demonstrate k-fold cross-validation accuracy between 99.77% and 99.98% for signals with a modality of 12 or higher. The performance of lower-dimensional signals, except those containing information from both the chest and ankle, was far inferior, with accuracy between 73% and 85%.
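The abstract does not specify the network architecture or hyperparameters, so the following is only a minimal sketch of how a 1D CNN classifier over windowed IMU channels might look, assuming PyTorch, a fixed window of 128 samples, and a configurable channel count that changes with the signal combination being evaluated (e.g. a single accelerometer contributes 3 channels, while combining accelerometer and gyroscope streams from several IMUs yields 12 or more); the layer sizes, window length and class count are illustrative, not taken from the paper.

```python
# Minimal sketch (not the authors' model): a 1D CNN that classifies fixed-length
# windows of stacked IMU channels. The channel count varies with the signal
# combination under test (e.g. one accelerometer = 3, several full IMUs = 12+).
import torch
import torch.nn as nn

class HARConvNet(nn.Module):
    def __init__(self, in_channels: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling keeps the head independent of window length
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# Example: a 12-channel combination (hypothetical chest + ankle IMUs,
# accelerometer + gyroscope each) and 6 activity classes.
model = HARConvNet(in_channels=12, n_classes=6)
dummy = torch.randn(8, 12, 128)   # batch of 8 windows, 128 samples per window
print(model(dummy).shape)         # -> torch.Size([8, 6])
```

Under this setup, comparing signal combinations amounts to instantiating the same model with different `in_channels` values and reporting k-fold cross-validation accuracy for each combination, which mirrors the comparison the abstract describes.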