A Novel Deep BiGRU-ResNet Model for Human Activity Recognition using Smartphone Sensors

S. Mekruksavanich, Ponnipa Jantawong, Narit Hnoohom, A. Jitpattanakul
{"title":"A Novel Deep BiGRU-ResNet Model for Human Activity Recognition using Smartphone Sensors","authors":"S. Mekruksavanich, Ponnipa Jantawong, Narit Hnoohom, A. Jitpattanakul","doi":"10.1109/jcsse54890.2022.9836276","DOIUrl":null,"url":null,"abstract":"Human activity recognition (HAR) employing wearable sensors is utilized in several implementations, including remote health monitoring and exercise performance. The most widely used HAR research is inspired by traditional machine learning and developing methodologies using deep learning. Whereas machine learning techniques have proven effective in resolving HAR, these require human feature extraction. Consequently, deep learning methods have been designed to circumvent this constraint autonomously rather than manually extracting information. This paper provides an innovative deep residual learning approach based on LSTM-CNN and deep residual modeling techniques. The objective of the proposed model, BiGRUResNet, was to increase accuracy while decreasing the number of parameters. Two BiGRU layers, three residual layers, one global average pooling layer, and one softmax layer are present. Utilizing a publicly recognized UCI-HAR dataset, the proposed model was analyzed. Results of the experiment indicate that the proposed model outperforms previous deep learning-based models in 5-fold cross-validation, with a 99.09% accuracy and 99.15% F1 score.","PeriodicalId":284735,"journal":{"name":"2022 19th International Joint Conference on Computer Science and Software Engineering (JCSSE)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 19th International Joint Conference on Computer Science and Software Engineering (JCSSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/jcsse54890.2022.9836276","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Human activity recognition (HAR) using wearable sensors is employed in many applications, including remote health monitoring and exercise performance assessment. Most HAR research is built on traditional machine learning or, increasingly, on deep learning methodologies. Although machine learning techniques have proven effective for HAR, they require manual feature extraction. Deep learning methods have therefore been designed to extract informative features automatically and circumvent this constraint. This paper presents an innovative deep learning approach that builds on hybrid recurrent-convolutional (LSTM-CNN) and deep residual modeling techniques. The objective of the proposed model, BiGRU-ResNet, is to increase accuracy while reducing the number of parameters. The network comprises two BiGRU layers, three residual layers, one global average pooling layer, and one softmax layer. The proposed model was evaluated on the publicly available UCI-HAR dataset. Experimental results indicate that it outperforms previous deep learning-based models under 5-fold cross-validation, achieving 99.09% accuracy and a 99.15% F1-score.
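The abstract names the layer composition (two BiGRU layers, three residual layers, global average pooling, softmax) but not layer sizes, kernel widths, or the framework used. The sketch below is a minimal, hypothetical Keras reconstruction of a BiGRU-ResNet-style network for UCI-HAR-shaped input (128-sample windows, 9 inertial channels, 6 activity classes); the unit counts, filter counts, and the use of 1-D convolutional residual blocks are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical BiGRU-ResNet-style sketch; layer sizes are assumptions, not the authors' values.
import tensorflow as tf
from tensorflow.keras import layers, models

def residual_block(x, filters, kernel_size=3):
    """1-D convolutional residual block with an identity (or projection) shortcut."""
    shortcut = x
    y = layers.Conv1D(filters, kernel_size, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv1D(filters, kernel_size, padding="same")(y)
    y = layers.BatchNormalization()(y)
    if shortcut.shape[-1] != filters:  # project the shortcut when channel counts differ
        shortcut = layers.Conv1D(filters, 1, padding="same")(shortcut)
    y = layers.Add()([shortcut, y])
    return layers.ReLU()(y)

def build_bigru_resnet(window_len=128, n_channels=9, n_classes=6):
    inputs = layers.Input(shape=(window_len, n_channels))
    # Two stacked bidirectional GRU layers, returning full sequences for the residual stack.
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(inputs)
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
    # Three residual layers, as stated in the abstract (filter counts assumed).
    for filters in (64, 128, 128):
        x = residual_block(x, filters)
    # Global average pooling followed by a softmax classifier.
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_bigru_resnet()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Under these assumptions, the reported accuracy and F1-score would be obtained by repeating training and evaluation across 5-fold cross-validation splits of the windowed UCI-HAR signals (e.g., with sklearn.model_selection.StratifiedKFold), rather than from a single train/test split.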