Human Activity Recognition Based on Transformer via Smart-phone Sensors

Y. Liang, Kaile Feng, Zizhuo Ren
{"title":"Human Activity Recognition Based on Transformer via Smart-phone Sensors","authors":"Y. Liang, Kaile Feng, Zizhuo Ren","doi":"10.1109/CCAI57533.2023.10201297","DOIUrl":null,"url":null,"abstract":"Capturing the spatial and temporal relationships of time-series signals is a significant obstacle for human activity recognition based on wearable devices. Traditional artificial intelligence algorithms cannot handle it well, with convolution-based models focusing on local feature extraction and recurrent networks lacking consideration of the spatial domain. This paper offers a deep learning architecture based on transformer to address the aforementioned issue with data collected from smart-phones embedded with three-axis accelerometers. The transformer model, as a deep learning network mainly applied to natural language processing (NLP), is good at processing time-series information, where the self-attention mechanism captures the dependencies of perceptual signals in the temporal and spatial domains, improving the overall comprehensibility. 
We implement convolutional neural networks (CNN) and long and short-term memory networks (LSTM) for evaluation while our proposed model achieves an average classification accuracy of 94.84%, which is an improvement compared to the traditional model.","PeriodicalId":285760,"journal":{"name":"2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence (CCAI)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence (CCAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCAI57533.2023.10201297","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Capturing the spatial and temporal relationships of time-series signals is a significant obstacle for human activity recognition based on wearable devices. Traditional artificial intelligence algorithms handle this poorly: convolution-based models focus on local feature extraction, while recurrent networks lack consideration of the spatial domain. This paper offers a transformer-based deep learning architecture to address this issue, using data collected from smartphones embedded with three-axis accelerometers. The transformer, a deep learning model mainly applied to natural language processing (NLP), is well suited to processing time-series information: its self-attention mechanism captures the dependencies of perceptual signals in the temporal and spatial domains, improving overall comprehensibility. We implement convolutional neural networks (CNN) and long short-term memory (LSTM) networks for evaluation; our proposed model achieves an average classification accuracy of 94.84%, an improvement over these traditional models.
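The core mechanism the abstract refers to, scaled dot-product self-attention over a window of sensor samples, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, projection matrices, and random inputs are assumptions chosen only to show how each time step attends to every other step in a three-axis accelerometer window.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over one sensor window.

    x : (T, d) window of T samples with d features per sample.
    wq, wk, wv : (d, d) learned projection matrices (random here).
    Returns a (T, d) output where each time step is a weighted mix
    of all time steps, i.e. the temporal dependencies the paper exploits.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])           # (T, T) pairwise affinities
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over time steps
    return weights @ v                                # context-mixed features

rng = np.random.default_rng(0)
T, d = 128, 3                          # assumed: 128-sample window, 3 axes (x, y, z)
window = rng.standard_normal((T, d))   # stand-in for accelerometer readings
wq, wk, wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
out = self_attention(window, wq, wk, wv)
print(out.shape)  # (128, 3)
```

In a full classifier along the lines the paper describes, blocks like this (with multiple heads, positional encodings, and feed-forward layers) would be stacked, and the resulting features pooled into a softmax over activity classes.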