GT-WHAR: A Generic Graph-Based Temporal Framework for Wearable Human Activity Recognition With Multiple Sensors

IF 5.3 · CAS Tier 3 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence)
Hailin Zou;Zijie Chen;Jing Zhang;Lei Wang;Fuchun Zhang;Jianqing Li;Yuanyuan Pan
{"title":"GT-WHAR: A Generic Graph-Based Temporal Framework for Wearable Human Activity Recognition With Multiple Sensors","authors":"Hailin Zou;Zijie Chen;Jing Zhang;Lei Wang;Fuchun Zhang;Jianqing Li;Yuanyuan Pan","doi":"10.1109/TETCI.2024.3378331","DOIUrl":null,"url":null,"abstract":"Using wearable sensors to identify human activities has elicited significant interest within the discipline of ubiquitous computing for everyday facilitation. Recent research has employed hybrid models to better leverage the modal information of sensors and temporal information, enabling improved performance for wearable human activity recognition. Nevertheless, the lack of effective exploitation of human structural information and limited capacity for cross-channel fusion remains a major challenge. This study proposes a generic design, called GT-WHAR, to accommodate the varying application scenarios and datasets while performing effective feature extraction and fusion. Firstly, a novel and unified representation paradigm, namely \n<italic>Body-Sensing Graph Representation</i>\n, has been proposed to represent body movement by a graph set, which incorporates structural information by considering the intrinsic connectivity of the skeletal structure. Secondly, the newly designed \n<italic>Body-Node Attention Graph Network</i>\n employs graph neural networks to extract and fuse the cross-channel information within the graph set. Eventually, the graph network has been embedded in the proposed \n<italic>Bidirectional Temporal Learning Network</i>\n, facilitating the extraction of temporal information in conjunction with the learned structural features. GT-WHAR outperformed the state-of-the-art methods in extensive experiments conducted on benchmark datasets, proving its validity and efficacy. Besides, we have demonstrated the generality of the framework through multiple research questions and provided an in-depth investigation of various influential factors.","PeriodicalId":13135,"journal":{"name":"IEEE Transactions on Emerging Topics in Computational Intelligence","volume":"8 6","pages":"3912-3924"},"PeriodicalIF":5.3000,"publicationDate":"2024-04-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Emerging Topics in Computational Intelligence","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10483025/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Using wearable sensors to identify human activities has elicited significant interest in ubiquitous computing, where it supports everyday applications. Recent research has employed hybrid models to better leverage the modal information of sensors together with temporal information, improving performance in wearable human activity recognition. Nevertheless, the lack of effective exploitation of human structural information and the limited capacity for cross-channel fusion remain major challenges. This study proposes a generic design, called GT-WHAR, that accommodates varying application scenarios and datasets while performing effective feature extraction and fusion. First, a novel and unified representation paradigm, the Body-Sensing Graph Representation, is proposed to represent body movement as a graph set, incorporating structural information through the intrinsic connectivity of the skeletal structure. Second, the newly designed Body-Node Attention Graph Network employs graph neural networks to extract and fuse cross-channel information within the graph set. Finally, the graph network is embedded in the proposed Bidirectional Temporal Learning Network, facilitating the extraction of temporal information in conjunction with the learned structural features. GT-WHAR outperformed state-of-the-art methods in extensive experiments on benchmark datasets, demonstrating its validity and efficacy. In addition, we demonstrate the generality of the framework through multiple research questions and provide an in-depth investigation of various influential factors.
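The abstract outlines a three-stage pipeline: per-timestep body-sensing graphs built from skeletal connectivity, attention-based graph fusion across sensor channels, and bidirectional temporal learning over the fused features. As a rough illustration of how such a pipeline fits together, here is a minimal PyTorch sketch; the class names, the GAT-style attention layer, the BiGRU temporal module, and all dimensions are our own assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over body-sensor nodes (GAT-style)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (batch, nodes, in_dim); adj: (nodes, nodes) skeletal connectivity
        h = self.proj(x)                                        # (B, N, D)
        B, N, D = h.shape
        hi = h.unsqueeze(2).expand(B, N, N, D)                  # node i features
        hj = h.unsqueeze(1).expand(B, N, N, D)                  # node j features
        e = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)  # (B, N, N)
        e = e.masked_fill(adj == 0, float('-inf'))              # keep only skeletal edges
        alpha = torch.softmax(F.leaky_relu(e), dim=-1)          # attention weights
        return torch.relu(alpha @ h)                            # fused node features

class GTWHARSketch(nn.Module):
    """Graph attention per time step, then a bidirectional GRU over time."""
    def __init__(self, num_nodes, in_dim, graph_dim, temporal_dim, num_classes):
        super().__init__()
        self.gat = GraphAttentionLayer(in_dim, graph_dim)
        self.bigru = nn.GRU(num_nodes * graph_dim, temporal_dim,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * temporal_dim, num_classes)

    def forward(self, x, adj):
        # x: (batch, time, nodes, channels) -- one body-sensing graph per step
        B, T, N, C = x.shape
        g = self.gat(x.reshape(B * T, N, C), adj).reshape(B, T, -1)
        out, _ = self.bigru(g)          # bidirectional temporal learning
        return self.head(out[:, -1])    # activity logits

# Toy usage: 5 body-worn IMU nodes with a chain-like skeletal adjacency.
adj = torch.eye(5)
for i in range(4):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
model = GTWHARSketch(num_nodes=5, in_dim=6, graph_dim=16,
                     temporal_dim=32, num_classes=8)
logits = model(torch.randn(2, 50, 5, 6), adj)  # -> shape (2, 8)
```

The point the abstract emphasizes is captured by the adjacency mask: attention is restricted to anatomically connected sensor placements rather than learned over a fully connected channel set, which is how the graph representation injects skeletal structure into cross-channel fusion.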
Source Journal
CiteScore: 10.30
Self-citation rate: 7.50%
Articles published per year: 147
Journal Description: The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys. TETCI is an electronic-only publication and publishes six issues per year. Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for IoT and Smart-X technologies.