A Lightweight and Explainable Hybrid Deep Learning Model for Wearable Sensor-Based Human Activity Recognition

Impact Factor 4.3 · CAS Zone 2 (Comprehensive Journal) · JCR Q1 (Engineering, Electrical & Electronic)
Pratibha Tokas; Vijay Bhaskar Semwal; Sweta Jain
{"title":"A Lightweight and Explainable Hybrid Deep Learning Model for Wearable Sensor-Based Human Activity Recognition","authors":"Pratibha Tokas;Vijay Bhaskar Semwal;Sweta Jain","doi":"10.1109/JSEN.2025.3564045","DOIUrl":null,"url":null,"abstract":"Human activity recognition (HAR) is critical for rehabilitation and clinical monitoring, but robust recognition using wearable sensors (e.g., sEMG or IMU) remains challenging due to signal noise and variability. We propose X-LiteHAR, a lightweight, explainable hybrid deep learning framework for real-time HAR, combining adaptive EEMD for noise-robust signal enhancement and a multihead CNN-LSTM for spatio-temporal feature learning. The optimized framework demonstrates efficient edge deployment through structured pruning and quantization, achieving 70% model size reduction while maintaining competitive performance, with on-device validation on an Android OnePlus 6T smartphone showing 9 ms inference latency. The model was trained and evaluated independently on two distinct datasets: 1) the UCI sEMG dataset (muscle activity signals) and 2) the IMU-only MHealth dataset (motion signals), demonstrating the architecture’s adaptability to different sensor modalities. On the UCI dataset, X-LiteHAR achieved 99.0% accuracy (healthy subjects) and 98.7% (pathological), while on MHealth (IMU-only), it reached 99.2% accuracy. Leveraging explainable AI (XAI), we interpret muscle activation patterns for personalized rehabilitation insights. By unifying signal processing, efficient deep learning, and interpretability, X-LiteHAR advances real-time HAR for clinical and wearable applications.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 12","pages":"22618-22628"},"PeriodicalIF":4.3000,"publicationDate":"2025-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10981518/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}

Abstract

Human activity recognition (HAR) is critical for rehabilitation and clinical monitoring, but robust recognition using wearable sensors, such as surface electromyography (sEMG) or inertial measurement units (IMUs), remains challenging due to signal noise and variability. We propose X-LiteHAR, a lightweight, explainable hybrid deep learning framework for real-time HAR that combines adaptive ensemble empirical mode decomposition (EEMD) for noise-robust signal enhancement with a multihead CNN-LSTM for spatiotemporal feature learning. The optimized framework demonstrates efficient edge deployment through structured pruning and quantization, achieving a 70% model size reduction while maintaining competitive performance; on-device validation on an Android OnePlus 6T smartphone shows 9 ms inference latency. The model was trained and evaluated independently on two distinct datasets: 1) the UCI sEMG dataset (muscle activity signals) and 2) the IMU-only MHealth dataset (motion signals), demonstrating the architecture's adaptability to different sensor modalities. On the UCI dataset, X-LiteHAR achieved 99.0% accuracy on healthy subjects and 98.7% on pathological subjects, while on MHealth (IMU-only) it reached 99.2% accuracy. Leveraging explainable AI (XAI), we interpret muscle activation patterns to derive personalized rehabilitation insights. By unifying signal processing, efficient deep learning, and interpretability, X-LiteHAR advances real-time HAR for clinical and wearable applications.
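The abstract does not include an architecture listing, so the sketch below is only a minimal illustration of the kind of multihead CNN-LSTM classifier it describes, written in Keras. The window length, channel count, number of heads, kernel sizes, layer widths, and the helper name build_multihead_cnn_lstm are all assumptions chosen for readability rather than the authors' configuration, and the EEMD denoising and pruning/quantization stages are omitted.

# Illustrative sketch only -- the hyperparameters below are assumptions,
# not the published X-LiteHAR configuration.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_multihead_cnn_lstm(window_len=128, n_channels=9, n_classes=12,
                             kernel_sizes=(3, 5, 7)):
    """One Conv1D 'head' per kernel size; heads are concatenated and fed to an LSTM."""
    inputs = layers.Input(shape=(window_len, n_channels))  # one window of sensor samples
    heads = []
    for k in kernel_sizes:
        # Each head extracts local features at its own temporal scale.
        h = layers.Conv1D(32, kernel_size=k, padding="same", activation="relu")(inputs)
        h = layers.MaxPooling1D(pool_size=2)(h)
        heads.append(h)
    x = layers.Concatenate()(heads)   # merge multi-scale feature maps
    x = layers.LSTM(64)(x)            # model temporal dynamics across the window
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return Model(inputs, outputs)

model = build_multihead_cnn_lstm()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

Parallel heads with different kernel sizes let the network respond to activation patterns at several time scales before the LSTM aggregates them. To approach the edge-deployment figures the abstract reports, such a model would then typically undergo structured pruning (e.g., via the TensorFlow Model Optimization toolkit) and post-training quantization to a TFLite model before running on an Android device.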
Source journal

IEEE Sensors Journal (Engineering, Electrical & Electronic)

CiteScore: 7.70
Self-citation rate: 14.00%
Articles per year: 2058
Average review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensors-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative, sensor data; detection, estimation, and classification based on sensor data)
- Sensors in Industrial Practice