An Effective Action Recognition Method Based on Image Coding and a Dual-Channel Fusion Network

IF 4.3 · JCR Q1 (ENGINEERING, ELECTRICAL & ELECTRONIC) · CAS Region 2, Comprehensive
Yukun Wang;Junlong Zhu
{"title":"An Effective Action Recognition Method Based on Image Coding and a Dual-Channel Fusion Network","authors":"Yukun Wang;Junlong Zhu","doi":"10.1109/JSEN.2025.3596568","DOIUrl":null,"url":null,"abstract":"Action recognition is a research hotspot in artificial intelligence, with significant applications in intelligent sports analysis, health monitoring, and human–computer interaction. Traditional methods rely on high-frame-rate cameras or complex motion capture systems, which are costly and highly dependent on environmental conditions. In contrast, data-driven methods based on wearable sensors have gained widespread attention due to their portability and cost-effectiveness. In this article, we propose an action recognition method based on image encoding and a dual-channel feature extraction network. We convert time-series data collected from wearable sensors into color images through image encoding, fully preserving the temporal information and multidimensional feature relationships in the data. Then, we design a dual-channel feature extraction network that extracts complex features using a multiscale spatial channel attention (MSCA) module, a dual-stream alternating feature fusion (DAF) module, and a weighted loss function (WFL). We conducted experiments on the USC-HAD and PAMAP2 datasets, demonstrating that our method outperforms several state-of-the-art methods. Ablation studies further verify the contributions of the backbone network, fusion module, classifier, and loss function to the overall performance. 
Overall, our method provides a new solution for action recognition tasks and shows broad application prospects.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 18","pages":"35144-35156"},"PeriodicalIF":4.3000,"publicationDate":"2025-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/11124422/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
引用次数: 0

Abstract

Action recognition is a research hotspot in artificial intelligence, with significant applications in intelligent sports analysis, health monitoring, and human–computer interaction. Traditional methods rely on high-frame-rate cameras or complex motion capture systems, which are costly and highly dependent on environmental conditions. In contrast, data-driven methods based on wearable sensors have gained widespread attention due to their portability and cost-effectiveness. In this article, we propose an action recognition method based on image encoding and a dual-channel feature extraction network. We convert time-series data collected from wearable sensors into color images through image encoding, fully preserving the temporal information and multidimensional feature relationships in the data. Then, we design a dual-channel feature extraction network that extracts complex features using a multiscale spatial channel attention (MSCA) module, a dual-stream alternating feature fusion (DAF) module, and a weighted loss function (WFL). We conducted experiments on the USC-HAD and PAMAP2 datasets, demonstrating that our method outperforms several state-of-the-art methods. Ablation studies further verify the contributions of the backbone network, fusion module, classifier, and loss function to the overall performance. Overall, our method provides a new solution for action recognition tasks and shows broad application prospects.
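The abstract describes encoding wearable-sensor time series as color images before feature extraction, but does not specify the encoding scheme. As an illustrative sketch only, the following shows one common choice for this step, a Gramian Angular Field (GAF), with three sensor axes stacked as RGB channels; the function names and the use of GAF are assumptions for illustration, not the authors' confirmed method.

```python
import numpy as np

def gramian_angular_field(x):
    """Encode a 1-D time series as a Gramian Angular Summation Field.

    The series is rescaled to [-1, 1], mapped to polar angles, and the
    pairwise angular sums form a 2-D image whose rows and columns both
    follow temporal order, preserving temporal structure.
    """
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    # Min-max rescale to [-1, 1]; guard against a constant series.
    x = 2 * (x - x.min()) / rng - 1 if rng > 0 else np.zeros_like(x)
    phi = np.arccos(np.clip(x, -1.0, 1.0))      # polar-angle encoding
    # GASF[i, j] = cos(phi_i + phi_j)
    return np.cos(phi[:, None] + phi[None, :])

def encode_rgb(ax, ay, az):
    """Stack three sensor axes (e.g., accelerometer x/y/z) into an
    H x W x 3 'color image', one GAF per channel."""
    return np.stack([gramian_angular_field(a) for a in (ax, ay, az)], axis=-1)

# Example: a 128-sample tri-axial window becomes a 128 x 128 x 3 image
# suitable as input to a 2-D convolutional network.
t = np.linspace(0, 4 * np.pi, 128)
img = encode_rgb(np.sin(t), np.cos(t), np.sin(2 * t))
print(img.shape)  # (128, 128, 3)
```

Mapping each axis to a channel is one simple way to retain the multidimensional feature relationships the abstract mentions; other encodings (recurrence plots, Markov transition fields) fit the same pipeline.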
Source Journal
IEEE Sensors Journal
Engineering & Technology · Engineering: Electrical & Electronic
CiteScore: 7.70
Self-citation rate: 14.00%
Articles per year: 2058
Review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:

- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data; detection, estimation, and classification based on sensor data)
- Sensors in Industrial Practice