SCAGOT: Semi-Supervised Disentangling Context and Activity Features Without Target Data for Sensor-Based HAR

IF 4.3 | CAS Tier 2 (Multidisciplinary) | JCR Q1, ENGINEERING, ELECTRICAL & ELECTRONIC
Furong Duan;Tao Zhu;Liming Chen;Huansheng Ning;Chao Liu;Yaping Wan
{"title":"基于传感器的HAR的无目标数据的半监督解纠缠上下文和活动特征","authors":"Furong Duan;Tao Zhu;Liming Chen;Huansheng Ning;Chao Liu;Yaping Wan","doi":"10.1109/JSEN.2025.3543928","DOIUrl":null,"url":null,"abstract":"Classic deep learning methods for human activity recognition (HAR) from wearable sensors struggle with cross-person and cross-position challenges due to nonidentical data distributions caused by context variations (e.g., user, sensor placement). Existing solutions show promise but usually require extensive labeled data from source and target contexts, which is often unavailable in real-world scenarios. To address these limitations, we introduce semi-supervised context agnostic representation learning without target (SCAGOT), a novel semi-supervised approach that learns context-agnostic activity representations without relying on target context data. SCAGOT uses a dual-stream architecture with adversarial disentanglement and a contrastive clustering mechanism. This effectively separates context features from context-agnostic activity features, maximizing intraclass compactness and interclass separability in the activity representation space. In addition, a new inverse cross-entropy loss further refines the representations by removing residual context information. Extensive evaluations on four benchmark datasets demonstrate that SCAGOT outperforms state-of-the-art methods in cross-person and cross-position HAR, offering a promising solution for robust real-world activity recognition.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 8","pages":"14220-14234"},"PeriodicalIF":4.3000,"publicationDate":"2025-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SCAGOT: Semi-Supervised Disentangling Context and Activity Features Without Target Data for Sensor-Based HAR\",\"authors\":\"Furong Duan;Tao Zhu;Liming Chen;Huansheng Ning;Chao Liu;Yaping Wan\",\"doi\":\"10.1109/JSEN.2025.3543928\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Classic deep learning methods for human activity recognition (HAR) from wearable sensors struggle with cross-person and cross-position challenges due to nonidentical data distributions caused by context variations (e.g., user, sensor placement). Existing solutions show promise but usually require extensive labeled data from source and target contexts, which is often unavailable in real-world scenarios. To address these limitations, we introduce semi-supervised context agnostic representation learning without target (SCAGOT), a novel semi-supervised approach that learns context-agnostic activity representations without relying on target context data. SCAGOT uses a dual-stream architecture with adversarial disentanglement and a contrastive clustering mechanism. This effectively separates context features from context-agnostic activity features, maximizing intraclass compactness and interclass separability in the activity representation space. In addition, a new inverse cross-entropy loss further refines the representations by removing residual context information. 
Extensive evaluations on four benchmark datasets demonstrate that SCAGOT outperforms state-of-the-art methods in cross-person and cross-position HAR, offering a promising solution for robust real-world activity recognition.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 8\",\"pages\":\"14220-14234\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-03-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10909234/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10909234/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Classic deep learning methods for human activity recognition (HAR) from wearable sensors struggle with cross-person and cross-position challenges due to nonidentical data distributions caused by context variations (e.g., user, sensor placement). Existing solutions show promise but usually require extensive labeled data from source and target contexts, which is often unavailable in real-world scenarios. To address these limitations, we introduce semi-supervised context agnostic representation learning without target (SCAGOT), a novel semi-supervised approach that learns context-agnostic activity representations without relying on target context data. SCAGOT uses a dual-stream architecture with adversarial disentanglement and a contrastive clustering mechanism. This effectively separates context features from context-agnostic activity features, maximizing intraclass compactness and interclass separability in the activity representation space. In addition, a new inverse cross-entropy loss further refines the representations by removing residual context information. Extensive evaluations on four benchmark datasets demonstrate that SCAGOT outperforms state-of-the-art methods in cross-person and cross-position HAR, offering a promising solution for robust real-world activity recognition.
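To make the abstract's pipeline concrete, the following is a minimal, hypothetical Python (PyTorch) sketch of the kind of dual-stream disentanglement it describes. The names (SCAGOTSketch, make_encoder, inverse_cross_entropy), the gradient-reversal implementation of the adversarial branch, and the reading of the inverse cross-entropy loss as cross-entropy against a uniform context distribution are illustrative assumptions, not taken from the paper; the contrastive clustering term on unlabeled data is omitted.

# Hypothetical sketch (not the authors' code): two encoders split a sensor window into
# a context stream and a context-agnostic activity stream; a context head trained
# through a gradient-reversal layer removes context cues from the activity features,
# and an assumed "inverse cross-entropy" term pushes context predictions made from
# activity features toward the uniform distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass, negated (scaled) gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


def make_encoder(in_ch: int, feat_dim: int) -> nn.Module:
    # 1-D CNN over (batch, channels, time) sensor windows; the architecture is illustrative.
    return nn.Sequential(
        nn.Conv1d(in_ch, 64, kernel_size=5, padding=2), nn.ReLU(),
        nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
        nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        nn.Linear(64, feat_dim),
    )


class SCAGOTSketch(nn.Module):
    def __init__(self, in_ch=6, feat_dim=128, n_activities=6, n_contexts=4):
        super().__init__()
        self.activity_encoder = make_encoder(in_ch, feat_dim)  # context-agnostic stream
        self.context_encoder = make_encoder(in_ch, feat_dim)   # context stream
        self.activity_head = nn.Linear(feat_dim, n_activities)
        self.context_head = nn.Linear(feat_dim, n_contexts)

    def forward(self, x, lam=1.0):
        z_act = self.activity_encoder(x)
        z_ctx = self.context_encoder(x)
        act_logits = self.activity_head(z_act)
        ctx_logits = self.context_head(z_ctx)
        # Adversarial branch: the context head should fail on activity features.
        adv_ctx_logits = self.context_head(GradReverse.apply(z_act, lam))
        return act_logits, ctx_logits, adv_ctx_logits


def inverse_cross_entropy(ctx_logits_on_activity: torch.Tensor) -> torch.Tensor:
    # Assumed reading of the inverse cross-entropy idea: cross-entropy against a
    # uniform target, penalizing residual context information in the activity stream.
    n_ctx = ctx_logits_on_activity.size(1)
    uniform = torch.full_like(ctx_logits_on_activity, 1.0 / n_ctx)
    return F.cross_entropy(ctx_logits_on_activity, uniform)


if __name__ == "__main__":
    # One training step on a labeled source batch; the contrastive clustering loss on
    # unlabeled data described in the abstract is omitted for brevity.
    model = SCAGOTSketch()
    x = torch.randn(8, 6, 128)         # 8 windows, 6 channels, 128 time steps
    y_act = torch.randint(0, 6, (8,))  # activity labels
    y_ctx = torch.randint(0, 4, (8,))  # context labels (e.g., user or sensor position)

    act_logits, ctx_logits, adv_ctx_logits = model(x)
    loss = (F.cross_entropy(act_logits, y_act)         # recognize activities
            + F.cross_entropy(ctx_logits, y_ctx)       # context stream learns context
            + inverse_cross_entropy(adv_ctx_logits))   # strip context from activity stream
    loss.backward()
    print(float(loss))

In this reading, the activity encoder is penalized whenever the context head can still recover context from its features, which is one standard way to realize the adversarial disentanglement the abstract describes.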
Source Journal
IEEE Sensors Journal (Engineering: Electrical & Electronic)
CiteScore: 7.70
Self-citation rate: 14.00%
Articles per year: 2058
Review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing and applications of devices for sensing and transducing physical, chemical and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensors-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data; detection, estimation and classification based on sensor data)
- Sensors in Industrial Practice