Contrastive Self-supervised Learning for Sensor-based Human Activity Recognition

Bulat Khaertdinov, E. Ghaleb, S. Asteriadis
{"title":"基于传感器的人类活动识别的对比自监督学习","authors":"Bulat Khaertdinov, E. Ghaleb, S. Asteriadis","doi":"10.1109/IJCB52358.2021.9484410","DOIUrl":null,"url":null,"abstract":"Deep Learning models, applied to a sensor-based Human Activity Recognition task, usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals coming from wearable sensors can be a tedious and, often, not so intuitive process, that requires specialized tools and predefined scenarios, making it an expensive and time-consuming task. This paper combines one of the most recent advances in Self-Supervised Leaning (SSL), namely a SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets have shown that the proposed method outperforms recent SSL models. Moreover, CSSHAR is capable of extracting more robust features than the identical supervised transformer when transferring knowledge from one dataset to another as well as when very limited amounts of annotated data are available.","PeriodicalId":175984,"journal":{"name":"2021 IEEE International Joint Conference on Biometrics (IJCB)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"27","resultStr":"{\"title\":\"Contrastive Self-supervised Learning for Sensor-based Human Activity Recognition\",\"authors\":\"Bulat Khaertdinov, E. Ghaleb, S. Asteriadis\",\"doi\":\"10.1109/IJCB52358.2021.9484410\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep Learning models, applied to a sensor-based Human Activity Recognition task, usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals coming from wearable sensors can be a tedious and, often, not so intuitive process, that requires specialized tools and predefined scenarios, making it an expensive and time-consuming task. This paper combines one of the most recent advances in Self-Supervised Leaning (SSL), namely a SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets have shown that the proposed method outperforms recent SSL models. 
Moreover, CSSHAR is capable of extracting more robust features than the identical supervised transformer when transferring knowledge from one dataset to another as well as when very limited amounts of annotated data are available.\",\"PeriodicalId\":175984,\"journal\":{\"name\":\"2021 IEEE International Joint Conference on Biometrics (IJCB)\",\"volume\":\"61 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"27\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE International Joint Conference on Biometrics (IJCB)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCB52358.2021.9484410\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Joint Conference on Biometrics (IJCB)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCB52358.2021.9484410","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 27

Abstract

Deep Learning models, applied to a sensor-based Human Activity Recognition task, usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals coming from wearable sensors can be a tedious and, often, not so intuitive process that requires specialized tools and predefined scenarios, making it an expensive and time-consuming task. This paper combines one of the most recent advances in Self-Supervised Learning (SSL), namely the SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets have shown that the proposed method outperforms recent SSL models. Moreover, CSSHAR is capable of extracting more robust features than the identical supervised transformer when transferring knowledge from one dataset to another, as well as when very limited amounts of annotated data are available.
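To make the approach described in the abstract more concrete, the following is a minimal sketch of SimCLR-style contrastive pretraining on windows of wearable-sensor data, written in PyTorch. It is not the authors' implementation: the jitter/scaling augmentations, encoder dimensions, temperature value, and all function names below are illustrative assumptions; the paper's actual augmentations, transformer configuration, and training schedule may differ.

```python
# Minimal sketch (assumed, not the authors' code) of SimCLR-style contrastive
# pretraining for sensor windows with a transformer encoder and the NT-Xent loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


def augment(x, jitter_scale=0.05):
    """Create one stochastic view of a batch of sensor windows (B, T, C)
    via random jitter and per-channel scaling (assumed augmentations)."""
    noise = torch.randn_like(x) * jitter_scale
    scale = 1.0 + 0.1 * torch.randn(x.size(0), 1, x.size(2), device=x.device)
    return (x + noise) * scale


class TransformerEncoder(nn.Module):
    """Conv input projection + transformer blocks, pooled into one embedding per window."""
    def __init__(self, in_channels=6, d_model=128, n_heads=4, n_layers=4, proj_dim=64):
        super().__init__()
        self.input_proj = nn.Conv1d(in_channels, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=256, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Projection head used only during contrastive pretraining, as in SimCLR.
        self.projection = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                        nn.Linear(d_model, proj_dim))

    def forward(self, x):                         # x: (B, T, C)
        h = self.input_proj(x.transpose(1, 2))    # (B, d_model, T)
        h = self.transformer(h.transpose(1, 2))   # (B, T, d_model)
        h = h.mean(dim=1)                         # temporal average pooling
        return self.projection(h)                 # (B, proj_dim)


def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent loss over two augmented views; positives are the (i, i + B) pairs."""
    batch = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)          # (2B, D)
    sim = z @ z.t() / temperature                                # cosine similarities
    mask = torch.eye(2 * batch, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))                   # drop self-similarity
    targets = torch.cat([torch.arange(batch, 2 * batch),
                         torch.arange(0, batch)]).to(z.device)
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    encoder = TransformerEncoder()
    optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    windows = torch.randn(32, 100, 6)             # 32 windows, 100 timesteps, 6 channels
    optimizer.zero_grad()
    z1, z2 = encoder(augment(windows)), encoder(augment(windows))
    loss = nt_xent_loss(z1, z2)
    loss.backward()
    optimizer.step()
    print(f"contrastive loss: {loss.item():.4f}")
```

After pretraining, standard SimCLR practice is to discard the projection head and either train a linear classifier on top of the frozen encoder or fine-tune the whole network on the (possibly very limited) labelled data; the abstract's transfer and low-label experiments fit this evaluation pattern, though the exact protocol is described in the paper itself.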