Yanhui Ren , Di Wang , Lingling An , Shiwen Mao , Xuyu Wang
{"title":"Quantum contrastive learning for human activity recognition","authors":"Yanhui Ren , Di Wang , Lingling An , Shiwen Mao , Xuyu Wang","doi":"10.1016/j.smhl.2025.100574","DOIUrl":null,"url":null,"abstract":"<div><div>Deep learning techniques have been widely used for human activity recognition (HAR) applications. The major challenge lies in obtaining high-quality, large-scale labeled sensor datasets. However, unlike datasets such as images or text, HAR sensor datasets are non-intuitive and uninterpretable, making manual labeling extremely difficult. Self-supervised learning has emerged to address this problem, which can learn from large-scale unlabeled datasets that are easier to collect. Nevertheless, self-supervised learning has the increased computational cost and the demand for larger deep neural networks. Recently, quantum machine learning has attracted widespread attention due to its powerful computational capability and feature extraction ability. In this paper, we aim to address this classical hardware bottleneck using quantum machine learning techniques. We propose QCLHAR, a quantum contrastive learning framework for HAR, which combines quantum machine learning techniques with contrastive learning to learn better latent representations. We evaluate the feasibility of the proposed framework on six publicly available datasets for HAR. The experimental results demonstrate the effectiveness of the framework for HAR, which can surpass or match the precision of classical contrastive learning with fewer parameters. 
This validates the effectiveness of our approach and demonstrates the significant potential of quantum technology in addressing the challenges associated with the scarcity of labeled sensory data.</div></div>","PeriodicalId":37151,"journal":{"name":"Smart Health","volume":"36 ","pages":"Article 100574"},"PeriodicalIF":0.0000,"publicationDate":"2025-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Smart Health","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352648325000352","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Health Professions","Score":null,"Total":0}
Citations: 0
Abstract
Deep learning techniques have been widely used for human activity recognition (HAR) applications. The major challenge lies in obtaining high-quality, large-scale labeled sensor datasets. However, unlike image or text datasets, HAR sensor datasets are non-intuitive and hard to interpret, making manual labeling extremely difficult. Self-supervised learning has emerged to address this problem, as it can learn from large-scale unlabeled datasets that are easier to collect. Nevertheless, self-supervised learning incurs higher computational cost and requires larger deep neural networks. Recently, quantum machine learning has attracted widespread attention due to its powerful computational capability and feature extraction ability. In this paper, we aim to address this classical hardware bottleneck using quantum machine learning techniques. We propose QCLHAR, a quantum contrastive learning framework for HAR, which combines quantum machine learning techniques with contrastive learning to learn better latent representations. We evaluate the feasibility of the proposed framework on six publicly available HAR datasets. The experimental results demonstrate the effectiveness of the framework for HAR: it surpasses or matches the precision of classical contrastive learning while using fewer parameters. This validates the effectiveness of our approach and demonstrates the significant potential of quantum technology in addressing the challenges associated with the scarcity of labeled sensory data.
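To make the core idea concrete, the sketch below combines the two ingredients the abstract names: a small parameterized quantum circuit acting as an encoder, and a contrastive (NT-Xent style) loss over two augmented views of the same sensor windows. This is a minimal illustration assuming a 2-qubit angle-encoding circuit simulated in NumPy, not the authors' QCLHAR implementation; the feature map, augmentation (jitter), and circuit layout are all hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Minimal 2-qubit statevector simulator (all-real gates) ---------------
def ry(angle):
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit):
    # state: (4,) amplitudes; qubit 0 is the most significant bit of the index
    full = np.kron(gate, np.eye(2)) if qubit == 0 else np.kron(np.eye(2), gate)
    return full @ state

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def circuit_embedding(features, thetas):
    """Angle-encode 2 features, apply one variational layer, read <Z> per qubit."""
    state = np.zeros(4); state[0] = 1.0                  # |00>
    for q in range(2):                                   # data encoding
        state = apply_single(state, ry(features[q]), q)
    for q in range(2):                                   # trainable rotations
        state = apply_single(state, ry(thetas[q]), q)
    state = CNOT @ state                                 # entangling gate
    probs = state ** 2                                   # real amplitudes
    z0 = probs[0] + probs[1] - probs[2] - probs[3]       # <Z> on qubit 0
    z1 = probs[0] - probs[1] + probs[2] - probs[3]       # <Z> on qubit 1
    return np.array([z0, z1])

# --- Contrastive (NT-Xent style) loss over two augmented views ------------
def nt_xent(za, zb, tau=0.5):
    z = np.concatenate([za, zb], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                       # exclude self-similarity
    n = len(za)
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -logp[np.arange(2 * n), targets].mean()

# Toy sensor windows -> 2 summary features (mean, std), two jittered views.
windows = rng.normal(size=(8, 50))

def featurize(w):  # hypothetical feature map: summary stats as rotation angles
    return np.stack([w.mean(axis=1), w.std(axis=1)], axis=1)

thetas = rng.uniform(0, np.pi, size=2)                   # variational parameters
view_a = featurize(windows + 0.05 * rng.normal(size=windows.shape))
view_b = featurize(windows + 0.05 * rng.normal(size=windows.shape))
za = np.array([circuit_embedding(f, thetas) for f in view_a])
zb = np.array([circuit_embedding(f, thetas) for f in view_b])
print("contrastive loss:", nt_xent(za, zb))
```

In a full framework the rotation parameters would be optimized (e.g. by the parameter-shift rule or a classical optimizer) to minimize this loss, so that the circuit maps two views of the same window to nearby embeddings; here they are left random purely to show the data flow.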