Self-supervised transfer learning of physiological representations from free-living wearable data

Dimitris Spathis, I. Perez-Pozuelo, S. Brage, N. Wareham, C. Mascolo
{"title":"Self-supervised transfer learning of physiological representations from free-living wearable data","authors":"Dimitris Spathis, I. Perez-Pozuelo, S. Brage, N. Wareham, C. Mascolo","doi":"10.1145/3450439.3451863","DOIUrl":null,"url":null,"abstract":"Wearable devices such as smartwatches are becoming increasingly popular tools for objectively monitoring physical activity in free-living conditions. To date, research has primarily focused on the purely supervised task of human activity recognition, demonstrating limited success in inferring high-level health outcomes from low-level signals. Here, we present a novel self-supervised representation learning method using activity and heart rate (HR) signals without semantic labels. With a deep neural network, we set HR responses as the supervisory signal for the activity data, leveraging their underlying physiological relationship. In addition, we propose a custom quantile loss function that accounts for the long-tailed HR distribution present in the general population. We evaluate our model in the largest free-living combined-sensing dataset (comprising >280k hours of wrist accelerometer & wearable ECG data). Our contributions are two-fold: i) the pre-training task creates a model that can accurately forecast HR based only on cheap activity sensors, and ii) we leverage the information captured through this task by proposing a simple method to aggregate the learnt latent representations (embeddings) from the window-level to user-level. Notably, we show that the embeddings can generalize in various downstream tasks through transfer learning with linear classifiers, capturing physiologically meaningful, personalized information. For instance, they can be used to predict variables associated with individuals' health, fitness and demographic characteristics (AUC >70), outperforming unsupervised autoencoders and common bio-markers. Overall, we propose the first multimodal self-supervised method for behavioral and physiological data with implications for large-scale health and lifestyle monitoring. Code: https://github.com/sdimi/Step2heart.","PeriodicalId":87342,"journal":{"name":"Proceedings of the ACM Conference on Health, Inference, and Learning","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"26","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Conference on Health, Inference, and Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3450439.3451863","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 26

Abstract

Wearable devices such as smartwatches are becoming increasingly popular tools for objectively monitoring physical activity in free-living conditions. To date, research has primarily focused on the purely supervised task of human activity recognition, demonstrating limited success in inferring high-level health outcomes from low-level signals. Here, we present a novel self-supervised representation learning method using activity and heart rate (HR) signals without semantic labels. With a deep neural network, we set HR responses as the supervisory signal for the activity data, leveraging their underlying physiological relationship. In addition, we propose a custom quantile loss function that accounts for the long-tailed HR distribution present in the general population. We evaluate our model on the largest free-living combined-sensing dataset (comprising >280k hours of wrist accelerometer and wearable ECG data). Our contributions are two-fold: i) the pre-training task creates a model that can accurately forecast HR based only on cheap activity sensors, and ii) we leverage the information captured through this task by proposing a simple method to aggregate the learnt latent representations (embeddings) from the window level to the user level. Notably, we show that the embeddings can generalize to various downstream tasks through transfer learning with linear classifiers, capturing physiologically meaningful, personalized information. For instance, they can be used to predict variables associated with individuals' health, fitness and demographic characteristics (AUC >70), outperforming unsupervised autoencoders and common biomarkers. Overall, we propose the first multimodal self-supervised method for behavioral and physiological data with implications for large-scale health and lifestyle monitoring. Code: https://github.com/sdimi/Step2heart.
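The pre-training objective regresses HR from activity windows with a quantile (pinball) loss, which penalizes under- and over-prediction asymmetrically and therefore copes better with the long-tailed HR distribution than a symmetric loss. Below is a minimal sketch of such a loss in TensorFlow/Keras; the specific quantile value and the mean reduction are illustrative assumptions, not the paper's exact configuration.

```python
import tensorflow as tf

def quantile_loss(tau: float):
    """Pinball loss for a single quantile tau in (0, 1).

    Under-prediction (y_true > y_pred) is weighted by tau and
    over-prediction by (1 - tau), so a tau above 0.5 pushes the model
    to track the long upper tail of the HR distribution.
    """
    def loss(y_true, y_pred):
        error = y_true - y_pred
        return tf.reduce_mean(tf.maximum(tau * error, (tau - 1.0) * error))
    return loss

# Illustrative usage: compile an HR-forecasting model on the 0.9 quantile.
# model.compile(optimizer="adam", loss=quantile_loss(0.9))
```

The learnt window-level embeddings are then aggregated into a single vector per user and probed with linear classifiers on downstream targets such as fitness or demographic variables. The sketch below assumes simple mean pooling and a scikit-learn logistic-regression probe; the paper's actual aggregation statistics and classifier settings may differ, and the variable names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def user_level_embeddings(window_emb: np.ndarray, user_ids: np.ndarray):
    """Mean-pool window-level embeddings (n_windows, d) into one vector
    per user; returns the user ids and an (n_users, d) matrix."""
    users = np.unique(user_ids)
    pooled = np.stack([window_emb[user_ids == u].mean(axis=0) for u in users])
    return users, pooled

# Linear probe on the frozen user-level embeddings for a downstream
# label y_user (e.g. a binarized fitness variable):
# users, X = user_level_embeddings(window_emb, user_ids)
# clf = LogisticRegression(max_iter=1000).fit(X, y_user)
```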