Title: Cross-Sensor Transferability of a Deep Residual U-Net for Sleep Staging Using Temporal Low-Frequency Photoplethysmography
Authors: Neil Joshua Limbaga, Haozheng He, José Ilton de Oliveira Filho, Khaled Nabil Salama
Journal: IEEE Sensors Letters, vol. 9, no. 8, pp. 1-4
DOI: 10.1109/LSENS.2025.3589877
Published: 2025-07-16
URL: https://ieeexplore.ieee.org/document/11081425/
Abstract
Wearable sensors are increasingly used for sleep monitoring, but accurate sleep staging often depends on expensive, high-fidelity devices and hand-crafted features. This work explores whether deep learning models trained on raw, temporal photoplethysmography (PPG) signals can generalize not only across subjects but also across different sensors. In the first phase of the study, a residual U-Net architecture was trained on a large-scale sleep dataset to classify sleep stages (light, deep, and rapid eye movement (REM)). A two-stage hyperparameter sweep yielded a best test F1 score of 0.805. The second phase introduced a cross-sensor transfer learning paradigm using proprietary raw PPG data labeled via a commercial wearable. Transfer learning was performed across four PPG channels, namely Green, Green2, Red, and IR, yielding F1 scores of 0.901, 0.877, 0.892, and 0.840, respectively. These results demonstrate the model's capacity to adapt across distinct PPG configurations, supporting scalable and sensor-agnostic sleep staging.
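The abstract does not include implementation details, so as an illustration only, the sketch below shows what a 1-D residual U-Net for three-class sleep staging over a raw PPG sequence might look like in PyTorch. All layer counts, kernel sizes, and names (ResidualBlock1d, SleepUNet1d) are assumptions, not the authors' published configuration.

```python
# Hypothetical sketch of a 1-D residual U-Net for sleep staging from raw PPG.
# Channel widths, kernel sizes, and depth are illustrative assumptions only.
import torch
import torch.nn as nn


class ResidualBlock1d(nn.Module):
    """Two 1-D convolutions with a skip connection (post-activation residual)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv1 = nn.Conv1d(in_ch, out_ch, kernel_size=7, padding=3)
        self.bn1 = nn.BatchNorm1d(out_ch)
        self.conv2 = nn.Conv1d(out_ch, out_ch, kernel_size=7, padding=3)
        self.bn2 = nn.BatchNorm1d(out_ch)
        self.skip = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.act = nn.ReLU()

    def forward(self, x):
        h = self.act(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return self.act(h + self.skip(x))


class SleepUNet1d(nn.Module):
    """Encoder-decoder over a raw PPG sequence, one class logit per time step."""

    def __init__(self, n_classes: int = 3):  # light, deep, REM
        super().__init__()
        self.enc1 = ResidualBlock1d(1, 16)
        self.enc2 = ResidualBlock1d(16, 32)
        self.bottleneck = ResidualBlock1d(32, 64)
        self.pool = nn.MaxPool1d(2)
        self.up2 = nn.ConvTranspose1d(64, 32, kernel_size=2, stride=2)
        self.dec2 = ResidualBlock1d(64, 32)   # 32 up-sampled + 32 skip channels
        self.up1 = nn.ConvTranspose1d(32, 16, kernel_size=2, stride=2)
        self.dec1 = ResidualBlock1d(32, 16)   # 16 up-sampled + 16 skip channels
        self.head = nn.Conv1d(16, n_classes, kernel_size=1)

    def forward(self, x):                      # x: (batch, 1, time)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                   # (batch, n_classes, time)


if __name__ == "__main__":
    # Time length must be divisible by 4 for the two pool/up-sample stages.
    model = SleepUNet1d(n_classes=3)
    logits = model(torch.randn(2, 1, 512))
    print(logits.shape)  # torch.Size([2, 3, 512])
```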
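The cross-sensor transfer step is likewise unspecified in the abstract. A common recipe, shown here purely as an assumption, is to load weights pretrained on the large-scale dataset, freeze early encoder layers, and fine-tune the rest on data from one target PPG channel (e.g., Green) at a reduced learning rate; the checkpoint path, freezing choice, and learning rate below are hypothetical.

```python
# Hypothetical cross-sensor fine-tuning on one target PPG channel.
import torch

model = SleepUNet1d(n_classes=3)
model.load_state_dict(torch.load("pretrained_ppg_unet.pt"))  # hypothetical path

# Freeze the first encoder block so only later layers adapt to the new sensor.
for p in model.enc1.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = torch.nn.CrossEntropyLoss()


def fine_tune_step(ppg, labels):
    """One update on target-sensor data: ppg (B, 1, T), labels (B, T) in {0,1,2}."""
    optimizer.zero_grad()
    logits = model(ppg)               # (B, 3, T)
    loss = criterion(logits, labels)  # CrossEntropyLoss handles (B, C, T) vs (B, T)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Repeating this per channel (Green, Green2, Red, IR) would mirror the paper's evaluation setup, though the actual training protocol is not given in the abstract.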