Multiclass Arrhythmia Classification using Smartwatch Photoplethysmography Signals Collected in Real-life Settings

Dong Han, Jihye Moon, Luís Roberto Mercado Díaz, Darren Chen, Devan Williams, Eric Y. Ding, Khanh-Van Tran, David D. McManus, Ki H. Chon

arXiv - EE - Signal Processing · Published 2024-09-10 · https://doi.org/arxiv-2409.06147
Abstract
Most deep learning models for multiclass arrhythmia classification are tested on fingertip photoplethysmographic (PPG) data, which has a higher signal-to-noise ratio than smartwatch-derived PPG, and the best reported sensitivity for premature atrial/ventricular contraction (PAC/PVC) detection is only 75%. To improve PAC/PVC detection sensitivity while maintaining high atrial fibrillation (AF) detection, we use multimodal data incorporating 1D PPG, accelerometer, and heart rate signals as inputs to a computationally efficient 1D bi-directional Gated Recurrent Unit (1D-Bi-GRU) model that detects three arrhythmia classes. We used motion-artifact-prone smartwatch PPG data from the NIH-funded Pulsewatch clinical trial. Tested on 72 subjects, our multimodal model achieved an unprecedented 83% sensitivity for PAC/PVC detection while maintaining a high accuracy of 97.31% for AF detection. These results outperformed the best state-of-the-art model by 20.81% for PAC/PVC and 2.55% for AF detection, even though our model is computationally more efficient (14 times lighter and 2.7 times faster).
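To make the architecture described above concrete, the following is a minimal NumPy sketch of a bidirectional GRU classifier over a multimodal sequence. This is an illustrative toy with random weights, not the authors' trained Pulsewatch model: the per-timestep feature vector [PPG sample, accelerometer magnitude, heart rate], the hidden size, and the fusion scheme are all assumptions for demonstration; only the overall shape (1D Bi-GRU over fused inputs, three output classes) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell, forward pass only (random untrained weights)."""
    def __init__(self, in_dim, hid_dim):
        s = 1.0 / np.sqrt(hid_dim)
        # Stacked weights for the update gate z, reset gate r, and candidate state.
        self.W = rng.uniform(-s, s, (3, in_dim, hid_dim))   # input-to-hidden
        self.U = rng.uniform(-s, s, (3, hid_dim, hid_dim))  # hidden-to-hidden
        self.b = np.zeros((3, hid_dim))

    def step(self, x, h):
        z = sigmoid(x @ self.W[0] + h @ self.U[0] + self.b[0])          # update gate
        r = sigmoid(x @ self.W[1] + h @ self.U[1] + self.b[1])          # reset gate
        h_tilde = np.tanh(x @ self.W[2] + (r * h) @ self.U[2] + self.b[2])
        return (1 - z) * h + z * h_tilde

def bi_gru_classify(seq, fwd, bwd, W_out, b_out):
    """Run the sequence forward and backward, concatenate the two final
    hidden states, and apply a softmax linear head over 3 classes."""
    hf = np.zeros(fwd.b.shape[1])
    hb = np.zeros(bwd.b.shape[1])
    for t in range(len(seq)):
        hf = fwd.step(seq[t], hf)                  # forward direction
        hb = bwd.step(seq[len(seq) - 1 - t], hb)   # backward direction
    feats = np.concatenate([hf, hb])
    logits = feats @ W_out + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Hypothetical fused input: one row per timestep of
# [PPG sample, accelerometer magnitude, heart rate] — an assumed fusion scheme.
T, IN, HID, CLASSES = 50, 3, 8, 3   # 3 classes: e.g. NSR, AF, PAC/PVC
seq = rng.standard_normal((T, IN))
fwd, bwd = GRUCell(IN, HID), GRUCell(IN, HID)
W_out = rng.standard_normal((2 * HID, CLASSES)) * 0.1
b_out = np.zeros(CLASSES)
probs = bi_gru_classify(seq, fwd, bwd, W_out, b_out)
print(probs)   # a probability distribution over the three rhythm classes
```

A bidirectional GRU is a natural fit here because a premature beat is easiest to recognize relative to the beats both before and after it, which the backward pass makes available at every timestep while remaining far cheaper than attention-based models.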