Authors: Zakaria Khatar, Dounia Bentaleb, Noreddine Abghour, Khalid Moussaid
Journal: Scientific African, Vol. 28, Article e02687 (published 2025-04-12)
DOI: 10.1016/j.sciaf.2025.e02687
GAF-GradCAM: Guided dynamic weighted fusion of temporal and frequency GAF 2D matrices for ECG-based arrhythmia detection using deep learning
This study introduces an approach to arrhythmia classification that employs a Grad-CAM-guided dynamic weighted fusion of temporal and frequency features extracted from electrocardiogram (ECG) signals. ECG signals are transformed into two-dimensional Gramian Angular Field (GAF) matrices: temporal dynamics are captured via Gramian Angular Summation Fields (GASF), while frequency dependencies are derived from features extracted with the Continuous Wavelet Transform (CWT) and refined through Principal Component Analysis (PCA). The Grad-CAM-guided dynamic fusion adaptively weights these complementary feature types according to their relevance for each input, improving both classification accuracy and interpretability. Optimizing the fusion process fine-tunes the balance between temporal and frequency information, focusing the model on the most critical ECG features. As a result, training accuracy reached 99.68% and validation accuracy 98.78%, alongside a substantial reduction in loss, underscoring the efficacy of Grad-CAM-guided fusion in integrating essential ECG features and improving arrhythmia detection. Building on this fusion framework, the study further proposes a Hybrid Parallel-Residual Architecture tailored to arrhythmia detection, integrating parallel and residual connections with Bidirectional Long Short-Term Memory (Bi-LSTM). This architecture ensures robust feature extraction and precise classification, achieving up to 98.75% accuracy, 99.14% sensitivity, and a 98.97% F1 score across multiple ECG leads, surpassing traditional methods.
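The two core ideas the abstract names — encoding a 1D ECG beat as a GASF image and combining temporal and frequency feature maps with input-dependent weights — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the placeholder frequency map, and the softmax over relevance scores are assumptions; the paper derives those relevance scores from Grad-CAM rather than passing them in directly.

```python
import numpy as np

def gasf(signal):
    """Gramian Angular Summation Field of a 1D signal.
    GASF[i, j] = cos(phi_i + phi_j), where phi = arccos of the
    signal min-max rescaled to [-1, 1]."""
    x = np.asarray(signal, dtype=float)
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0  # rescale to [-1, 1]
    phi = np.arccos(np.clip(x, -1.0, 1.0))               # polar-angle encoding
    return np.cos(phi[:, None] + phi[None, :])           # pairwise angle sums

def weighted_fusion(temporal, frequency, relevance):
    """Dynamic weighted fusion: softmax over per-branch relevance scores.
    (Illustrative stand-in; the paper obtains these scores via Grad-CAM.)"""
    w = np.exp(relevance) / np.exp(relevance).sum()
    return w[0] * temporal + w[1] * frequency

beat = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))  # stand-in for one ECG beat
t_map = gasf(beat)                                # temporal GASF image
f_map = np.full_like(t_map, 0.5)                  # placeholder frequency map
fused = weighted_fusion(t_map, f_map, np.array([1.2, 0.4]))
print(fused.shape)  # (64, 64)
```

Because the GASF is built from pairwise angle sums, the resulting matrix is symmetric and bounded in [-1, 1], which makes it suitable as a 2D input to a convolutional or hybrid recurrent network.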