GAF-GradCAM: Guided dynamic weighted fusion of temporal and frequency GAF 2D matrices for ECG-based arrhythmia detection using deep learning

Scientific African · Impact Factor 2.7 · Q2 (Multidisciplinary Sciences)
Zakaria Khatar, Dounia Bentaleb, Noreddine Abghour, Khalid Moussaid
Journal: Scientific African, Volume 28, Article e02687
DOI: 10.1016/j.sciaf.2025.e02687
Published: 2025-04-12
URL: https://www.sciencedirect.com/science/article/pii/S2468227625001577
Citations: 0

Abstract

This study introduces an innovative approach for arrhythmia classification that employs a Grad-CAM-guided dynamic weighted fusion of temporal and frequency features extracted from electrocardiogram (ECG) signals. By transforming ECG signals into two-dimensional Gramian Angular Field (GAF) matrices, the proposed method effectively captures temporal dynamics via Gramian Angular Summation Fields (GASF) and frequency dependencies from features extracted using Continuous Wavelet Transform (CWT) and refined through Principal Component Analysis (PCA). The Grad-CAM-guided dynamic fusion adaptively assigns importance to these complementary feature types based on their relevance for each input, enhancing both classification accuracy and interpretability. Optimizing this fusion process fine-tunes the balance between temporal and frequency information, thus focusing the model on the most critical ECG features. As a result, training accuracy reached 99.68% and validation accuracy 98.78%, alongside a substantial reduction in loss, underscoring the efficacy of Grad-CAM-guided fusion in integrating essential ECG features and advancing arrhythmia detection accuracy. Building on this fusion framework, this study further proposes a Hybrid Parallel-Residual Architecture specifically tailored for arrhythmia detection, integrating parallel and residual connections with Bidirectional Long Short-Term Memory (Bi-LSTM). This architecture ensures robust feature extraction and precise classification, achieving up to 98.75% accuracy, 99.14% sensitivity, and a 98.97% F1 score across multiple ECG leads, thereby surpassing traditional methods.
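The Gramian Angular Summation Field mentioned above maps a 1D signal into a 2D matrix by rescaling each sample to [-1, 1], taking its arccosine as a polar angle, and forming the pairwise cosine of angle sums. A minimal sketch of that transform (the function name and the sine stand-in for an ECG beat are illustrative, not from the paper):

```python
import numpy as np

def gasf(x):
    # Min-max rescale the segment to [-1, 1], the domain of arccos
    x = np.asarray(x, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1.0, 1.0))  # polar-coordinate angles
    # GASF[i, j] = cos(phi_i + phi_j): pairwise temporal correlations
    return np.cos(phi[:, None] + phi[None, :])

segment = np.sin(np.linspace(0, 2 * np.pi, 64))  # stand-in for one ECG beat
G = gasf(segment)  # 64 x 64 symmetric matrix, values in [-1, 1]
```

The resulting matrix is symmetric by construction, which is why it preserves temporal dependencies between every pair of time steps rather than only local ones.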
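The dynamic weighted fusion can be pictured as a convex combination of the temporal and frequency channels, with per-input weights derived from a relevance signal such as mean Grad-CAM activation per branch. The sketch below is an assumption about the general mechanism, not the paper's exact implementation; `relevance`, `fuse`, and the softmax weighting are hypothetical stand-ins:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # numerically stable
    return e / e.sum()

def fuse(temporal_map, frequency_map, relevance):
    # relevance: two importance scores, e.g. mean Grad-CAM activation
    # over each branch (hypothetical guidance signal)
    w = softmax(np.asarray(relevance, dtype=float))
    return w[0] * temporal_map + w[1] * frequency_map

T = np.random.rand(64, 64)  # GASF (temporal) channel
F = np.random.rand(64, 64)  # CWT-derived (frequency) channel
fused = fuse(T, F, relevance=[0.8, 0.3])
```

Because the weights come from a softmax, they always sum to one, so the fusion adaptively shifts emphasis between the two channels without changing the overall feature scale.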
Source Journal

Scientific African (Multidisciplinary)
CiteScore: 5.60
Self-citation rate: 3.40%
Annual publications: 332
Review time: 10 weeks