{"title":"Attention-Based Dual-Knowledge Distillation for Alzheimer’s Disease Stage Detection Using MRI Scans","authors":"Chandita Barman;Sudhanshu Singh;Manob Jyoti Saikia;Shovan Barma","doi":"10.1109/OJIM.2025.3589698","DOIUrl":null,"url":null,"abstract":"This study presents an efficient attention-guided dual-knowledge distillation (D-KD) framework for classifying Alzheimer’s disease (AD) stages using magnetic resonance imaging (MRI) scans based on detection of the subtle anatomical differences. Current challenges involve identifying precise discriminating features in low computational complexity without compromising classification accuracy. In this work, a dual-teacher model consisting of vision transformer (ViT) and swin transformer (ST) for capturing global and local features, respectively, is utilized to distill comprehensive knowledge into a lightweight ViT-based student model, ensuring accurate classification efficacy with reduced computational demands. For validation of the proposed idea, two well-known benchmark MRI datasets, Alzheimer’s Disease Neuroimaging Initiative (ADNI) and AIBL, have been considered for multiclass classification, using an online-training knowledge distillation approach, where teacher and student networks are trained concurrently. The proposed model has achieved accuracies (Ac) up to 98.24% and 97.07% on ADNI and AIBL, respectively, with a significant performance improvement of 15.6% with respect to existing works. The analysis shows that by leveraging the complementary strengths of ViT and ST, the D-KD strategy enhances generalization in data-limited scenarios and provides a reliable, resource-efficient solution for MRI-based AD diagnosis.","PeriodicalId":100630,"journal":{"name":"IEEE Open Journal of Instrumentation and Measurement","volume":"4 ","pages":"1-10"},"PeriodicalIF":1.5000,"publicationDate":"2025-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11082332","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of Instrumentation and Measurement","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11082332/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This study presents an efficient attention-guided dual-knowledge distillation (D-KD) framework for classifying Alzheimer’s disease (AD) stages from magnetic resonance imaging (MRI) scans by detecting subtle anatomical differences. The central challenge is to identify precise discriminative features at low computational complexity without compromising classification accuracy. In this work, a dual-teacher model consisting of a vision transformer (ViT) and a Swin Transformer (ST), which capture global and local features, respectively, distills comprehensive knowledge into a lightweight ViT-based student model, preserving classification accuracy while reducing computational demands. To validate the proposed idea, two well-known benchmark MRI datasets, the Alzheimer’s Disease Neuroimaging Initiative (ADNI) and AIBL, are used for multiclass classification with an online knowledge distillation scheme in which the teacher and student networks are trained concurrently. The proposed model achieves accuracies (Ac) of up to 98.24% and 97.07% on ADNI and AIBL, respectively, a performance improvement of 15.6% over existing works. The analysis shows that by leveraging the complementary strengths of the ViT and ST, the D-KD strategy enhances generalization in data-limited scenarios and provides a reliable, resource-efficient solution for MRI-based AD diagnosis.
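As a rough illustration of how such a dual-teacher, online distillation objective could be set up, the sketch below combines a cross-entropy term with KL-divergence terms against a ViT teacher and a Swin Transformer teacher while all three networks are updated in the same training loop. The timm backbone names, temperature T, and weights alpha/beta are assumptions for illustration only, not the authors’ reported configuration, and the paper’s attention-guided distillation component is not reproduced here.

```python
# Minimal sketch of a dual-teacher knowledge distillation loss in PyTorch.
# Backbones, hyperparameters, and class count are illustrative assumptions.
import torch
import torch.nn.functional as F
import timm  # assumed source of off-the-shelf ViT / Swin backbones

NUM_CLASSES = 3  # e.g., CN / MCI / AD; the paper's exact class set may differ

# Two teachers: ViT (global features) and Swin Transformer (local features),
# plus a lightweight ViT-based student.
teacher_vit = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=NUM_CLASSES)
teacher_swin = timm.create_model("swin_tiny_patch4_window7_224", pretrained=True, num_classes=NUM_CLASSES)
student = timm.create_model("vit_tiny_patch16_224", pretrained=False, num_classes=NUM_CLASSES)

def dual_kd_loss(student_logits, vit_logits, swin_logits, labels,
                 T=2.0, alpha=0.5, beta=0.5):
    """Cross-entropy on ground truth plus softened KL terms against both teachers."""
    ce = F.cross_entropy(student_logits, labels)
    kd_vit = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                      F.softmax(vit_logits / T, dim=1),
                      reduction="batchmean") * (T * T)
    kd_swin = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                       F.softmax(swin_logits / T, dim=1),
                       reduction="batchmean") * (T * T)
    return ce + alpha * kd_vit + beta * kd_swin

# Online distillation: teachers and student share one optimizer and training loop.
optimizer = torch.optim.AdamW(
    list(student.parameters()) + list(teacher_vit.parameters()) + list(teacher_swin.parameters()),
    lr=1e-4)

def train_step(images, labels):
    optimizer.zero_grad()
    vit_logits, swin_logits = teacher_vit(images), teacher_swin(images)
    student_logits = student(images)
    # Teachers also learn from ground truth in the online setting;
    # their logits are detached for the student's distillation terms.
    teacher_ce = F.cross_entropy(vit_logits, labels) + F.cross_entropy(swin_logits, labels)
    loss = dual_kd_loss(student_logits, vit_logits.detach(), swin_logits.detach(), labels) + teacher_ce
    loss.backward()
    optimizer.step()
    return loss.item()
```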