{"title":"MDSFD-Net: Alzheimer’s disease diagnosis with missing modality via disentanglement learning and feature distillation","authors":"Nana Jia , Zhiao Zhang , Tong Jia","doi":"10.1016/j.neunet.2025.108128","DOIUrl":null,"url":null,"abstract":"<div><div>Multi-modal analysis can provide complementary information and significantly aid in the early diagnosis and intervention of Alzheimer’s Disease (AD). However, the issue of missing modalities presents a major challenge, as most methods that rely on complete multi-modal data become infeasible. The most advanced approaches to addressing missing modalities typically use generative models, but these often neglect the importance of modality-specific features, leading to biased predictions and poor performance. Inspired by this limitation, we propose a Modality Disentanglement and Specific Features Distillation Network (MDSFD-Net) for AD diagnosis with missing modality, which consists of a disentanglement-based imputation module (DI module) and a specific features distillation module (SFD module). In the DI module, we introduce a novel spatial-channel modality disentanglement learning scheme that is first used to disentangle modality-specific features, along with a shared constrain objective to learn modality-shared features, which are used for imputing missing modality features. To address the specific features of the missing modality, the SFD module is designed to transfer the specific features from complete modality in the teacher network to the incomplete modality in the student network. A regularized knowledge distillation (R-KD) mechanism is incorporated to mitigate the impact of incorrect predictions from the teacher network. By leveraging modality-shared features imputation and modality-specific features distillation, our model can effectively learn sufficient information for classification even if some modalities are missing. Extensive experiments on ADNI dataset demonstrate the superiority of our proposed MDSFD-Net over state-of-the-art methods in missing modality situations.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"194 ","pages":"Article 108128"},"PeriodicalIF":6.3000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025010081","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Multi-modal analysis can provide complementary information and significantly aid the early diagnosis and intervention of Alzheimer’s Disease (AD). However, missing modalities pose a major challenge, since most methods that rely on complete multi-modal data become infeasible. The most advanced approaches to handling missing modalities typically rely on generative models, but these often neglect modality-specific features, leading to biased predictions and poor performance. Motivated by this limitation, we propose a Modality Disentanglement and Specific Features Distillation Network (MDSFD-Net) for AD diagnosis with missing modality, which consists of a disentanglement-based imputation module (DI module) and a specific features distillation module (SFD module). In the DI module, we introduce a novel spatial-channel modality disentanglement learning scheme, which first disentangles modality-specific features, together with a shared-constraint objective that learns modality-shared features; these shared features are then used to impute the features of the missing modality. To recover the specific features of the missing modality, the SFD module transfers specific features from the complete modality in the teacher network to the incomplete modality in the student network. A regularized knowledge distillation (R-KD) mechanism is incorporated to mitigate the impact of incorrect predictions from the teacher network. By combining modality-shared feature imputation with modality-specific feature distillation, our model can learn sufficient information for classification even when some modalities are missing. Extensive experiments on the ADNI dataset demonstrate the superiority of the proposed MDSFD-Net over state-of-the-art methods in missing-modality settings.
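The abstract describes two mechanisms: disentangling each modality into shared and specific features (with a shared-constraint objective so the available modality's shared features can impute the missing one), and distilling specific features from a complete-modality teacher into an incomplete-modality student while down-weighting incorrect teacher predictions. The sketch below is a minimal, hypothetical PyTorch illustration of these two ideas only; the module names, dimensions, loss weights, and the particular form of the R-KD regularization are assumptions, not the authors' implementation.

```python
# Hypothetical sketch, not the paper's code: shared/specific disentanglement with a
# shared-constraint loss, plus knowledge distillation that masks out samples the
# teacher misclassifies (one simple reading of "regularized" KD).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DisentangleEncoder(nn.Module):
    """Split one modality's features into shared and specific parts."""
    def __init__(self, in_dim=256, feat_dim=128):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.specific = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())

    def forward(self, x):
        return self.shared(x), self.specific(x)

def shared_constraint_loss(shared_a, shared_b):
    # Encourage the shared features of two modalities to agree, so the shared part
    # of the available modality can stand in for the missing one at test time.
    return F.mse_loss(shared_a, shared_b)

def regularized_kd_loss(student_logits, teacher_logits, labels, T=2.0):
    # Distill teacher predictions, but zero the loss on samples the teacher gets
    # wrong, so incorrect teacher predictions do not mislead the student.
    correct = (teacher_logits.argmax(dim=1) == labels).float()          # [B]
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="none").sum(dim=1) * (T * T)                # [B]
    return (kd * correct).mean()

# Toy usage: two modalities (e.g., MRI and PET features), batch of 4
mri, pet = torch.randn(4, 256), torch.randn(4, 256)
labels = torch.randint(0, 2, (4,))
enc_mri, enc_pet = DisentangleEncoder(), DisentangleEncoder()
mri_sh, _mri_sp = enc_mri(mri)
pet_sh, _pet_sp = enc_pet(pet)
loss_shared = shared_constraint_loss(mri_sh, pet_sh)

# Stand-ins for the teacher (complete modalities) and student (incomplete) heads
teacher_logits = torch.randn(4, 2)
student_logits = torch.randn(4, 2)
loss_kd = regularized_kd_loss(student_logits, teacher_logits, labels)
print(loss_shared.item(), loss_kd.item())
```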
About the journal
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.