Hybrid Multimodality Fusion with Cross-Domain Knowledge Transfer to Forecast Progression Trajectories in Cognitive Decline.

Minhui Yu, Yunbi Liu, Jinjian Wu, Andrea Bozoki, Shijun Qiu, Ling Yue, Mingxia Liu
{"title":"Hybrid Multimodality Fusion with Cross-Domain Knowledge Transfer to Forecast Progression Trajectories in Cognitive Decline.","authors":"Minhui Yu, Yunbi Liu, Jinjian Wu, Andrea Bozoki, Shijun Qiu, Ling Yue, Mingxia Liu","doi":"10.1007/978-3-031-47425-5_24","DOIUrl":null,"url":null,"abstract":"<p><p>Magnetic resonance imaging (MRI) and positron emission tomography (PET) are increasingly used to forecast progression trajectories of cognitive decline caused by preclinical and prodromal Alzheimer's disease (AD). Many existing studies have explored the potential of these two distinct modalities with diverse machine and deep learning approaches. But successfully fusing MRI and PET can be complex due to their unique characteristics and missing modalities. To this end, we develop a hybrid multimodality fusion (HMF) framework with cross-domain knowledge transfer for joint MRI and PET representation learning, feature fusion, and cognitive decline progression forecasting. Our HMF consists of three modules: 1) a module to impute missing PET images, 2) a module to extract multimodality features from MRI and PET images, and 3) a module to fuse the extracted multimodality features. To address the issue of small sample sizes, we employ a cross-domain knowledge transfer strategy from the ADNI dataset, which includes 795 subjects, to independent small-scale AD-related cohorts, in order to leverage the rich knowledge present within the ADNI. The proposed HMF is extensively evaluated in three AD-related studies with 272 subjects across multiple disease stages, such as subjective cognitive decline and mild cognitive impairment. Experimental results demonstrate the superiority of our method over several state-of-the-art approaches in forecasting progression trajectories of AD-related cognitive decline.</p>","PeriodicalId":94280,"journal":{"name":"Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention","volume":"14394 ","pages":"265-275"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10904401/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/978-3-031-47425-5_24","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/2/3 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Magnetic resonance imaging (MRI) and positron emission tomography (PET) are increasingly used to forecast progression trajectories of cognitive decline caused by preclinical and prodromal Alzheimer's disease (AD). Many existing studies have explored the potential of these two distinct modalities with diverse machine learning and deep learning approaches. However, successfully fusing MRI and PET remains challenging due to their distinct characteristics and the frequent absence of one modality. To this end, we develop a hybrid multimodality fusion (HMF) framework with cross-domain knowledge transfer for joint MRI and PET representation learning, feature fusion, and cognitive decline progression forecasting. Our HMF consists of three modules: 1) a module to impute missing PET images, 2) a module to extract multimodality features from MRI and PET images, and 3) a module to fuse the extracted multimodality features. To address the issue of small sample sizes, we employ a cross-domain knowledge transfer strategy from the ADNI dataset, which includes 795 subjects, to independent small-scale AD-related cohorts, leveraging the rich knowledge within ADNI. The proposed HMF is extensively evaluated in three AD-related studies with 272 subjects across multiple disease stages, such as subjective cognitive decline and mild cognitive impairment. Experimental results demonstrate the superiority of our method over several state-of-the-art approaches in forecasting progression trajectories of AD-related cognitive decline.
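The abstract describes a three-module pipeline (PET imputation, per-modality feature extraction, feature fusion) but gives no implementation details. The PyTorch sketch below is only an illustration of how such a pipeline could be wired together; the class names, layer sizes, and concatenation-based fusion are assumptions made for this sketch and are not the authors' architecture.

```python
# Minimal illustrative sketch of a three-module MRI/PET fusion pipeline.
# All module names, layer sizes, and the fusion scheme are assumptions for
# illustration only; they do not reproduce the HMF implementation.
from typing import Optional

import torch
import torch.nn as nn


class PETImputer(nn.Module):
    """Module 1 (assumed form): synthesize a PET-like volume from an MRI volume."""

    def __init__(self, channels: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, mri: torch.Tensor) -> torch.Tensor:
        return self.net(mri)


class ModalityEncoder(nn.Module):
    """Module 2 (assumed form): extract a feature vector from one 3D modality."""

    def __init__(self, channels: int = 8, feat_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),
        )
        self.fc = nn.Linear(channels, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.conv(x).flatten(1))


class HMFSketch(nn.Module):
    """Modules 1-3 chained: impute missing PET, encode both modalities, fuse the
    features, and predict a progression label (e.g., decliner vs. stable)."""

    def __init__(self, feat_dim: int = 32, num_classes: int = 2):
        super().__init__()
        self.imputer = PETImputer()
        self.mri_encoder = ModalityEncoder(feat_dim=feat_dim)
        self.pet_encoder = ModalityEncoder(feat_dim=feat_dim)
        # Simple concatenation-based fusion head (assumption for this sketch).
        self.fusion = nn.Sequential(
            nn.Linear(2 * feat_dim, feat_dim),
            nn.ReLU(inplace=True),
            nn.Linear(feat_dim, num_classes),
        )

    def forward(self, mri: torch.Tensor, pet: Optional[torch.Tensor] = None) -> torch.Tensor:
        if pet is None:  # missing-PET case: impute a PET-like volume from MRI
            pet = self.imputer(mri)
        f_mri = self.mri_encoder(mri)
        f_pet = self.pet_encoder(pet)
        return self.fusion(torch.cat([f_mri, f_pet], dim=1))


if __name__ == "__main__":
    model = HMFSketch()
    mri = torch.randn(2, 1, 32, 32, 32)  # toy 3D volumes, batch of 2
    logits = model(mri, pet=None)        # PET missing, so it is imputed
    print(logits.shape)                  # torch.Size([2, 2])
```

Under the same assumptions, the cross-domain knowledge transfer described in the abstract would amount to pretraining such a model on the larger ADNI cohort and then fine-tuning it (for example, with a reduced learning rate or partially frozen encoders) on each small independent cohort.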
