Multimodal feature fusion-based graph convolutional networks for Alzheimer's disease stage classification using F-18 florbetaben brain PET images and clinical indicators.

IF 2.9 · CAS Tier 3 (Multidisciplinary) · JCR Q1 (Multidisciplinary Sciences)
PLoS ONE · Pub Date: 2024-12-23 · eCollection Date: 2024-01-01 · DOI: 10.1371/journal.pone.0315809
Gyu-Bin Lee, Young-Jin Jeong, Do-Young Kang, Hyun-Jin Yun, Min Yoon
PLoS ONE 19(12): e0315809
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11666044/pdf/
Citations: 0

Abstract

Alzheimer's disease (AD), the most prevalent degenerative brain disease associated with dementia, requires early diagnosis to alleviate worsening of symptoms through appropriate management and treatment. Recent studies on AD stage classification are increasingly using multimodal data. However, few studies have applied graph neural networks to multimodal data comprising F-18 florbetaben (FBB) amyloid brain positron emission tomography (PET) images and clinical indicators. The objective of this study was to demonstrate the effectiveness of a graph convolutional network (GCN) for AD stage classification using multimodal data, specifically FBB PET images and clinical indicators, collected from Dong-A University Hospital (DAUH) and the Alzheimer's Disease Neuroimaging Initiative (ADNI). The effectiveness of the GCN was demonstrated through comparisons with a support vector machine, random forest, and multilayer perceptron across four classification tasks (normal control (NC) vs. AD, NC vs. mild cognitive impairment (MCI), MCI vs. AD, and NC vs. MCI vs. AD). As input, all models received the same combined feature vectors, created by concatenating the PET imaging feature vectors extracted by a 3D dense convolutional network with non-imaging feature vectors consisting of clinical indicators, using a multimodal feature fusion method. An adjacency matrix for the population graph was constructed using cosine similarity or the Euclidean distance between subjects' PET imaging feature vectors and/or non-imaging feature vectors. The usage ratio of the two modalities and the edge-assignment threshold were tuned by treating them as hyperparameters. In this study, GCN-CS-com and GCN-ED-com were the GCN models that received the adjacency matrix constructed using cosine similarity (CS) and the Euclidean distance (ED), respectively, between the subjects' PET imaging feature vectors and non-imaging feature vectors.
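The feature fusion and population-graph construction described above can be sketched as follows. This is an illustrative reconstruction, not the paper's code: `alpha` (the modality usage ratio) and `threshold` (the edge-assignment cut-off) are hypothetical names for the hyperparameters the study tunes, and only the cosine-similarity variant is shown.

```python
import numpy as np

def cosine_sim(u, v):
    """Cosine similarity between two feature vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def build_adjacency(imaging, clinical, alpha=0.5, threshold=0.8):
    """Binary population-graph adjacency from two feature modalities.

    imaging  : (n_subjects, d_img) PET imaging feature vectors
    clinical : (n_subjects, d_clin) non-imaging clinical-indicator vectors
    alpha, threshold : illustrative stand-ins for the tuned hyperparameters
    (modality usage ratio and edge-assignment threshold).
    """
    n = imaging.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Blend per-modality similarities by the usage ratio alpha.
            score = (alpha * cosine_sim(imaging[i], imaging[j])
                     + (1 - alpha) * cosine_sim(clinical[i], clinical[j]))
            if score >= threshold:   # assign an edge only above the threshold
                A[i, j] = 1.0
    return A

def fuse_features(imaging, clinical):
    """Multimodal feature fusion by concatenation, one row per subject."""
    return np.concatenate([imaging, clinical], axis=1)
```

The fused matrix supplies the node features of the population graph, while `A` supplies its edges; the Euclidean-distance variant would replace `cosine_sim` with a distance where smaller values indicate more similar subjects.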
In modified nested cross-validation, GCN-CS-com and GCN-ED-com achieved average test accuracies of 98.40%, 94.58%, 94.01%, 82.63% and 99.68%, 93.82%, 93.88%, 90.43%, respectively, for the four aforementioned classification tasks on the DAUH dataset, outperforming the other models. Furthermore, GCN-CS-com and GCN-ED-com achieved average test accuracies of 76.16% and 90.11%, respectively, for NC vs. MCI vs. AD classification on the ADNI dataset, again outperforming the other models. These results demonstrate that a GCN could be an effective model for AD stage classification using multimodal data.
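For context on how the GCN models consume the adjacency matrix and fused node features, the sketch below shows the standard graph-convolution propagation rule (Kipf & Welling); this is the generic layer, not necessarily the exact architecture used in the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: ReLU(D^-1/2 (A+I) D^-1/2 H W).

    A : (n, n) population-graph adjacency matrix
    H : (n, d_in) node features (e.g. fused subject feature vectors)
    W : (d_in, d_out) learnable weight matrix
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                    # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # symmetric degree normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)  # ReLU
```

Each subject's representation is updated by averaging over its graph neighbours, which is how the edge structure built from inter-subject similarity influences the per-subject stage prediction.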

Source journal: PLoS ONE
CiteScore: 6.20 · Self-citation rate: 5.40% · Publication volume: 14242 · Review time: 3.7 months
Journal description: PLOS ONE is an international, peer-reviewed, open-access, online publication. PLOS ONE welcomes reports on primary research from any scientific discipline. It provides:
* Open-access—freely accessible online, authors retain copyright
* Fast publication times
* Peer review by expert, practicing researchers
* Post-publication tools to indicate quality and impact
* Community-based dialogue on articles
* Worldwide media coverage