Hypercomplex Graph Neural Network: Towards Deep Intersection of Multi-modal Brain Networks

Yanwu Yang, Chenfei Ye, Guoqing Cai, Kunru Song, Jintao Zhang, Yang Xiang, Ting Ma

IEEE Journal of Biomedical and Health Informatics, published 2024-11-01
DOI: 10.1109/JBHI.2024.3490664 (https://doi.org/10.1109/JBHI.2024.3490664)
Journal Article · JCR Q1 (Computer Science, Information Systems) · Impact Factor 6.7
Citations: 0
Abstract
Multi-modal neuroimaging studies have provided insight into the heteromodal relationships between brain network organization and behavioral phenotypes. Integrating data from various modalities facilitates characterizing the interplay among anatomical, functional, and physiological brain alterations or developments. Graph Neural Networks (GNNs) have recently become popular for analyzing and fusing multi-modal, graph-structured brain networks. However, effectively learning complementary representations from other modalities remains a significant challenge due to the sophisticated and heterogeneous inter-modal dependencies. Furthermore, most existing studies focus on specific modalities (e.g., only fMRI and DTI), which limits their scalability to other types of brain networks. To overcome these limitations, we propose a HyperComplex Graph Neural Network (HC-GNN) that models multi-modal networks as hypercomplex tensor graphs. In our approach, HC-GNN is conceptualized as a dynamic spatial graph whose adjacency matrix encodes attentively learned inter-modal associations. HC-GNN leverages hypercomplex operations for inter-modal intersections through cross-embedding and cross-aggregation, enriching the deep coupling of multi-modal representations. We conduct a statistical analysis of the saliency maps to identify associated disease biomarkers. Extensive experiments on three datasets demonstrate the superior classification performance of our method and its strong scalability to various modality types. Our work presents a powerful paradigm for the study of multi-modal brain networks.
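The abstract does not spell out which hypercomplex algebra HC-GNN uses. A common concrete choice in hypercomplex neural networks is the quaternion Hamilton product, which couples four feature channels (here, loosely analogous to four modalities) in a single multiplication, so that every output channel mixes all four inputs. The sketch below is an illustrative minimal hypercomplex graph layer under that assumption, not the authors' implementation; the function names `hamilton_product` and `hypercomplex_gnn_layer` are hypothetical.

```python
import numpy as np

def hamilton_product(q1, q2):
    """Hamilton product of quaternion-valued feature arrays.

    q1, q2: arrays whose last axis has size 4, holding (r, i, j, k)
    components. Each output component mixes all four input components,
    which is the channel-coupling property hypercomplex layers exploit.
    """
    r1, i1, j1, k1 = np.moveaxis(q1, -1, 0)
    r2, i2, j2, k2 = np.moveaxis(q2, -1, 0)
    return np.stack([
        r1 * r2 - i1 * i2 - j1 * j2 - k1 * k2,  # real part
        r1 * i2 + i1 * r2 + j1 * k2 - k1 * j2,  # i part
        r1 * j2 - i1 * k2 + j1 * r2 + k1 * i2,  # j part
        r1 * k2 + i1 * j2 - j1 * i2 + k1 * r2,  # k part
    ], axis=-1)

def hypercomplex_gnn_layer(X, A, W):
    """One toy message-passing step on quaternion node features.

    X: node features of shape (N, d, 4), i.e. d quaternion channels per node.
    A: row-normalized (e.g. attention-derived) adjacency of shape (N, N).
    W: a quaternion weight of shape (d, 4), applied per channel.

    Each node feature is transformed by a Hamilton product (mixing the four
    hypercomplex components), then aggregated over graph neighbours via A.
    """
    H = hamilton_product(X, W)              # per-node hypercomplex transform
    return np.einsum('nm,mdc->ndc', A, H)   # neighbourhood aggregation
```

With the identity quaternion (1, 0, 0, 0) as the weight and an identity adjacency, the layer reduces to a no-op, which is a convenient sanity check; replacing `W` with a learned parameter and `A` with a softmax-normalized attention matrix gives the usual trainable variant of this kind of layer.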
Journal Introduction:
IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.