{"title":"不平衡多模态数据的置信度感知自适应融合学习用于肿瘤诊断和预后。","authors":"Ziye Zhang, Shijin Wang, Yuying Huang, XiaoRou Zheng, Shoubin Dong","doi":"10.1109/JBHI.2025.3582626","DOIUrl":null,"url":null,"abstract":"<p><p>The effective fusion of pathological images and molecular omics holds significant potential for precision medicine. However, pathological and molecular data are highly heterogeneous, and large-scale multi-modal cancer data often suffer from incomplete information. Predicting clinical tasks from such imbalanced multi-modal data presents a major challenge. Therefore, we propose a confidence-aware adaptive fusion framework CAFusion. The framework adopts a modular design, providing independent and flexible modal feature learning modules to capture high-quality features. To address issues of modal imbalance caused by heterogeneous and incomplete modal, we design a confidence-aware method that evaluates the features of each modal and automatically adjusts their weights. To effectively fuse pathological and molecular modals, we propose an adaptive deep network, which features a flexible, non-fixed layer structure that effectively extracts hidden joint information from multi-modal features, ensuring high generalizability. Experiment results demonstrate that the performance of the CAFusion framework outperforms other state-of-the-art methods, both on complete and incomplete datasets. Moreover, the CAFusion framework offers reasonable medical interpretability. The source code is available at GitHub: https://github.com/SCUT-CCNL/CAFusion.</p>","PeriodicalId":13073,"journal":{"name":"IEEE Journal of Biomedical and Health Informatics","volume":"PP ","pages":""},"PeriodicalIF":6.7000,"publicationDate":"2025-06-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Confidence-aware adaptive fusion leaning of imbalance multi-modal data for cancer diagnosis and prognosis.\",\"authors\":\"Ziye Zhang, Shijin Wang, Yuying Huang, XiaoRou Zheng, Shoubin Dong\",\"doi\":\"10.1109/JBHI.2025.3582626\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The effective fusion of pathological images and molecular omics holds significant potential for precision medicine. However, pathological and molecular data are highly heterogeneous, and large-scale multi-modal cancer data often suffer from incomplete information. Predicting clinical tasks from such imbalanced multi-modal data presents a major challenge. Therefore, we propose a confidence-aware adaptive fusion framework CAFusion. The framework adopts a modular design, providing independent and flexible modal feature learning modules to capture high-quality features. To address issues of modal imbalance caused by heterogeneous and incomplete modal, we design a confidence-aware method that evaluates the features of each modal and automatically adjusts their weights. To effectively fuse pathological and molecular modals, we propose an adaptive deep network, which features a flexible, non-fixed layer structure that effectively extracts hidden joint information from multi-modal features, ensuring high generalizability. Experiment results demonstrate that the performance of the CAFusion framework outperforms other state-of-the-art methods, both on complete and incomplete datasets. Moreover, the CAFusion framework offers reasonable medical interpretability. 
The source code is available at GitHub: https://github.com/SCUT-CCNL/CAFusion.</p>\",\"PeriodicalId\":13073,\"journal\":{\"name\":\"IEEE Journal of Biomedical and Health Informatics\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":6.7000,\"publicationDate\":\"2025-06-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Biomedical and Health Informatics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1109/JBHI.2025.3582626\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Biomedical and Health Informatics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/JBHI.2025.3582626","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Confidence-aware adaptive fusion leaning of imbalance multi-modal data for cancer diagnosis and prognosis.
The effective fusion of pathological images and molecular omics holds significant potential for precision medicine. However, pathological and molecular data are highly heterogeneous, and large-scale multi-modal cancer data often suffer from incomplete information. Performing clinical prediction tasks on such imbalanced multi-modal data therefore presents a major challenge. We propose CAFusion, a confidence-aware adaptive fusion framework. The framework adopts a modular design, providing independent and flexible modality-specific feature learning modules to capture high-quality features. To address modality imbalance caused by heterogeneous and incomplete modalities, we design a confidence-aware method that evaluates the features of each modality and automatically adjusts their weights. To effectively fuse the pathological and molecular modalities, we propose an adaptive deep network with a flexible, non-fixed layer structure that extracts hidden joint information from multi-modal features, ensuring high generalizability. Experimental results demonstrate that CAFusion outperforms other state-of-the-art methods on both complete and incomplete datasets. Moreover, the CAFusion framework offers reasonable medical interpretability. The source code is available on GitHub: https://github.com/SCUT-CCNL/CAFusion.
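To make the confidence-aware weighting idea concrete, the sketch below shows one plausible way such a mechanism could be wired up in PyTorch: each modality encoder's feature vector is scored by a small confidence head, missing modalities are masked out, and the renormalized confidences weight the fused representation. This is a minimal illustrative sketch under assumed design choices (the class name ConfidenceWeightedFusion, the per-modality sigmoid confidence heads, and the weighted-sum fusion are all hypothetical), not the authors' implementation; the actual CAFusion code is in the linked repository.

```python
# Illustrative sketch of confidence-weighted multi-modal fusion.
# NOT the CAFusion implementation; see https://github.com/SCUT-CCNL/CAFusion for the real code.
import torch
import torch.nn as nn


class ConfidenceWeightedFusion(nn.Module):
    """Fuse per-modality features using learned confidence scores (hypothetical design).

    Each modality gets a scalar confidence in [0, 1]; missing modalities are
    masked out, and the remaining confidences are renormalized before fusion.
    """

    def __init__(self, feat_dim: int, n_modalities: int, n_classes: int):
        super().__init__()
        # One small confidence head per modality (assumed design choice).
        self.conf_heads = nn.ModuleList(
            [nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())
             for _ in range(n_modalities)]
        )
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, feats: list[torch.Tensor], mask: torch.Tensor):
        # feats: list of (batch, feat_dim) tensors, one per modality
        # mask:  (batch, n_modalities), 1 if the modality is present, else 0
        confs = torch.cat(
            [head(f) for head, f in zip(self.conf_heads, feats)], dim=1
        )                                    # (batch, n_modalities)
        confs = confs * mask                 # zero out missing modalities
        weights = confs / confs.sum(dim=1, keepdim=True).clamp_min(1e-8)
        stacked = torch.stack(feats, dim=1)  # (batch, n_modalities, feat_dim)
        fused = (weights.unsqueeze(-1) * stacked).sum(dim=1)
        return self.classifier(fused), weights


if __name__ == "__main__":
    # Toy usage: two modalities (e.g. pathology and omics embeddings),
    # with the second modality missing for the second sample.
    torch.manual_seed(0)
    model = ConfidenceWeightedFusion(feat_dim=64, n_modalities=2, n_classes=2)
    path_feat = torch.randn(2, 64)
    omics_feat = torch.randn(2, 64)
    mask = torch.tensor([[1.0, 1.0], [1.0, 0.0]])
    logits, weights = model([path_feat, omics_feat], mask)
    print(logits.shape, weights)
```

Masking before renormalization means a sample with only one available modality still receives a valid (all-weight-on-one-modality) fusion, which is one simple way to handle the incomplete-modality setting the abstract describes.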
About the journal:
IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.