Agnostic-Specific Modality Learning for Cancer Survival Prediction from Multiple Data.

IF 6.7 · CAS Tier 2 (Medicine) · JCR Q1, Computer Science, Information Systems
Honglei Liu, Yi Shi, Ying Xu, Ao Li, Minghui Wang
{"title":"Agnostic-Specific Modality Learning for Cancer Survival Prediction from Multiple Data.","authors":"Honglei Liu, Yi Shi, Ying Xu, Ao Li, Minghui Wang","doi":"10.1109/JBHI.2024.3481310","DOIUrl":null,"url":null,"abstract":"<p><p>Cancer is a pressing public health problem and one of the main causes of mortality worldwide. The development of advanced computational methods for predicting cancer survival is pivotal in aiding clinicians to formulate effective treatment strategies and improve patient quality of life. Recent advances in survival prediction methods show that integrating diverse information from various cancer-related data, such as pathological images and genomics, is crucial for improving prediction accuracy. Despite promising results of existing approaches, there are great challenges of modality gap and semantic redundancy presented in multiple cancer data, which could hinder the comprehensive integration and pose substantial obstacles to further enhancing cancer survival prediction. In this study, we propose a novel agnostic-specific modality learning (ASML) framework for accurate cancer survival prediction. To bridge the modality gap and provide a comprehensive view of distinct data modalities, we employ an agnostic-specific learning strategy to learn the commonality across modalities and the uniqueness of each modality. Moreover, a cross-modal fusion network is exerted to integrate multimodal information by modeling modality correlations and diminish semantic redundancy in a divide-and-conquer manner. Extensive experiment results on three TCGA datasets demonstrate that ASML reaches better performance than other existing cancer survival prediction methods for multiple data.</p>","PeriodicalId":13073,"journal":{"name":"IEEE Journal of Biomedical and Health Informatics","volume":null,"pages":null},"PeriodicalIF":6.7000,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Biomedical and Health Informatics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/JBHI.2024.3481310","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Cancer is a pressing public health problem and one of the leading causes of mortality worldwide. Developing advanced computational methods for predicting cancer survival is pivotal in helping clinicians formulate effective treatment strategies and improve patient quality of life. Recent advances in survival prediction show that integrating diverse information from cancer-related data, such as pathological images and genomics, is crucial for improving prediction accuracy. Despite the promising results of existing approaches, multimodal cancer data present substantial challenges of modality gap and semantic redundancy, which can hinder comprehensive integration and pose obstacles to further improving cancer survival prediction. In this study, we propose a novel agnostic-specific modality learning (ASML) framework for accurate cancer survival prediction. To bridge the modality gap and provide a comprehensive view of distinct data modalities, we employ an agnostic-specific learning strategy that learns the commonality shared across modalities and the uniqueness of each modality. Moreover, a cross-modal fusion network integrates multimodal information by modeling modality correlations and diminishing semantic redundancy in a divide-and-conquer manner. Extensive experimental results on three TCGA datasets demonstrate that ASML outperforms existing multimodal cancer survival prediction methods.
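The abstract describes the architecture only at a high level, so the following is a minimal, illustrative sketch of how a shared (modality-agnostic) encoder, per-modality (specific) encoders, a simple fusion head, and a Cox partial-likelihood survival loss could be wired together. It is not the authors' implementation: the module names, hidden sizes, the concatenation-based fusion (the paper instead uses a cross-modal fusion network that models modality correlations), and the Cox loss are all assumptions made for illustration.

# Hypothetical sketch, not the ASML release: shared + specific encoders,
# naive concatenation fusion, and a Cox partial-likelihood survival loss.
import torch
import torch.nn as nn


class AgnosticSpecificEncoder(nn.Module):
    """One shared (modality-agnostic) encoder applied to every modality,
    plus a private (modality-specific) encoder per modality."""

    def __init__(self, in_dims, hidden_dim=256):
        super().__init__()
        # Project each modality (e.g. pathology features, genomics) to a common size.
        self.projections = nn.ModuleList([nn.Linear(d, hidden_dim) for d in in_dims])
        # Shared encoder captures cross-modal commonality.
        self.agnostic = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        # Per-modality encoders capture modality-specific information.
        self.specific = nn.ModuleList(
            [nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU()) for _ in in_dims]
        )

    def forward(self, xs):
        # xs: list of tensors, one per modality, each of shape (batch, in_dim_m)
        projected = [proj(x) for proj, x in zip(self.projections, xs)]
        agnostic_feats = [self.agnostic(h) for h in projected]
        specific_feats = [enc(h) for enc, h in zip(self.specific, projected)]
        return agnostic_feats, specific_feats


class SurvivalHead(nn.Module):
    """Fuses agnostic and specific features and outputs a per-patient risk score."""

    def __init__(self, hidden_dim=256, n_modalities=2):
        super().__init__()
        fused_dim = hidden_dim * n_modalities * 2  # agnostic + specific per modality
        self.risk = nn.Sequential(
            nn.Linear(fused_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )

    def forward(self, agnostic_feats, specific_feats):
        fused = torch.cat(agnostic_feats + specific_feats, dim=-1)
        return self.risk(fused).squeeze(-1)  # (batch,) risk scores


def cox_partial_likelihood_loss(risk, time, event):
    """Negative Cox partial likelihood; event is 1 for observed deaths, 0 if censored."""
    order = torch.argsort(time, descending=True)   # build risk sets by descending time
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)   # log sum of exp(risk) over each risk set
    return -torch.sum((risk - log_cumsum) * event) / event.sum().clamp(min=1)

A typical call might be encoder = AgnosticSpecificEncoder([768, 2000]) for, say, 768-dimensional pathology-image features and 2000-dimensional gene-expression profiles, followed by head = SurvivalHead(n_modalities=2); both dimensionalities are placeholders, not values reported by the paper.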

Source journal

IEEE Journal of Biomedical and Health Informatics
Categories: Computer Science, Information Systems; Computer Science, Interdisciplinary Applications
CiteScore: 13.60
Self-citation rate: 6.50%
Articles per year: 1151
Journal description: IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.