Cross-modal alignment and contrastive learning for enhanced cancer survival prediction

Impact Factor 4.9 · JCR Q1 (Computer Science, Interdisciplinary Applications) · CAS Region 2 (Medicine)
Tengfei Li, Xuezhong Zhou, Jingyan Xue, Lili Zeng, Qiang Zhu, Ruiping Wang, Haibin Yu, Jianan Xia
DOI: 10.1016/j.cmpb.2025.108633
Journal: Computer Methods and Programs in Biomedicine, Volume 263, Article 108633
Publication date: 2025-02-07

Abstract

Background and Objective:

Integrating multimodal data, such as pathology images and genomics, is crucial for understanding cancer heterogeneity, personalized treatment complexity, and enhancing survival prediction. However, most current prognostic methods are limited to a single domain of histopathology or genomics, inevitably reducing their potential for accurate patient outcome prediction. Despite advancements in the concurrent analysis of pathology and genomic data, existing approaches inadequately address the intricate intermodal relationships.

Methods:

This paper introduces the CPathomic method for multimodal data-based survival prediction. By leveraging whole slide pathology images to guide local pathological features, the method effectively mitigates significant intermodal differences through a cross-modal representational contrastive learning module. Furthermore, it facilitates interactive learning between different modalities through cross-modal and gated attention modules.
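The paper does not reproduce its loss formulation in this abstract, but the cross-modal contrastive module it describes can be illustrated with a generic InfoNCE-style objective: paired pathology and genomic embeddings of the same patient are pulled together, while mismatched patient pairs are pushed apart. The function below is a minimal NumPy sketch under that assumption; the names, temperature value, and batch construction are illustrative, not the paper's.

```python
import numpy as np

def cross_modal_contrastive_loss(path_emb, gene_emb, temperature=0.1):
    """InfoNCE-style cross-modal contrastive loss (illustrative sketch).

    path_emb, gene_emb: (n, d) arrays where row i of each matrix is the
    pathology / genomics embedding of the same patient i.
    """
    # L2-normalize so dot products become cosine similarities
    p = path_emb / np.linalg.norm(path_emb, axis=1, keepdims=True)
    g = gene_emb / np.linalg.norm(gene_emb, axis=1, keepdims=True)
    logits = p @ g.T / temperature               # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal: patient i matched with patient i
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss drives each patient's two modality embeddings toward a shared representation space, which is the mechanism the abstract credits for mitigating intermodal differences.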

Results:

Extensive experiments on five public TCGA datasets demonstrate that the CPathomic framework effectively bridges modality gaps, consistently outperforming alternative multimodal survival prediction methods.

Conclusion:

The proposed model, CPathomic, demonstrates the potential of contrastive learning and cross-modal attention for representing and fusing multimodal data, improving the performance of patient survival prediction.
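The gated attention mechanism mentioned above can be sketched in the style commonly used for multimodal fusion: a sigmoid gate decides, per dimension, how much a tanh-transformed candidate representation contributes to the fused patient feature. This is a toy illustration of the general technique, not the paper's architecture; the weight matrices and dimensions are placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(path_feat, gene_feat, W_gate, W_val):
    """Toy gated fusion of one patient's pathology and genomics features.

    path_feat, gene_feat: 1-D feature vectors for the two modalities.
    W_gate, W_val: illustrative (out_dim, in_dim) parameter matrices.
    """
    x = np.concatenate([path_feat, gene_feat])  # joint modality vector
    gate = sigmoid(W_gate @ x)   # per-dimension gate in (0, 1)
    value = np.tanh(W_val @ x)   # candidate fused features in (-1, 1)
    return gate * value          # gate suppresses uninformative dimensions
```

In a trained model the gate learns to down-weight dimensions where one modality is noisy or uninformative, which is one common rationale for gated rather than plain additive fusion.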
Source journal: Computer Methods and Programs in Biomedicine (Engineering: Biomedical)
CiteScore: 12.30
Self-citation rate: 6.60%
Articles per year: 601
Review time: 135 days
Journal description: To encourage the development of formal computing methods, and their application in biomedical research and medical practice, by illustration of fundamental principles in biomedical informatics research; to stimulate basic research into application software design; to report the state of research of biomedical information processing projects; to report new computer methodologies applied in biomedical areas; the eventual distribution of demonstrable software to avoid duplication of effort; to provide a forum for discussion and improvement of existing software; to optimize contact between national organizations and regional user groups by promoting an international exchange of information on formal methods, standards and software in biomedicine. Computer Methods and Programs in Biomedicine covers computing methodology and software systems derived from computing science for implementation in all aspects of biomedical research and medical practice. It is designed to serve: biochemists; biologists; geneticists; immunologists; neuroscientists; pharmacologists; toxicologists; clinicians; epidemiologists; psychiatrists; psychologists; cardiologists; chemists; (radio)physicists; computer scientists; programmers and systems analysts; biomedical, clinical, electrical and other engineers; teachers of medical informatics and users of educational software.