HGMSurvNet: A two-stage hypergraph learning network for multimodal cancer survival prediction

IF 11.8 · CAS Tier 1 (Medicine) · JCR Q1 (Computer Science, Artificial Intelligence)
Saisai Ding, Linjin Li, Ge Jin, Jun Wang, Shihui Ying, Jun Shi
{"title":"HGMSurvNet:用于多模式癌症生存预测的两阶段超图学习网络","authors":"Saisai Ding ,&nbsp;Linjin Li ,&nbsp;Ge Jin ,&nbsp;Jun Wang ,&nbsp;Shihui Ying ,&nbsp;Jun Shi","doi":"10.1016/j.media.2025.103661","DOIUrl":null,"url":null,"abstract":"<div><div>Cancer survival prediction based on multimodal data (e.g., pathological slides, clinical records, and genomic profiles) has become increasingly prevalent in recent years. A key challenge of this task is obtaining an effective survival-specific global representation from patient data with highly complicated correlations. Furthermore, the absence of certain modalities is a common issue in clinical practice, which renders current multimodal methods either outdated or ineffective. This article proposes a novel two-stage hypergraph learning network, called HGMSurvNet, for multimodal cancer survival prediction. HGMSurvNet can gradually learn the higher-order global representations from the WSI-level to the patient-level for multimodal learning via multilateral correlation modeling in multiple stages. Most importantly, to address the data noise and missing modalities issues in clinical scenarios, we develop a new hypergraph convolution network with a hyperedge dropout mechanism to discard unimportant hyperedges during model training. Extensive validation experiments were conducted on six public cancer cohorts from TCGA. The results demonstrated that the proposed method consistently outperforms state-of-the-art methods. We also demonstrate the interpretable analysis of HGMSurvNet and its application potential in pathological images and patient modeling, which has valuable clinical significance for the survival prognosis.</div></div>","PeriodicalId":18328,"journal":{"name":"Medical image analysis","volume":"104 ","pages":"Article 103661"},"PeriodicalIF":11.8000,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"HGMSurvNet: A two-stage hypergraph learning network for multimodal cancer survival prediction\",\"authors\":\"Saisai Ding ,&nbsp;Linjin Li ,&nbsp;Ge Jin ,&nbsp;Jun Wang ,&nbsp;Shihui Ying ,&nbsp;Jun Shi\",\"doi\":\"10.1016/j.media.2025.103661\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Cancer survival prediction based on multimodal data (e.g., pathological slides, clinical records, and genomic profiles) has become increasingly prevalent in recent years. A key challenge of this task is obtaining an effective survival-specific global representation from patient data with highly complicated correlations. Furthermore, the absence of certain modalities is a common issue in clinical practice, which renders current multimodal methods either outdated or ineffective. This article proposes a novel two-stage hypergraph learning network, called HGMSurvNet, for multimodal cancer survival prediction. HGMSurvNet can gradually learn the higher-order global representations from the WSI-level to the patient-level for multimodal learning via multilateral correlation modeling in multiple stages. Most importantly, to address the data noise and missing modalities issues in clinical scenarios, we develop a new hypergraph convolution network with a hyperedge dropout mechanism to discard unimportant hyperedges during model training. Extensive validation experiments were conducted on six public cancer cohorts from TCGA. The results demonstrated that the proposed method consistently outperforms state-of-the-art methods. 
We also demonstrate the interpretable analysis of HGMSurvNet and its application potential in pathological images and patient modeling, which has valuable clinical significance for the survival prognosis.</div></div>\",\"PeriodicalId\":18328,\"journal\":{\"name\":\"Medical image analysis\",\"volume\":\"104 \",\"pages\":\"Article 103661\"},\"PeriodicalIF\":11.8000,\"publicationDate\":\"2025-05-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Medical image analysis\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1361841525002087\",\"RegionNum\":1,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Medical image analysis","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1361841525002087","RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Cancer survival prediction based on multimodal data (e.g., pathological slides, clinical records, and genomic profiles) has become increasingly prevalent in recent years. A key challenge of this task is obtaining an effective survival-specific global representation from patient data with highly complicated correlations. Furthermore, the absence of certain modalities is a common issue in clinical practice, which renders current multimodal methods either outdated or ineffective. This article proposes a novel two-stage hypergraph learning network, called HGMSurvNet, for multimodal cancer survival prediction. HGMSurvNet can gradually learn the higher-order global representations from the WSI-level to the patient-level for multimodal learning via multilateral correlation modeling in multiple stages. Most importantly, to address the data noise and missing modalities issues in clinical scenarios, we develop a new hypergraph convolution network with a hyperedge dropout mechanism to discard unimportant hyperedges during model training. Extensive validation experiments were conducted on six public cancer cohorts from TCGA. The results demonstrated that the proposed method consistently outperforms state-of-the-art methods. We also demonstrate the interpretable analysis of HGMSurvNet and its application potential in pathological images and patient modeling, which has valuable clinical significance for the survival prognosis.
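The paper's implementation is not reproduced on this page, so the following is only a minimal PyTorch sketch of the hyperedge-dropout idea the abstract describes: an HGNN-style hypergraph convolution in which whole hyperedges (columns of the incidence matrix) are randomly discarded during training. The class name, layer form, and dropout placement are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class HyperedgeDropoutHGConv(nn.Module):
    """Hypergraph convolution with hyperedge dropout (illustrative sketch).

    Computes x' = Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} x Theta with unit hyperedge
    weights; at training time, entire hyperedges are randomly zeroed out.
    NOT the authors' implementation -- a generic HGNN-style layer for intuition.
    """

    def __init__(self, in_dim: int, out_dim: int, edge_drop_p: float = 0.2):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)
        self.edge_drop_p = edge_drop_p

    def forward(self, x: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features; H: (N, E) binary node-hyperedge incidence.
        if self.training and self.edge_drop_p > 0:
            # Hyperedge dropout: discard whole hyperedges by zeroing columns of H.
            keep = torch.rand(H.size(1), device=H.device) > self.edge_drop_p
            H = H * keep.float().unsqueeze(0)
        # Node/hyperedge degrees, clamped so dropped or isolated entries are safe.
        dv = H.sum(dim=1).clamp(min=1e-6)  # (N,)
        de = H.sum(dim=0).clamp(min=1e-6)  # (E,)
        x = self.theta(x) * dv.rsqrt().unsqueeze(1)      # Dv^{-1/2} x Theta
        edge_msg = (H.t() @ x) / de.unsqueeze(1)         # De^{-1} H^T (...)
        return (H @ edge_msg) * dv.rsqrt().unsqueeze(1)  # Dv^{-1/2} H (...)


if __name__ == "__main__":
    x = torch.randn(8, 16)                    # 8 nodes with 16-d features
    H = (torch.rand(8, 4) < 0.5).float()      # random incidence, 4 hyperedges
    layer = HyperedgeDropoutHGConv(16, 32).train()
    print(layer(x, H).shape)                  # torch.Size([8, 32])
```

Dropping columns of H rather than individual node features removes entire higher-order relations at once, which is one plausible way to make the learned representation robust to noisy or missing connections, as the abstract motivates.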
Source journal
Medical Image Analysis (Engineering Technology – Biomedical Engineering)
CiteScore: 22.10
Self-citation rate: 6.40%
Articles per year: 309
Review time: 6.6 months
Aims and scope: Medical Image Analysis serves as a platform for sharing new research findings in the realm of medical and biological image analysis, with a focus on applications of computer vision, virtual reality, and robotics to biomedical imaging challenges. The journal prioritizes the publication of high-quality, original papers contributing to the fundamental science of processing, analyzing, and utilizing medical and biological images. It welcomes approaches utilizing biomedical image datasets across all spatial scales, from molecular/cellular imaging to tissue/organ imaging.