Histo-Genomic Knowledge Association for Cancer Prognosis From Histopathology Whole Slide Images

Zhikang Wang, Yumeng Zhang, Yingxue Xu, Seiya Imoto, Hao Chen, Jiangning Song
{"title":"Histo-Genomic Knowledge Association for Cancer Prognosis From Histopathology Whole Slide Images","authors":"Zhikang Wang;Yumeng Zhang;Yingxue Xu;Seiya Imoto;Hao Chen;Jiangning Song","doi":"10.1109/TMI.2025.3526816","DOIUrl":null,"url":null,"abstract":"Histo-genomic multi-modal methods have emerged as a powerful paradigm, demonstrating significant potential for cancer prognosis. However, genome sequencing, unlike histopathology imaging, is still not widely accessible in underdeveloped regions, limiting the application of these multi-modal approaches in clinical settings. To address this, we propose a novel Genome-informed Hyper-Attention Network, termed G-HANet, which is capable of effectively learning the histo-genomic associations during training to elevate uni-modal whole slide image (WSI)-based inference for the first time. Compared with the potential knowledge distillation strategy for this setting (i.e., distilling a multi-modal network to a uni-modal network), our end-to-end model is superior in training efficiency and learning cross-modal interactions. Specifically, the network comprises cross-modal associating branch (CAB) and hyper-attention survival branch (HSB). Through the genomic data reconstruction from WSIs, CAB effectively distills the associations between functional genotypes and morphological phenotypes and offers insights into the gene expression profiles in the feature space. Subsequently, HSB leverages the distilled histo-genomic associations as well as the generated morphology-based weights to achieve the hyper-attention modeling of the patients from both histopathology and genomic perspectives to improve cancer prognosis. Extensive experiments are conducted on five TCGA benchmarking datasets and the results demonstrate that G-HANet significantly outperforms the state-of-the-art WSI-based methods and achieves competitive performance with genome-based and multi-modal methods. G-HANet is expected to be explored as a useful tool by the research community to address the current bottleneck of insufficient histo-genomic data pairing in the context of cancer prognosis and precision oncology. The code is available at <uri>https://github.com/ZacharyWang-007/G-HANet</uri>.","PeriodicalId":94033,"journal":{"name":"IEEE transactions on medical imaging","volume":"44 5","pages":"2170-2181"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical imaging","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10830530/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Histo-genomic multi-modal methods have emerged as a powerful paradigm with significant potential for cancer prognosis. However, genome sequencing, unlike histopathology imaging, is still not widely accessible in underdeveloped regions, which limits the application of these multi-modal approaches in clinical settings. To address this, we propose a novel Genome-informed Hyper-Attention Network, termed G-HANet, which, for the first time, effectively learns histo-genomic associations during training to improve uni-modal whole-slide image (WSI)-based inference. Compared with the potential alternative of knowledge distillation for this setting (i.e., distilling a multi-modal network into a uni-modal one), our end-to-end model is superior in training efficiency and in learning cross-modal interactions. Specifically, the network comprises a cross-modal associating branch (CAB) and a hyper-attention survival branch (HSB). By reconstructing genomic data from WSIs, the CAB distills the associations between functional genotypes and morphological phenotypes and offers insights into gene expression profiles in the feature space. The HSB then leverages the distilled histo-genomic associations, together with the generated morphology-based weights, to achieve hyper-attention modeling of patients from both the histopathology and the genomic perspectives, improving cancer prognosis. Extensive experiments on five TCGA benchmark datasets demonstrate that G-HANet significantly outperforms state-of-the-art WSI-based methods and achieves performance competitive with genome-based and multi-modal methods. We expect G-HANet to be explored by the research community as a useful tool for addressing the current bottleneck of insufficient paired histo-genomic data in cancer prognosis and precision oncology. The code is available at https://github.com/ZacharyWang-007/G-HANet.
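To make the two-branch design concrete, below is a minimal, illustrative PyTorch sketch of how a CAB-style reconstruction branch and an HSB-style attention branch could be wired together. All module names, dimensions, and the specific attention formulation here are assumptions for illustration only, not the authors' implementation; the actual code is in the repository linked above.

```python
# Illustrative sketch only: module names, dimensions, and the attention
# formulation are assumptions, not the published G-HANet implementation.
import torch
import torch.nn as nn


class CrossModalAssociatingBranch(nn.Module):
    """Hypothetical CAB: reconstructs a genomic feature vector from WSI patch
    features, so histo-genomic associations are learned at training time."""

    def __init__(self, patch_dim: int = 1024, genomic_dim: int = 256):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)           # aggregate patch features
        self.decoder = nn.Sequential(                 # slide level -> genomic space
            nn.Linear(patch_dim, 512), nn.ReLU(),
            nn.Linear(512, genomic_dim),
        )

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, num_patches, patch_dim)
        slide = self.pool(patches.transpose(1, 2)).squeeze(-1)  # (batch, patch_dim)
        return self.decoder(slide)                    # reconstructed genomic profile


class HyperAttentionSurvivalBranch(nn.Module):
    """Hypothetical HSB: scores patches from both the morphology and the
    reconstructed-genomic perspectives, then predicts a risk score."""

    def __init__(self, patch_dim: int = 1024, genomic_dim: int = 256):
        super().__init__()
        self.morph_attn = nn.Linear(patch_dim, 1)             # morphology-based weights
        self.gene_proj = nn.Linear(genomic_dim, patch_dim)    # genomic query
        self.risk_head = nn.Linear(patch_dim, 1)

    def forward(self, patches: torch.Tensor, gene_recon: torch.Tensor) -> torch.Tensor:
        morph_w = self.morph_attn(patches)                           # (B, N, 1)
        gene_w = patches @ self.gene_proj(gene_recon).unsqueeze(-1)  # (B, N, 1)
        attn = torch.softmax(morph_w + gene_w, dim=1)                # fused weights
        slide_repr = (attn * patches).sum(dim=1)                     # (B, patch_dim)
        return self.risk_head(slide_repr)                            # risk score


# Training would couple both branches: a reconstruction loss on the CAB output
# against true genomic profiles, plus a survival loss on the HSB risk score.
wsi = torch.randn(2, 500, 1024)        # 2 slides, 500 patch features each
cab = CrossModalAssociatingBranch()
hsb = HyperAttentionSurvivalBranch()
risk = hsb(wsi, cab(wsi))              # uni-modal inference: WSI features only
```

The key property this sketch preserves is the one the abstract emphasizes: genomic data is needed only as a reconstruction target during training, so at inference the risk score is computed from WSI features alone.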