CTUSurv: A Cell-Aware Transformer-Based Network With Uncertainty for Survival Prediction Using Whole Slide Images

Zhihao Tang;Lin Yang;Zongyi Chen;Li Liu;Chaozhuo Li;Ruanqi Chen;Xi Zhang;Qingfeng Zheng
{"title":"ct篡位:一种基于细胞感知变压器的不确定性网络,用于使用整个幻灯片图像进行生存预测","authors":"Zhihao Tang;Lin Yang;Zongyi Chen;Li Liu;Chaozhuo Li;Ruanqi Chen;Xi Zhang;Qingfeng Zheng","doi":"10.1109/TMI.2025.3526848","DOIUrl":null,"url":null,"abstract":"Image-based survival prediction through deep learning techniques represents a burgeoning frontier aimed at augmenting the diagnostic capabilities of pathologists. However, directly applying existing deep learning models to survival prediction may not be a panacea due to the inherent complexity and sophistication of whole slide images (WSIs). The intricate nature of high-resolution WSIs, characterized by sophisticated patterns and inherent noise, presents significant challenges in terms of effectiveness and trustworthiness. In this paper, we propose CTUSurv, a novel survival prediction model designed to simultaneously capture cell-to-cell and cell-to-microenvironment interactions, complemented by a region-based uncertainty estimation framework to assess the reliability of survival predictions. Our approach incorporates an innovative region sampling strategy to extract task-relevant, informative regions from high-resolution WSIs. To address the challenges posed by sophisticated biological patterns, a cell-aware encoding module is integrated to model the interactions among biological entities. Furthermore, CTUSurv includes a novel aleatoric uncertainty estimation module to provide fine-grained uncertainty scores at the region level. Extensive evaluations across four datasets demonstrate the superiority of our proposed approach in terms of both predictive accuracy and reliability.","PeriodicalId":94033,"journal":{"name":"IEEE transactions on medical imaging","volume":"44 4","pages":"1750-1764"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"CTUSurv: A Cell-Aware Transformer-Based Network With Uncertainty for Survival Prediction Using Whole Slide Images\",\"authors\":\"Zhihao Tang;Lin Yang;Zongyi Chen;Li Liu;Chaozhuo Li;Ruanqi Chen;Xi Zhang;Qingfeng Zheng\",\"doi\":\"10.1109/TMI.2025.3526848\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Image-based survival prediction through deep learning techniques represents a burgeoning frontier aimed at augmenting the diagnostic capabilities of pathologists. However, directly applying existing deep learning models to survival prediction may not be a panacea due to the inherent complexity and sophistication of whole slide images (WSIs). The intricate nature of high-resolution WSIs, characterized by sophisticated patterns and inherent noise, presents significant challenges in terms of effectiveness and trustworthiness. In this paper, we propose CTUSurv, a novel survival prediction model designed to simultaneously capture cell-to-cell and cell-to-microenvironment interactions, complemented by a region-based uncertainty estimation framework to assess the reliability of survival predictions. Our approach incorporates an innovative region sampling strategy to extract task-relevant, informative regions from high-resolution WSIs. To address the challenges posed by sophisticated biological patterns, a cell-aware encoding module is integrated to model the interactions among biological entities. Furthermore, CTUSurv includes a novel aleatoric uncertainty estimation module to provide fine-grained uncertainty scores at the region level. 
Extensive evaluations across four datasets demonstrate the superiority of our proposed approach in terms of both predictive accuracy and reliability.\",\"PeriodicalId\":94033,\"journal\":{\"name\":\"IEEE transactions on medical imaging\",\"volume\":\"44 4\",\"pages\":\"1750-1764\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-01-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on medical imaging\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10834512/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical imaging","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10834512/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Image-based survival prediction through deep learning techniques represents a burgeoning frontier aimed at augmenting the diagnostic capabilities of pathologists. However, directly applying existing deep learning models to survival prediction may not be a panacea due to the inherent complexity and sophistication of whole slide images (WSIs). The intricate nature of high-resolution WSIs, characterized by sophisticated patterns and inherent noise, presents significant challenges in terms of effectiveness and trustworthiness. In this paper, we propose CTUSurv, a novel survival prediction model designed to simultaneously capture cell-to-cell and cell-to-microenvironment interactions, complemented by a region-based uncertainty estimation framework to assess the reliability of survival predictions. Our approach incorporates an innovative region sampling strategy to extract task-relevant, informative regions from high-resolution WSIs. To address the challenges posed by sophisticated biological patterns, a cell-aware encoding module is integrated to model the interactions among biological entities. Furthermore, CTUSurv includes a novel aleatoric uncertainty estimation module to provide fine-grained uncertainty scores at the region level. Extensive evaluations across four datasets demonstrate the superiority of our proposed approach in terms of both predictive accuracy and reliability.
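The abstract describes a region-level aleatoric uncertainty module but does not include code. As a rough, hypothetical sketch only (not the authors' CTUSurv implementation), the PyTorch snippet below shows one common way to attach a per-region aleatoric uncertainty estimate to a risk head: predict a risk score together with a log-variance and train with a heteroscedastic Gaussian loss. All names, dimensions, and the toy regression target are invented for illustration.

```python
# Hypothetical sketch only: NOT the authors' CTUSurv implementation.
# Illustrates per-region risk prediction with learned aleatoric (data)
# uncertainty via heteroscedastic regression. All names are invented.
import torch
import torch.nn as nn


class RegionRiskWithUncertainty(nn.Module):
    """Maps per-region features to a risk score and a log-variance."""

    def __init__(self, feat_dim: int = 256, hidden_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(),
        )
        self.risk_head = nn.Linear(hidden_dim, 1)    # predicted risk per region
        self.logvar_head = nn.Linear(hidden_dim, 1)  # predicted log-variance per region

    def forward(self, region_feats: torch.Tensor):
        # region_feats: (num_regions, feat_dim), e.g. encoded WSI regions
        h = self.backbone(region_feats)
        risk = self.risk_head(h).squeeze(-1)         # (num_regions,)
        log_var = self.logvar_head(h).squeeze(-1)    # (num_regions,)
        return risk, log_var


def heteroscedastic_loss(risk, log_var, target):
    """Gaussian negative log-likelihood with a learned per-region variance.

    Regions the model deems noisy receive a large log_var, which
    down-weights their squared error at the cost of a log-variance penalty.
    """
    precision = torch.exp(-log_var)
    return (0.5 * precision * (risk - target) ** 2 + 0.5 * log_var).mean()


if __name__ == "__main__":
    model = RegionRiskWithUncertainty(feat_dim=256)
    feats = torch.randn(32, 256)   # 32 sampled regions from one slide
    target = torch.rand(32)        # toy supervision signal for this sketch
    risk, log_var = model(feats)
    loss = heteroscedastic_loss(risk, log_var, target)
    loss.backward()
    print(f"loss={loss.item():.4f}, mean variance={log_var.exp().mean().item():.4f}")
```

In a real survival setting the toy regression target would be replaced by a slide-level survival objective (for example a Cox-style partial likelihood over patient risks), with the exponentiated log-variances serving as the fine-grained, region-level uncertainty scores; the sketch is only meant to make the aleatoric-uncertainty idea concrete.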