Embedding-based Multimodal Learning on Pan-Squamous Cell Carcinomas for Improved Survival Outcomes

Asim Waqas, Aakash Tripathi, Paul Stewart, Mia Naeini, Ghulam Rasool
{"title":"Embedding-based Multimodal Learning on Pan-Squamous Cell Carcinomas for Improved Survival Outcomes","authors":"Asim Waqas, Aakash Tripathi, Paul Stewart, Mia Naeini, Ghulam Rasool","doi":"arxiv-2406.08521","DOIUrl":null,"url":null,"abstract":"Cancer clinics capture disease data at various scales, from genetic to organ\nlevel. Current bioinformatic methods struggle to handle the heterogeneous\nnature of this data, especially with missing modalities. We propose PARADIGM, a\nGraph Neural Network (GNN) framework that learns from multimodal, heterogeneous\ndatasets to improve clinical outcome prediction. PARADIGM generates embeddings\nfrom multi-resolution data using foundation models, aggregates them into\npatient-level representations, fuses them into a unified graph, and enhances\nperformance for tasks like survival analysis. We train GNNs on pan-Squamous\nCell Carcinomas and validate our approach on Moffitt Cancer Center lung SCC\ndata. Multimodal GNN outperforms other models in patient survival prediction.\nConverging individual data modalities across varying scales provides a more\ninsightful disease view. Our solution aims to understand the patient's\ncircumstances comprehensively, offering insights on heterogeneous data\nintegration and the benefits of converging maximum data views.","PeriodicalId":501321,"journal":{"name":"arXiv - QuanBio - Cell Behavior","volume":"65 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Cell Behavior","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2406.08521","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Cancer clinics capture disease data at various scales, from the genetic to the organ level. Current bioinformatic methods struggle to handle the heterogeneous nature of these data, especially when modalities are missing. We propose PARADIGM, a Graph Neural Network (GNN) framework that learns from multimodal, heterogeneous datasets to improve clinical outcome prediction. PARADIGM generates embeddings from multi-resolution data using foundation models, aggregates them into patient-level representations, fuses them into a unified graph, and improves performance on tasks such as survival analysis. We train GNNs on pan-Squamous Cell Carcinoma data and validate our approach on Moffitt Cancer Center lung SCC data. The multimodal GNN outperforms other models in patient survival prediction, and converging individual data modalities across varying scales provides a more insightful view of the disease. Our solution aims to understand the patient's circumstances comprehensively, offering insights into heterogeneous data integration and the benefits of converging the maximum number of data views.
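To make the described pipeline concrete, below is a minimal sketch, not the authors' PARADIGM implementation, of the downstream step the abstract outlines: precomputed patient-level embeddings are fused into a patient-similarity graph and a small GNN is trained with a survival objective. The k-nearest-neighbor graph construction, the two GCN layers, the Cox partial-likelihood loss, and all variable names and toy data are illustrative assumptions.

```python
# Minimal sketch (not the authors' released PARADIGM code): fuse per-patient
# multimodal embeddings into a similarity graph and train a GNN with a
# Cox partial-likelihood head for survival prediction.
# Assumptions: embeddings are precomputed by foundation models; the k-NN
# graph, GCN layers, and Cox head are illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, knn_graph  # knn_graph requires torch-cluster


class SurvivalGNN(nn.Module):
    def __init__(self, in_dim, hidden_dim=64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.risk = nn.Linear(hidden_dim, 1)  # per-patient log-risk score

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        h = F.relu(self.conv2(h, edge_index))
        return self.risk(h).squeeze(-1)


def cox_partial_likelihood(risk, time, event):
    """Negative Cox partial log-likelihood (Breslow-style handling of ties)."""
    order = torch.argsort(time, descending=True)   # sort so each risk set is a prefix
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)   # log of summed exp(risk) over the risk set
    return -((risk - log_cumsum) * event).sum() / event.sum().clamp(min=1)


# Toy usage: 100 patients, each with concatenated multimodal embeddings.
num_patients, emb_dim = 100, 256
x = torch.randn(num_patients, emb_dim)                 # fused patient-level embeddings
edge_index = knn_graph(x, k=8)                         # patient-similarity graph (assumption)
time = torch.rand(num_patients) * 60                   # survival time in months (toy data)
event = torch.randint(0, 2, (num_patients,)).float()   # 1 = event observed, 0 = censored

model = SurvivalGNN(emb_dim)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(50):
    optimizer.zero_grad()
    scores = model(x, edge_index)
    loss = cox_partial_likelihood(scores, time, event)
    loss.backward()
    optimizer.step()
```

The Cox head scores relative risk rather than calibrated survival curves, so training needs only follow-up times and event indicators, which makes it a common choice for survival objectives on heterogeneous clinical cohorts.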