Multi-level cancer profiling through joint cell-graph representations

Luis Carlos Rivera Monroy, Leonhard Rist, Frauke Wilm, Christian Ostalecki, Andreas Baur, Julio Vera, Katharina Breininger, Andreas Maier

Smart Health, Volume 32, Article 100470 (2024). DOI: 10.1016/j.smhl.2024.100470
Abstract
Computer-aided analysis of digitized pathology samples has advanced significantly with the rapid progression of machine learning and Deep Learning (DL) methods. However, most existing approaches primarily focus on features extracted from patches due to the large image sizes. This focus limits the ability of Convolutional Neural Networks (CNNs) to capture global information from the samples, resulting in an incomplete phenotypical and topological representation and thereby restricting the diagnostic capabilities of these methods. The recent emergence of Graph Neural Networks (GNNs) offers new opportunities to overcome these limitations through graph-driven representations of pathological samples. This work introduces a graph-based framework that encompasses diverse cancer types and integrates different imaging modalities. In this framework, histopathology samples are represented as graphs, and a pipeline facilitating cell-wise and disease classification is developed. The results support this motivation: for cell-wise classification, we achieved an average accuracy of 88%, and for disease-wise classification, an average accuracy of 83%, outperforming reference models such as XGBoost and standard CNNs. This approach not only provides flexibility in combining various diseases but also extends to integrating different staining techniques.
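To make the cell-graph idea concrete, the sketch below illustrates the general pattern the abstract describes, not the authors' implementation: cells become graph nodes carrying feature vectors, edges connect spatially nearby cells, and a small GNN produces both per-cell (cell-wise) and pooled per-sample (disease-wise) predictions. The use of PyTorch Geometric, the k-nearest-neighbour graph construction, the feature dimensions, and the class counts are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a cell-graph classification pipeline (assumed design,
# not the paper's code). Nodes = cells, edges = spatial k-NN neighbours.
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool


def build_cell_graph(coords: torch.Tensor, feats: torch.Tensor, k: int = 5) -> Data:
    """Connect each cell to its k nearest spatial neighbours."""
    dists = torch.cdist(coords, coords)                     # pairwise distances
    knn = dists.topk(k + 1, largest=False).indices[:, 1:]   # drop the self column
    src = torch.arange(coords.size(0)).repeat_interleave(k)
    edge_index = torch.stack([src, knn.reshape(-1)], dim=0)
    return Data(x=feats, edge_index=edge_index, pos=coords)


class CellGraphNet(torch.nn.Module):
    """Two GCN layers with one head per task (cell type, disease)."""

    def __init__(self, in_dim: int, hidden: int, n_cell_classes: int, n_diseases: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.cell_head = torch.nn.Linear(hidden, n_cell_classes)
        self.disease_head = torch.nn.Linear(hidden, n_diseases)

    def forward(self, data: Data):
        h = self.conv1(data.x, data.edge_index).relu()
        h = self.conv2(h, data.edge_index).relu()
        cell_logits = self.cell_head(h)                      # per-node prediction
        batch = getattr(data, "batch", None)
        if batch is None:                                    # single, unbatched graph
            batch = torch.zeros(h.size(0), dtype=torch.long)
        disease_logits = self.disease_head(global_mean_pool(h, batch))  # per-sample
        return cell_logits, disease_logits


if __name__ == "__main__":
    coords = torch.rand(200, 2)     # dummy cell centroids
    feats = torch.rand(200, 16)     # dummy per-cell features (e.g., marker intensities)
    graph = build_cell_graph(coords, feats)
    model = CellGraphNet(in_dim=16, hidden=32, n_cell_classes=4, n_diseases=3)
    cell_logits, disease_logits = model(graph)
    print(cell_logits.shape, disease_logits.shape)           # (200, 4) and (1, 3)
```

The two heads share the same GNN backbone, which mirrors the joint cell-wise and disease-wise setup the abstract mentions; in practice the actual graph construction, message-passing layers, and staining-specific node features would follow the paper rather than this simplified example.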