{"title":"组织病理学图像分析中基于深度学习的可解释性和ExaMode项目","authors":"Henning Müller, M. Atzori","doi":"10.47184/tp.2023.01.05","DOIUrl":null,"url":null,"abstract":"With digital clinical workflows in histopathology departments, the possibility to use machine-learning-based decision support is increasing. Still, there are many challenges despite often good results on retrospective data. Explainable AI can help to find bias in data and also integrated decision support with other available clinical data. The ExaMode project has implemented many tools and automatic pipelines for such decision support. Most of the algorithms are available for research use and can thus be of help for other researchers in the domain.","PeriodicalId":126763,"journal":{"name":"Trillium Pathology","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep-learning-based interpretability and the ExaMode project in histopathology image analysis\",\"authors\":\"Henning Müller, M. Atzori\",\"doi\":\"10.47184/tp.2023.01.05\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With digital clinical workflows in histopathology departments, the possibility to use machine-learning-based decision support is increasing. Still, there are many challenges despite often good results on retrospective data. Explainable AI can help to find bias in data and also integrated decision support with other available clinical data. The ExaMode project has implemented many tools and automatic pipelines for such decision support. Most of the algorithms are available for research use and can thus be of help for other researchers in the domain.\",\"PeriodicalId\":126763,\"journal\":{\"name\":\"Trillium Pathology\",\"volume\":\"34 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Trillium Pathology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.47184/tp.2023.01.05\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Trillium Pathology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.47184/tp.2023.01.05","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Deep-learning-based interpretability and the ExaMode project in histopathology image analysis
With digital clinical workflows in histopathology departments, the opportunities to use machine-learning-based decision support are increasing. Still, many challenges remain despite frequently good results on retrospective data. Explainable AI can help to uncover bias in the data and can also support integrating decision support with other available clinical data. The ExaMode project has implemented many tools and automatic pipelines for such decision support. Most of the algorithms are available for research use and can thus help other researchers in the domain.
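As an illustration of the kind of interpretability output such pipelines can produce, the sketch below computes a Grad-CAM heatmap for a hypothetical histopathology patch classifier. The model (a randomly initialised torchvision ResNet-18), the two-class setup, the patch size and the hooked layer are assumptions made for this example only and do not describe the ExaMode implementation.

```python
# Minimal Grad-CAM sketch for a hypothetical histopathology patch classifier.
# Model, class count, patch size and hooked layer are illustrative assumptions,
# not the ExaMode pipeline.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(num_classes=2)  # e.g. tumour vs. non-tumour patches (assumed)
model.eval()

activations, gradients = {}, {}

def forward_hook(module, inputs, output):
    # Store the spatial feature maps of the hooked layer.
    activations["feat"] = output.detach()

def backward_hook(module, grad_input, grad_output):
    # Store the gradient of the class score w.r.t. those feature maps.
    gradients["feat"] = grad_output[0].detach()

model.layer4.register_forward_hook(forward_hook)
model.layer4.register_full_backward_hook(backward_hook)

# A random tensor stands in for a 224x224 RGB tile from a whole-slide image.
patch = torch.randn(1, 3, 224, 224)
logits = model(patch)
target = logits.argmax(dim=1).item()

model.zero_grad()
logits[0, target].backward()

# Grad-CAM: weight each feature map by its average gradient, sum and rectify.
weights = gradients["feat"].mean(dim=(2, 3), keepdim=True)    # (1, C, 1, 1)
cam = F.relu((weights * activations["feat"]).sum(dim=1))      # (1, h, w)
cam = cam / (cam.max() + 1e-8)                                # normalise to [0, 1]

# Upsample to patch resolution so the map can be overlaid on the tissue image.
heatmap = F.interpolate(cam.unsqueeze(1), size=patch.shape[-2:],
                        mode="bilinear", align_corners=False)
print(heatmap.shape)  # torch.Size([1, 1, 224, 224])
```

The resulting heatmap can be overlaid on the tissue patch so a pathologist can check whether the highlighted regions correspond to diagnostically relevant structures, which is one way such visual explanations help reveal bias or spurious cues in the training data.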