U-Net multi-modality glioma MRIs segmentation combined with attention

Yixing Wang, Xiufen Ye
{"title":"U-Net multi-modality glioma MRIs segmentation combined with attention","authors":"Yixing Wang, Xiufen Ye","doi":"10.1109/ISBP57705.2023.10061312","DOIUrl":null,"url":null,"abstract":"Glioma, the most common primary intracranial tumor, is known as the “brain killer,” accounting for 27% of all central nervous system tumors and 80% of malignant tumors, and is one of the most difficult and refractory tumors to treat in neurosurgery. The development of medical imaging technology has simplified the diagnosis of the disease, and in order to avoid or reduce the errors of manual segmentation, deep learning based segmentation of glioma has become the hope of radiologists and clinicians. Accurate segmentation of gliomas is an important prerequisite for making glioma diagnosis, providing treatment plans and evaluating treatment outcomes. To effectively target the characteristics of multimodal glioma MRI and the shortcomings of CNNs-based, U-Net-based glioma segmentation methods, a method of 2D-CNNs segmentation results based on attention mechanism is proposed. In this study, the datasets of BraTS2018 and BraTS2019 were included and the segmentation results were evaluated using three metrics: Dice coefficient, positive predictive value, and sensitivity. The experimental results show that the proposed segmentation method can accurately segment gliomas.","PeriodicalId":309634,"journal":{"name":"2023 International Conference on Intelligent Supercomputing and BioPharma (ISBP)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Intelligent Supercomputing and BioPharma (ISBP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISBP57705.2023.10061312","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Glioma, the most common primary intracranial tumor, is known as the "brain killer": it accounts for 27% of all central nervous system tumors and 80% of malignant tumors, and is among the most difficult and refractory tumors to treat in neurosurgery. Advances in medical imaging have simplified diagnosis, and deep-learning-based glioma segmentation has become a promising way for radiologists and clinicians to avoid or reduce the errors of manual segmentation. Accurate glioma segmentation is an important prerequisite for diagnosis, treatment planning, and evaluation of treatment outcomes. To address the characteristics of multimodal glioma MRI and the shortcomings of existing CNN- and U-Net-based glioma segmentation methods, a 2D-CNN segmentation method based on an attention mechanism is proposed. The BraTS2018 and BraTS2019 datasets are used, and the segmentation results are evaluated with three metrics: the Dice coefficient, positive predictive value, and sensitivity. The experimental results show that the proposed method segments gliomas accurately.
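The abstract does not detail how the attention mechanism is combined with the 2D U-Net. Below is a minimal, illustrative sketch of an additive attention gate on a U-Net skip connection, in the style of Attention U-Net (Oktay et al., 2018); the gate design, channel counts, and framework (PyTorch) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive attention gate on a U-Net skip connection.

    Illustrative sketch only: the paper's exact attention design is not
    specified in the abstract, so this follows the common Attention U-Net
    formulation (Oktay et al., 2018).
    """

    def __init__(self, gate_channels: int, skip_channels: int, inter_channels: int):
        super().__init__()
        # 1x1 convolutions project the decoder gating signal and the
        # encoder skip features into a common intermediate space.
        self.w_g = nn.Conv2d(gate_channels, inter_channels, kernel_size=1)
        self.w_x = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, g: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # g: gating signal from the decoder, x: skip features from the encoder;
        # both are assumed to share the same spatial size here.
        attn = self.relu(self.w_g(g) + self.w_x(x))
        attn = self.sigmoid(self.psi(attn))  # per-pixel attention coefficients in [0, 1]
        return x * attn                      # suppress irrelevant encoder responses


# Hypothetical usage: gate 64-channel encoder features with a 128-channel decoder signal.
# g = torch.randn(1, 128, 60, 60); x = torch.randn(1, 64, 60, 60)
# gated = AttentionGate(128, 64, 32)(g, x)   # -> shape (1, 64, 60, 60)
```

For BraTS-style multimodal input, the four MRI sequences (T1, T1ce, T2, FLAIR) are typically stacked as input channels of such a 2D network, though the abstract does not confirm this detail.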
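The three reported metrics have standard definitions in terms of true positives (TP), false positives (FP), and false negatives (FN): Dice = 2·TP/(2·TP+FP+FN), positive predictive value = TP/(TP+FP), and sensitivity = TP/(TP+FN). The sketch below computes them for one binary tumor sub-region; the paper's exact evaluation protocol (sub-region definitions, per-case averaging) is not given in the abstract and is assumed here.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray, eps: float = 1e-6):
    """Dice coefficient, positive predictive value (precision) and sensitivity
    (recall) for a single binary segmentation mask.

    pred, target: arrays of the same shape, interpretable as 0/1 or boolean.
    Sketch of the standard definitions only.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    dice = 2 * tp / (2 * tp + fp + fn + eps)
    ppv = tp / (tp + fp + eps)
    sensitivity = tp / (tp + fn + eps)
    return dice, ppv, sensitivity
```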