Enhancing brain tumor segmentation in MRI images: A hybrid approach using UNet, attention mechanisms, and transformers

IF 5.0 · CAS Tier 3 (Computer Science) · JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Thien B. Nguyen-Tat , Thien-Qua T. Nguyen , Hieu-Nghia Nguyen , Vuong M. Ngo
{"title":"增强核磁共振成像图像中的脑肿瘤分割:使用 UNet、注意力机制和变压器的混合方法","authors":"Thien B. Nguyen-Tat ,&nbsp;Thien-Qua T. Nguyen ,&nbsp;Hieu-Nghia Nguyen ,&nbsp;Vuong M. Ngo","doi":"10.1016/j.eij.2024.100528","DOIUrl":null,"url":null,"abstract":"<div><p>Accurate brain tumor segmentation in MRI images is crucial for effective treatment planning and monitoring. Traditional methods often encounter challenges due to the complexity and variability of tumor shapes and textures. Consequently, there is a growing need for automated solutions to assist healthcare professionals in segmentation tasks, improving efficiency and reducing workload. This study introduces an innovative method for accurately segmenting brain tumors in MRI images by employing a refined 3D UNet model integrated with a Transformer. The goal is to leverage self-attention mechanisms to enhance segmentation capabilities. The proposed model combines Contextual Transformer (CoT) and Double Attention (DA) architectures. CoT is extended to a 3D format and integrated with the baseline model to exploit intricate contextual details in MRI images. DA blocks in skip connections aggregate and distribute long-range features, emphasizing inter-dependencies within an expanded spatial scope. Experimental results demonstrate superior segmentation performance compared to current state-of-the-art methods. With its ability to accurately segment and delineate tumors in 3D, our segmentation model promises to be a powerful tool for medical image processing and performance optimization, saving time for healthcare professionals and healthcare systems.</p></div>","PeriodicalId":56010,"journal":{"name":"Egyptian Informatics Journal","volume":null,"pages":null},"PeriodicalIF":5.0000,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1110866524000914/pdfft?md5=2097a3a3d4a288323a47198f8f29bd1c&pid=1-s2.0-S1110866524000914-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Enhancing brain tumor segmentation in MRI images: A hybrid approach using UNet, attention mechanisms, and transformers\",\"authors\":\"Thien B. Nguyen-Tat ,&nbsp;Thien-Qua T. Nguyen ,&nbsp;Hieu-Nghia Nguyen ,&nbsp;Vuong M. Ngo\",\"doi\":\"10.1016/j.eij.2024.100528\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Accurate brain tumor segmentation in MRI images is crucial for effective treatment planning and monitoring. Traditional methods often encounter challenges due to the complexity and variability of tumor shapes and textures. Consequently, there is a growing need for automated solutions to assist healthcare professionals in segmentation tasks, improving efficiency and reducing workload. This study introduces an innovative method for accurately segmenting brain tumors in MRI images by employing a refined 3D UNet model integrated with a Transformer. The goal is to leverage self-attention mechanisms to enhance segmentation capabilities. The proposed model combines Contextual Transformer (CoT) and Double Attention (DA) architectures. CoT is extended to a 3D format and integrated with the baseline model to exploit intricate contextual details in MRI images. DA blocks in skip connections aggregate and distribute long-range features, emphasizing inter-dependencies within an expanded spatial scope. Experimental results demonstrate superior segmentation performance compared to current state-of-the-art methods. 
With its ability to accurately segment and delineate tumors in 3D, our segmentation model promises to be a powerful tool for medical image processing and performance optimization, saving time for healthcare professionals and healthcare systems.</p></div>\",\"PeriodicalId\":56010,\"journal\":{\"name\":\"Egyptian Informatics Journal\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":5.0000,\"publicationDate\":\"2024-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S1110866524000914/pdfft?md5=2097a3a3d4a288323a47198f8f29bd1c&pid=1-s2.0-S1110866524000914-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Egyptian Informatics Journal\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1110866524000914\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Egyptian Informatics Journal","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1110866524000914","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract


Accurate brain tumor segmentation in MRI images is crucial for effective treatment planning and monitoring. Traditional methods often encounter challenges due to the complexity and variability of tumor shapes and textures. Consequently, there is a growing need for automated solutions to assist healthcare professionals in segmentation tasks, improving efficiency and reducing workload. This study introduces an innovative method for accurately segmenting brain tumors in MRI images by employing a refined 3D UNet model integrated with a Transformer. The goal is to leverage self-attention mechanisms to enhance segmentation capabilities. The proposed model combines Contextual Transformer (CoT) and Double Attention (DA) architectures. CoT is extended to a 3D format and integrated with the baseline model to exploit intricate contextual details in MRI images. DA blocks in skip connections aggregate and distribute long-range features, emphasizing inter-dependencies within an expanded spatial scope. Experimental results demonstrate superior segmentation performance compared to current state-of-the-art methods. With its ability to accurately segment and delineate tumors in 3D, our segmentation model promises to be a powerful tool for medical image processing and performance optimization, saving time for healthcare professionals and healthcare systems.
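The abstract describes the architecture only at a high level; the two building blocks it names can be illustrated with short sketches in PyTorch. The paper itself does not provide this code, and the class names `CoT3D` and `DoubleAttention3D`, the channel widths, and the exact wiring below are assumptions for illustration, not the authors' implementation.

First, a minimal 3D Contextual Transformer block: a 3x3x3 convolution over the input produces static contextual keys, an attention map predicted from the concatenated keys and queries re-weights the pointwise-projected values to form dynamic context, and the two contexts are summed. The sigmoid gating here is a simplification of CoT's local softmax aggregation.

```python
import torch
import torch.nn as nn


class CoT3D(nn.Module):
    """Minimal 3D Contextual Transformer block (illustrative sketch, not the paper's code)."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Static context: a local 3D convolution over the input acts as the keys.
        self.key_embed = nn.Sequential(
            nn.Conv3d(channels, channels, kernel_size, padding=pad, bias=False),
            nn.BatchNorm3d(channels),
            nn.ReLU(inplace=True),
        )
        # Values: a pointwise projection of the input.
        self.value_embed = nn.Sequential(
            nn.Conv3d(channels, channels, 1, bias=False),
            nn.BatchNorm3d(channels),
        )
        # Attention map predicted from the concatenated keys and queries.
        self.attention = nn.Sequential(
            nn.Conv3d(2 * channels, channels // 2, 1, bias=False),
            nn.BatchNorm3d(channels // 2),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels // 2, channels, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k = self.key_embed(x)                      # static contextual keys
        v = self.value_embed(x)                    # values
        attn = self.attention(torch.cat([k, x], dim=1))
        dynamic = torch.sigmoid(attn) * v          # simplified dynamic context
        return k + dynamic                         # fuse static and dynamic context


if __name__ == "__main__":
    block = CoT3D(channels=32)
    out = block(torch.randn(1, 32, 16, 32, 32))    # (B, C, D, H, W)
    print(out.shape)                               # torch.Size([1, 32, 16, 32, 32])
```

Second, a minimal Double Attention block as it might sit on a skip connection: a gather step uses second-order attention pooling to condense the whole volume into a few global descriptors, and a distribute step lets every voxel softly select from those descriptors, spreading long-range inter-dependencies across the expanded spatial scope the abstract mentions.

```python
from typing import Optional

import torch
import torch.nn as nn


class DoubleAttention3D(nn.Module):
    """Minimal 3D Double Attention block (illustrative sketch, not the paper's code)."""

    def __init__(self, in_channels: int,
                 c_m: Optional[int] = None, c_n: Optional[int] = None):
        super().__init__()
        c_m = c_m or in_channels // 2              # assumed descriptor widths
        c_n = c_n or in_channels // 2
        self.conv_a = nn.Conv3d(in_channels, c_m, 1)  # feature maps to be gathered
        self.conv_b = nn.Conv3d(in_channels, c_n, 1)  # gathering attention
        self.conv_v = nn.Conv3d(in_channels, c_n, 1)  # distribution attention
        self.proj = nn.Conv3d(c_m, in_channels, 1)    # back to the input width

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, _, d, h, w = x.shape
        a = self.conv_a(x).flatten(2)                        # (B, c_m, N)
        attn_b = self.conv_b(x).flatten(2).softmax(dim=-1)   # softmax over voxels
        attn_v = self.conv_v(x).flatten(2).softmax(dim=1)    # softmax over descriptors
        g = torch.bmm(a, attn_b.transpose(1, 2))             # gather: (B, c_m, c_n)
        z = torch.bmm(g, attn_v).view(b, -1, d, h, w)        # distribute: (B, c_m, D, H, W)
        return x + self.proj(z)                              # residual fusion


if __name__ == "__main__":
    # Hypothetical use on a skip connection: refine the encoder feature map
    # before it is concatenated with the matching decoder feature map.
    skip = torch.randn(1, 64, 16, 32, 32)
    refined = DoubleAttention3D(in_channels=64)(skip)
    print(refined.shape)                                     # torch.Size([1, 64, 16, 32, 32])
```

A plausible (assumed) arrangement is to pass each encoder feature map through `DoubleAttention3D` before concatenation in the decoder, while `CoT3D` blocks augment the convolutional stages of the 3D UNet; the authors' actual placement and hyperparameters may differ.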

Source journal
Egyptian Informatics Journal
Subject category: Decision Sciences (Management Science and Operations Research)
CiteScore: 11.10
Self-citation rate: 1.90%
Number of articles: 59
Review time: 110 days
Journal description: The Egyptian Informatics Journal is published by the Faculty of Computers and Artificial Intelligence, Cairo University. The Journal provides a forum for state-of-the-art research and development in the fields of computing, including computer science, information technologies, information systems, operations research, and decision support. Submission of innovative, previously unpublished work in subjects covered by the Journal is encouraged, whether from academic, research, or commercial sources.