Multi-Attentional U-Net for Medical Image Segmentation

Zhifang Hong, Hewen Xi, Weijie Hu, Qing Wang, Jiayi Wang, Lingli Luo, Xiying Zhan, Yuping Wang, Junxi Chen, Lingna Chen
{"title":"Multi-Attentional U-Net for Medical Image Segmentation","authors":"Zhifang Hong, Hewen Xi, Weijie Hu, Qing Wang, Jiayi Wang, Lingli Luo, Xiying Zhan, Yuping Wang, Junxi Chen, Lingna Chen","doi":"10.1109/ISAIAM55748.2022.00033","DOIUrl":null,"url":null,"abstract":"Deep learning (DL) approaches for image segmentation have been gaining state-of-the-art performance in recent years. Particularly, in deep learning, U-Net model has been successfully used in the field of image segmentation. However, traditional U-Net methods extract features, aggregate remote information, and reconstruct images by stacking convolution, pooling, and up sampling blocks. The traditional approach is very inefficient due of the stacked local operators. In this paper, we propose the multi-attentional U-Net that is equipped with non-local blocks based self-attention, channel-attention, and spatial-attention for image segmentation. These blocks can be inserted into U-Net to flexibly aggregate information on the plane and spatial scales. We perform and evaluate the multi-attentional U-Net model on three benchmark data sets, which are COVID-19 segmentation, skin cancer segmentation, thyroid nodules segmentation. Results show that our proposed models achieve better performances with faster computation and fewer parameters. 
The multi-attention U-Net can improve the medical image segmentation results.","PeriodicalId":382895,"journal":{"name":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","volume":"148 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISAIAM55748.2022.00033","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Deep learning (DL) approaches for image segmentation have achieved state-of-the-art performance in recent years. In particular, the U-Net model has been used successfully in the field of image segmentation. However, traditional U-Net methods extract features, aggregate remote information, and reconstruct images by stacking convolution, pooling, and upsampling blocks. This traditional approach is inefficient because of the stacked local operators. In this paper, we propose the multi-attentional U-Net, which is equipped with non-local blocks providing self-attention, channel-attention, and spatial-attention for image segmentation. These blocks can be inserted into U-Net to flexibly aggregate information at the planar and spatial scales. We evaluate the multi-attentional U-Net model on three benchmark data sets: COVID-19 segmentation, skin cancer segmentation, and thyroid nodule segmentation. Results show that our proposed models achieve better performance with faster computation and fewer parameters. The multi-attentional U-Net can improve medical image segmentation results.
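The core idea the abstract describes, replacing stacks of local convolutions with a non-local block whose every output position attends to all input positions, can be illustrated with a short sketch. This is a minimal, pure-Python illustration of the embedded-Gaussian non-local operation (the self-attention formulation such blocks are typically based on); the function names, the toy feature map, and the choice of the identity for the value transform g are assumptions for clarity, not the authors' implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def non_local_block(features):
    """Embedded-Gaussian non-local operation on a flattened feature map.

    features: list of per-position feature vectors (spatial positions
    flattened into one axis). Each output position aggregates information
    from *all* positions, weighted by pairwise similarity -- unlike a
    convolution, whose receptive field covers only a local neighborhood.
    """
    n = len(features)
    out = []
    for i in range(n):
        # pairwise similarity f(x_i, x_j) = exp(x_i . x_j), normalized
        # across j by a softmax so the weights form a distribution
        sims = [sum(a * b for a, b in zip(features[i], features[j]))
                for j in range(n)]
        w = softmax(sims)
        # y_i = sum_j w_ij * g(x_j); g is the identity in this sketch
        y = [sum(w[j] * features[j][c] for j in range(n))
             for c in range(len(features[i]))]
        out.append(y)
    return out
```

Because the weights for each output position sum to one, every output vector is a convex combination of the input vectors, so a single block mixes information across the entire feature map in one step rather than through many stacked local operators.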