MAPFUNet: Multi-attention Perception-Fusion U-Net for Liver Tumor Segmentation

IF 4.9 · CAS Computer Science Tier 3 · Q1 (Engineering, Multidisciplinary)
Junding Sun, Biao Wang, Xiaosheng Wu, Chaosheng Tang, Shuihua Wang, Yudong Zhang
{"title":"MAPFUNet:用于肝脏肿瘤分离的多注意感知-融合 U 网","authors":"Junding Sun,&nbsp;Biao Wang,&nbsp;Xiaosheng Wu,&nbsp;Chaosheng Tang,&nbsp;Shuihua Wang,&nbsp;Yudong Zhang","doi":"10.1007/s42235-024-00562-y","DOIUrl":null,"url":null,"abstract":"<div><p>The second-leading cause of cancer-related deaths globally is liver cancer. The treatment of liver cancers depends heavily on the accurate segmentation of liver tumors from CT scans. The improved method based on U-Net has achieved good performance for liver tumor segmentation, but these methods can still be improved. To deal with the problems of poor performance from the original U-Net framework in the segmentation of small-sized liver tumors and the position information of tumors that is seriously lost in the down-sampling process, we propose the Multi-attention Perception-fusion U-Net (MAPFUNet). We propose the Position ResBlock (PResBlock) in the encoder stage to promote the feature extraction capability of MAPFUNet while retaining the position information regarding liver tumors. A Dual-branch Attention Module (DWAM) is proposed in the skip connections, which narrows the semantic gap between the encoder's and decoder's features and enables the network to utilize the encoder's multi-stage and multi-scale features. We propose the Channel-wise ASPP with Attention (CAA) module at the bottleneck, which can be combined with multi-scale features and contributes to the recovery of micro-tumor feature information. Finally, we evaluated MAPFUNet on the LITS2017 dataset and the 3DIRCADB-01 dataset, with Dice values of 85.81 and 83.84% for liver tumor segmentation, which were 2.89 and 7.89% higher than the baseline model, respectively. The experiment results show that MAPFUNet is superior to other networks with better tumor feature representation and higher accuracy of liver tumor segmentation. We also extended MAPFUNet to brain tumor segmentation on the BraTS2019 dataset. The results indicate that MAPFUNet performs well on the brain tumor segmentation task, and its Dice values on the three tumor regions are 83.27% (WT), 84.77% (TC), and 76.98% (ET), respectively.</p></div>","PeriodicalId":614,"journal":{"name":"Journal of Bionic Engineering","volume":"21 5","pages":"2515 - 2539"},"PeriodicalIF":4.9000,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MAPFUNet: Multi-attention Perception-Fusion U-Net for Liver Tumor Segmentation\",\"authors\":\"Junding Sun,&nbsp;Biao Wang,&nbsp;Xiaosheng Wu,&nbsp;Chaosheng Tang,&nbsp;Shuihua Wang,&nbsp;Yudong Zhang\",\"doi\":\"10.1007/s42235-024-00562-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>The second-leading cause of cancer-related deaths globally is liver cancer. The treatment of liver cancers depends heavily on the accurate segmentation of liver tumors from CT scans. The improved method based on U-Net has achieved good performance for liver tumor segmentation, but these methods can still be improved. To deal with the problems of poor performance from the original U-Net framework in the segmentation of small-sized liver tumors and the position information of tumors that is seriously lost in the down-sampling process, we propose the Multi-attention Perception-fusion U-Net (MAPFUNet). We propose the Position ResBlock (PResBlock) in the encoder stage to promote the feature extraction capability of MAPFUNet while retaining the position information regarding liver tumors. 
A Dual-branch Attention Module (DWAM) is proposed in the skip connections, which narrows the semantic gap between the encoder's and decoder's features and enables the network to utilize the encoder's multi-stage and multi-scale features. We propose the Channel-wise ASPP with Attention (CAA) module at the bottleneck, which can be combined with multi-scale features and contributes to the recovery of micro-tumor feature information. Finally, we evaluated MAPFUNet on the LITS2017 dataset and the 3DIRCADB-01 dataset, with Dice values of 85.81 and 83.84% for liver tumor segmentation, which were 2.89 and 7.89% higher than the baseline model, respectively. The experiment results show that MAPFUNet is superior to other networks with better tumor feature representation and higher accuracy of liver tumor segmentation. We also extended MAPFUNet to brain tumor segmentation on the BraTS2019 dataset. The results indicate that MAPFUNet performs well on the brain tumor segmentation task, and its Dice values on the three tumor regions are 83.27% (WT), 84.77% (TC), and 76.98% (ET), respectively.</p></div>\",\"PeriodicalId\":614,\"journal\":{\"name\":\"Journal of Bionic Engineering\",\"volume\":\"21 5\",\"pages\":\"2515 - 2539\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2024-06-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Bionic Engineering\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s42235-024-00562-y\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Bionic Engineering","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s42235-024-00562-y","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
引用次数: 0

Abstract



Liver cancer is the second-leading cause of cancer-related deaths globally, and its treatment depends heavily on the accurate segmentation of liver tumors from CT scans. Improved methods based on U-Net have achieved good performance for liver tumor segmentation, but there is still room for improvement. To address the original U-Net framework's poor performance on small liver tumors and the severe loss of tumor position information during down-sampling, we propose the Multi-attention Perception-Fusion U-Net (MAPFUNet). We propose the Position ResBlock (PResBlock) in the encoder stage to enhance the feature extraction capability of MAPFUNet while retaining the position information of liver tumors. A Dual-branch Attention Module (DWAM) is proposed in the skip connections, which narrows the semantic gap between the encoder's and decoder's features and enables the network to utilize the encoder's multi-stage and multi-scale features. We propose the Channel-wise ASPP with Attention (CAA) module at the bottleneck, which combines multi-scale features and contributes to the recovery of micro-tumor feature information. Finally, we evaluated MAPFUNet on the LITS2017 and 3DIRCADB-01 datasets, obtaining Dice values of 85.81% and 83.84% for liver tumor segmentation, which were 2.89% and 7.89% higher than the baseline model, respectively. The experimental results show that MAPFUNet is superior to other networks, with better tumor feature representation and higher liver tumor segmentation accuracy. We also extended MAPFUNet to brain tumor segmentation on the BraTS2019 dataset. The results indicate that MAPFUNet performs well on this task, with Dice values on the three tumor regions of 83.27% (WT), 84.77% (TC), and 76.98% (ET), respectively.
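The abstract names the bottleneck module (Channel-wise ASPP with Attention, CAA) but does not spell out its internals. As a rough, hypothetical illustration of the general idea described above — parallel atrous (ASPP-style) convolutions at several dilation rates, fused and then reweighted channel-wise by an attention branch — here is a minimal PyTorch sketch. The dilation rates, channel sizes, and the squeeze-and-excitation-style `ChannelAttention` helper are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an "ASPP + channel attention" bottleneck block.
# Dilation rates, channel sizes, and the SE-style attention are
# illustrative assumptions, not the paper's actual CAA module.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel reweighting (assumed design)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),          # global average pool -> (B, C, 1, 1)
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)                 # per-channel gating

class ASPPWithAttention(nn.Module):
    """Parallel atrous convolutions, fused and reweighted channel-wise."""
    def __init__(self, in_ch: int, out_ch: int, rates=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)
        self.attn = ChannelAttention(out_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi_scale = torch.cat([b(x) for b in self.branches], dim=1)
        return self.attn(self.fuse(multi_scale))

# Example: a bottleneck feature map from a U-Net-style encoder.
feats = torch.randn(2, 256, 32, 32)
out = ASPPWithAttention(256, 256)(feats)
print(out.shape)  # torch.Size([2, 256, 32, 32])
```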

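The Dice values quoted above are the standard Dice similarity coefficient, Dice = 2|A ∩ B| / (|A| + |B|), computed between predicted and ground-truth tumor masks. A minimal sketch of how such a score is typically computed from binary masks (not the authors' evaluation code):

```python
# Minimal Dice similarity coefficient for binary segmentation masks.
# Standard metric definition only; not the authors' evaluation script.
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2 * |pred ∩ target| / (|pred| + |target|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))

# Toy example with 4x4 masks.
pred = np.array([[0, 1, 1, 0]] * 4)
target = np.array([[0, 1, 0, 0]] * 4)
print(round(dice_score(pred, target), 3))  # 0.667
```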
Source journal
Journal of Bionic Engineering (Engineering & Technology – Materials Science: Biomaterials)
CiteScore: 7.10 · Self-citation rate: 10.00% · Articles per year: 162 · Average review time: 10.0 months
Journal description: The Journal of Bionic Engineering (JBE) is a peer-reviewed journal that publishes original research papers and reviews applying knowledge learned from nature and biological systems to solve concrete engineering problems. The topics that JBE covers include, but are not limited to: mechanisms, kinematics, mechanics and control of animal locomotion, and the development of mobile robots with walking (running and crawling), swimming, or flying abilities inspired by animal locomotion; structures, morphologies, composition and physical properties of natural materials and biomaterials, and the fabrication of new materials mimicking their properties and functions; biomedical materials, artificial organs and tissue engineering for medical applications, as well as rehabilitation equipment and devices; and the development of bioinspired computation methods and artificial intelligence for engineering applications.