MDEANet: A multi-scale deep enhanced attention net for popliteal fossa segmentation in ultrasound images

Impact Factor: 5.4 · CAS Tier 2 (Medicine) · JCR Q1 (Engineering, Biomedical)
Fangfang Chen , Wei Fang , Qinghua Wu , Miao Zhou , Wenhui Guo , Liangqing Lin , Zhanheng Chen , Zui Zou
{"title":"MDEANet: A multi-scale deep enhanced attention net for popliteal fossa segmentation in ultrasound images","authors":"Fangfang Chen ,&nbsp;Wei Fang ,&nbsp;Qinghua Wu ,&nbsp;Miao Zhou ,&nbsp;Wenhui Guo ,&nbsp;Liangqing Lin ,&nbsp;Zhanheng Chen ,&nbsp;Zui Zou","doi":"10.1016/j.compmedimag.2025.102570","DOIUrl":null,"url":null,"abstract":"<div><div>Popliteal sciatic nerve block is a widely used technique for lower limb anesthesia. However, despite ultrasound guidance, the complex anatomical structures of the popliteal fossa can present challenges, potentially leading to complications. To accurately identify the bifurcation of the sciatic nerve for nerve blockade, we propose MDEANet, a deep learning-based segmentation network designed for the precise localization of nerves, muscles, and arteries in ultrasound images of the popliteal region. MDEANet incorporates Cascaded Multi-scale Atrous Convolutions (CMAC) to enhance multi-scale feature extraction, Enhanced Spatial Attention Mechanism (ESAM) to focus on key anatomical regions, and Cross-level Feature Fusion (CLFF) to improve contextual representation. This integration markedly improves segmentation of nerves, muscles, and arteries. Experimental results demonstrate that MDEANet achieves an average Intersection over Union (IoU) of 88.60% and a Dice coefficient of 93.95% across all target structures, outperforming state-of-the-art models by 1.68% in IoU and 1.66% in Dice coefficient. Specifically, for nerve segmentation, the Dice coefficient reaches 93.31%, underscoring the effectiveness of our approach. MDEANet has the potential to provide decision-support assistance for anesthesiologists, thereby enhancing the accuracy and efficiency of ultrasound-guided nerve blockade procedures.</div></div>","PeriodicalId":50631,"journal":{"name":"Computerized Medical Imaging and Graphics","volume":"124 ","pages":"Article 102570"},"PeriodicalIF":5.4000,"publicationDate":"2025-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computerized Medical Imaging and Graphics","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0895611125000795","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Popliteal sciatic nerve block is a widely used technique for lower limb anesthesia. However, despite ultrasound guidance, the complex anatomical structures of the popliteal fossa can present challenges, potentially leading to complications. To accurately identify the bifurcation of the sciatic nerve for nerve blockade, we propose MDEANet, a deep learning-based segmentation network designed for the precise localization of nerves, muscles, and arteries in ultrasound images of the popliteal region. MDEANet incorporates Cascaded Multi-scale Atrous Convolutions (CMAC) to enhance multi-scale feature extraction, Enhanced Spatial Attention Mechanism (ESAM) to focus on key anatomical regions, and Cross-level Feature Fusion (CLFF) to improve contextual representation. This integration markedly improves segmentation of nerves, muscles, and arteries. Experimental results demonstrate that MDEANet achieves an average Intersection over Union (IoU) of 88.60% and a Dice coefficient of 93.95% across all target structures, outperforming state-of-the-art models by 1.68% in IoU and 1.66% in Dice coefficient. Specifically, for nerve segmentation, the Dice coefficient reaches 93.31%, underscoring the effectiveness of our approach. MDEANet has the potential to provide decision-support assistance for anesthesiologists, thereby enhancing the accuracy and efficiency of ultrasound-guided nerve blockade procedures.
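The abstract names the CMAC, ESAM, and CLFF modules and reports IoU and Dice scores, but no implementation details are given here. The snippet below is only an illustrative sketch of the general ideas involved (a cascade of atrous convolutions at increasing dilation rates, a simple spatial attention gate, and the Dice/IoU metrics used for evaluation); the class and function names (CascadedAtrousBlock, SpatialAttentionGate, dice_and_iou) are hypothetical and should not be read as the authors' MDEANet implementation.

```python
import torch
import torch.nn as nn


class CascadedAtrousBlock(nn.Module):
    """Cascade of 3x3 atrous (dilated) convolutions with growing dilation
    rates; each stage refines the previous stage's output, and all stage
    outputs are concatenated and fused by a 1x1 convolution."""

    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        self.fuse = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outs, feat = [], x
        for stage in self.stages:
            feat = stage(feat)   # cascaded: each stage consumes the previous output
            outs.append(feat)
        return self.fuse(torch.cat(outs, dim=1))


class SpatialAttentionGate(nn.Module):
    """Per-pixel attention gate built from channel-wise mean and max maps,
    applied multiplicatively to emphasise salient spatial regions."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.amax(dim=1, keepdim=True)
        gate = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * gate


def dice_and_iou(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6):
    """Dice = 2|A∩B| / (|A| + |B|);  IoU = |A∩B| / |A∪B|, for binary masks."""
    pred, target = pred.float().flatten(), target.float().flatten()
    inter = (pred * target).sum()
    dice = (2 * inter + eps) / (pred.sum() + target.sum() + eps)
    iou = (inter + eps) / (pred.sum() + target.sum() - inter + eps)
    return dice.item(), iou.item()


if __name__ == "__main__":
    x = torch.randn(1, 32, 64, 64)                         # dummy feature map
    feats = SpatialAttentionGate()(CascadedAtrousBlock(32)(x))
    print(feats.shape)                                     # torch.Size([1, 32, 64, 64])
```

Note that chaining the dilated convolutions (rather than running them as parallel branches, ASPP-style) grows the receptive field additively across stages while still exposing every intermediate scale through the concatenation, which is one common way to realise multi-scale context aggregation in ultrasound segmentation networks.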
Source journal: Computerized Medical Imaging and Graphics
CiteScore: 10.70
Self-citation rate: 3.50%
Articles published: 71
Review time: 26 days
Journal description: The purpose of the journal Computerized Medical Imaging and Graphics is to act as a source for the exchange of research results concerning algorithmic advances, development, and application of digital imaging in disease detection, diagnosis, intervention, prevention, precision medicine, and population health. Included in the journal will be articles on novel computerized imaging or visualization techniques, including artificial intelligence and machine learning, augmented reality for surgical planning and guidance, big biomedical data visualization, computer-aided diagnosis, computerized-robotic surgery, image-guided therapy, imaging scanning and reconstruction, mobile and tele-imaging, radiomics, and imaging integration and modeling with other information relevant to digital health. The types of biomedical imaging include: magnetic resonance, computed tomography, ultrasound, nuclear medicine, X-ray, microwave, optical and multi-photon microscopy, video and sensory imaging, and the convergence of biomedical images with other non-imaging datasets.