Dual-branch dynamic hierarchical U-Net with multi-layer space fusion attention for medical image segmentation.

IF 3.9 · CAS Zone 2 (Multidisciplinary) · JCR Q1, MULTIDISCIPLINARY SCIENCES
Zhen Wang, Shuang Fu, Hongguang Zhang, Chunyang Wang, Chunhui Xia, Pen Hou, Chunxue Shun, Ge Shun
{"title":"基于多层空间融合的双分支动态分层U-Net医学图像分割。","authors":"Zhen Wang, Shuang Fu, Hongguang Zhang, Chunyang Wang, Chunhui Xia, Pen Hou, Chunxue Shun, Ge Shun","doi":"10.1038/s41598-025-92715-0","DOIUrl":null,"url":null,"abstract":"<p><p>Accurate segmentation of organs or lesions from medical images is essential for accurate disease diagnosis and organ morphometrics. Previously, most researchers mainly added feature extraction modules and simply aggregated the semantic features to U-Net network to improve the segmentation accuracy of medical images. However, these improved U-Net networks ignore the semantic differences of different organs in medical images and lack the fusion of high-level semantic features and low-level semantic features, which will lead to blurred or miss boundaries between similar organs and diseased areas. To solve this problem, we propose Dual-branch dynamic hierarchical U-Net with multi-layer space fusion attention (D2HU-Net). Firstly, we propose a multi-layer spatial attention fusion module, which makes the shallow decoding path provide predictive graph supplement to the deep decoding path. Under the guidance of higher semantic features, useful context features are selected from lower semantic features to obtain deeper useful spatial information, which makes up for the semantic differences between organs in different medical images. Secondly, we propose a dynamic multi-scale layered module that enhances the multi-scale representation of the network at a finer granularity level and selectively refines single-scale features. Finally, the network provides guiding optimization for subsequent decoding based on multi-scale loss functions. The experimental results on four medical data sets show D2HU-Net enables the most advanced segmentation capabilities on different medical image datasets, which can help doctors diagnose and treat diseases.</p>","PeriodicalId":21811,"journal":{"name":"Scientific Reports","volume":"15 1","pages":"8194"},"PeriodicalIF":3.9000,"publicationDate":"2025-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11894187/pdf/","citationCount":"0","resultStr":"{\"title\":\"Dual-branch dynamic hierarchical U-Net with multi-layer space fusion attention for medical image segmentation.\",\"authors\":\"Zhen Wang, Shuang Fu, Hongguang Zhang, Chunyang Wang, Chunhui Xia, Pen Hou, Chunxue Shun, Ge Shun\",\"doi\":\"10.1038/s41598-025-92715-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Accurate segmentation of organs or lesions from medical images is essential for accurate disease diagnosis and organ morphometrics. Previously, most researchers mainly added feature extraction modules and simply aggregated the semantic features to U-Net network to improve the segmentation accuracy of medical images. However, these improved U-Net networks ignore the semantic differences of different organs in medical images and lack the fusion of high-level semantic features and low-level semantic features, which will lead to blurred or miss boundaries between similar organs and diseased areas. To solve this problem, we propose Dual-branch dynamic hierarchical U-Net with multi-layer space fusion attention (D2HU-Net). Firstly, we propose a multi-layer spatial attention fusion module, which makes the shallow decoding path provide predictive graph supplement to the deep decoding path. 
Under the guidance of higher semantic features, useful context features are selected from lower semantic features to obtain deeper useful spatial information, which makes up for the semantic differences between organs in different medical images. Secondly, we propose a dynamic multi-scale layered module that enhances the multi-scale representation of the network at a finer granularity level and selectively refines single-scale features. Finally, the network provides guiding optimization for subsequent decoding based on multi-scale loss functions. The experimental results on four medical data sets show D2HU-Net enables the most advanced segmentation capabilities on different medical image datasets, which can help doctors diagnose and treat diseases.</p>\",\"PeriodicalId\":21811,\"journal\":{\"name\":\"Scientific Reports\",\"volume\":\"15 1\",\"pages\":\"8194\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-03-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11894187/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Scientific Reports\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1038/s41598-025-92715-0\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Scientific Reports","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1038/s41598-025-92715-0","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract


Accurate segmentation of organs or lesions from medical images is essential for reliable disease diagnosis and organ morphometry. Previous work has mainly added feature-extraction modules to the U-Net and simply aggregated semantic features to improve the segmentation accuracy of medical images. However, these improved U-Net variants ignore the semantic differences between organs in medical images and lack a fusion of high-level and low-level semantic features, which leads to blurred or missing boundaries between similar organs and diseased regions. To address this problem, we propose a dual-branch dynamic hierarchical U-Net with multi-layer space fusion attention (D2HU-Net). First, we propose a multi-layer spatial attention fusion module in which the shallow decoding path supplies predictive maps that supplement the deep decoding path. Guided by higher-level semantic features, useful context features are selected from lower-level semantic features to recover deeper spatial information, compensating for the semantic differences between organs across medical images. Second, we propose a dynamic multi-scale hierarchical module that enhances the network's multi-scale representation at a finer granularity and selectively refines single-scale features. Finally, the network uses multi-scale loss functions to guide the optimization of subsequent decoding. Experimental results on four medical datasets show that D2HU-Net achieves state-of-the-art segmentation performance across different medical image datasets, which can help doctors diagnose and treat diseases.
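
Two of the ideas in the abstract lend themselves to a concrete illustration: a spatial attention fusion step in which deeper (higher-semantic) decoder features select useful context from shallower (lower-semantic) features, and a multi-scale loss that supervises side outputs from several decoder stages. The PyTorch sketch below is a minimal illustration of those two ideas only; the names SpatialAttentionFusion and multi_scale_loss, the sigmoid-gating formulation, and all channel sizes are assumptions made for illustration, not the authors' published implementation.

# Minimal PyTorch sketch of (1) attention-guided fusion of a low-level feature map
# with an upsampled high-level feature map, and (2) a deep-supervision-style
# multi-scale loss. This is an assumed, simplified reading of the abstract, not the
# D2HU-Net reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialAttentionFusion(nn.Module):
    """Gate a low-level feature map with a spatial mask derived from a deeper
    feature map, then fuse the two by concatenation (hypothetical module)."""

    def __init__(self, low_ch: int, high_ch: int, out_ch: int):
        super().__init__()
        self.project_high = nn.Conv2d(high_ch, low_ch, kernel_size=1)
        self.attention = nn.Sequential(
            nn.Conv2d(low_ch, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Sequential(
            nn.Conv2d(low_ch * 2, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # Upsample the deep feature map to the spatial size of the shallow one.
        high_up = F.interpolate(high, size=low.shape[2:], mode="bilinear",
                                align_corners=False)
        high_up = self.project_high(high_up)
        # A spatial attention mask computed from the high-level semantics
        # selects the useful context in the low-level features.
        mask = self.attention(high_up)
        low_selected = low * mask
        return self.fuse(torch.cat([low_selected, high_up], dim=1))


def multi_scale_loss(side_outputs, target, weights=None):
    """Sum weighted cross-entropy terms over side outputs from several decoder
    stages, resizing the label map to each output's resolution."""
    if weights is None:
        weights = [1.0] * len(side_outputs)
    total = 0.0
    for w, logits in zip(weights, side_outputs):
        t = F.interpolate(target.unsqueeze(1).float(), size=logits.shape[2:],
                          mode="nearest").squeeze(1).long()
        total = total + w * F.cross_entropy(logits, t)
    return total

For example, with side outputs at 1/4, 1/2 and full resolution, weights such as [0.5, 0.75, 1.0] would emphasize the finest scale; the abstract does not specify the exact weighting or gating used in D2HU-Net, so this is only one plausible choice.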

Source journal: Scientific Reports
CiteScore: 7.50
Self-citation rate: 4.30%
Articles published: 19,567
Review time: 3.9 months
Journal description: We publish original research from all areas of the natural sciences, psychology, medicine and engineering. You can learn more about what we publish by browsing our specific scientific subject areas below or explore Scientific Reports by browsing all articles and collections. Scientific Reports has a 2-year impact factor: 4.380 (2021), and is the 6th most-cited journal in the world, with more than 540,000 citations in 2020 (Clarivate Analytics, 2021).
•Engineering: covers all aspects of engineering, technology, and applied science. It plays a crucial role in the development of technologies to address some of the world's biggest challenges, helping to save lives and improve the way we live.
•Physical sciences: those academic disciplines that aim to uncover the underlying laws of nature, often written in the language of mathematics. It is a collective term for areas of study including astronomy, chemistry, materials science and physics.
•Earth and environmental sciences: cover all aspects of Earth and planetary science and broadly encompass solid Earth processes, surface and atmospheric dynamics, Earth system history, climate and climate change, marine and freshwater systems, and ecology. It also considers the interactions between humans and these systems.
•Biological sciences: encompass all the divisions of natural sciences examining various aspects of vital processes. The concept includes anatomy, physiology, cell biology, biochemistry and biophysics, and covers all organisms from microorganisms and animals to plants.
•Health sciences: study health, disease and healthcare. This field of study aims to develop knowledge, interventions and technology for use in healthcare to improve the treatment of patients.