FABRF-Net: A frequency-aware boundary and region fusion network for breast ultrasound image segmentation

IF 14.7 · CAS Zone 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yan Liu, Yan Yang, Yongquan Jiang, Xiaole Zhao, Zhuyang Xie
{"title":"FABRF-Net: A frequency-aware boundary and region fusion network for breast ultrasound image segmentation","authors":"Yan Liu,&nbsp;Yan Yang,&nbsp;Yongquan Jiang,&nbsp;Xiaole Zhao,&nbsp;Zhuyang Xie","doi":"10.1016/j.inffus.2025.103299","DOIUrl":null,"url":null,"abstract":"<div><div>Breast ultrasound (BUS) image segmentation is crucial for tumor analysis and cancer diagnosis. However, the challenges of lesion segmentation in BUS images arise from inter-class indistinction caused by low contrast, high speckle noise, artifacts, and blurred boundaries, as well as intra-class inconsistency due to variations in lesion size, shape, and location. To address these challenges, we propose a novel frequency-aware boundary and region fusion network (FABRF-Net). The core of our FABRF-Net is the frequency domain-based Haar wavelet decomposition module (HWDM), which effectively captures multi-scale frequency feature information from global spatial contexts. This allows our network to integrate the advantages of CNNs and Transformers for more comprehensive frequency and spatial feature modeling, effectively addressing intra-class inconsistency. Moreover, the frequency awareness based on HWDM is used to separate features into boundary and region streams, enhancing detailed edges in boundary features and reducing the impact of noise on lesion region features. We further develop a boundary-region fusion module (BRFM) to enable adaptive fusion and mutual guidance of frequency-aware region and boundary features, effectively mitigating inter-class indistinction and achieving accurate breast lesion segmentation. Quantitative and qualitative experimental results demonstrate that FABRF-Net achieves state-of-the-art segmentation accuracy on six cross-domain ultrasound datasets and has obvious advantages in segmenting small breast tumors.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"123 ","pages":"Article 103299"},"PeriodicalIF":14.7000,"publicationDate":"2025-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253525003720","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Breast ultrasound (BUS) image segmentation is crucial for tumor analysis and cancer diagnosis. However, lesion segmentation in BUS images is challenged by inter-class indistinction caused by low contrast, high speckle noise, artifacts, and blurred boundaries, as well as intra-class inconsistency due to variations in lesion size, shape, and location. To address these challenges, we propose a novel frequency-aware boundary and region fusion network (FABRF-Net). The core of FABRF-Net is the frequency-domain Haar wavelet decomposition module (HWDM), which effectively captures multi-scale frequency feature information from global spatial contexts. This allows the network to integrate the advantages of CNNs and Transformers for more comprehensive frequency and spatial feature modeling, effectively addressing intra-class inconsistency. Moreover, HWDM-based frequency awareness is used to separate features into boundary and region streams, enhancing detailed edges in boundary features and reducing the impact of noise on lesion region features. We further develop a boundary-region fusion module (BRFM) to enable adaptive fusion and mutual guidance of frequency-aware region and boundary features, effectively mitigating inter-class indistinction and achieving accurate breast lesion segmentation. Quantitative and qualitative experimental results demonstrate that FABRF-Net achieves state-of-the-art segmentation accuracy on six cross-domain ultrasound datasets and shows clear advantages in segmenting small breast tumors.
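The abstract gives no implementation details, so the following is only an illustrative sketch of the kind of operation the HWDM builds on: a single-level 2D Haar wavelet decomposition that splits a feature map into one low-frequency and three high-frequency sub-bands. The function name haar_dwt2d, the PyTorch framework, and the orthonormal scaling are assumptions for illustration, not the authors' code.

```python
import torch

def haar_dwt2d(x: torch.Tensor):
    """Single-level 2D Haar wavelet decomposition (illustrative sketch).

    x: feature map of shape (B, C, H, W) with even H and W.
    Returns the low-frequency sub-band LL and the three high-frequency
    sub-bands (LH, HL, HH), each of shape (B, C, H/2, W/2).
    """
    # Polyphase split: the four samples of every non-overlapping 2x2 block.
    a = x[..., 0::2, 0::2]  # top-left
    b = x[..., 0::2, 1::2]  # top-right
    c = x[..., 1::2, 0::2]  # bottom-left
    d = x[..., 1::2, 1::2]  # bottom-right

    # Orthonormal Haar combinations (each 1D filter carries a 1/sqrt(2)
    # factor, so the separable 2D transform scales by 1/2).
    ll = (a + b + c + d) / 2  # approximation: coarse region content
    lh = (a - b + c - d) / 2  # detail along one spatial direction
    hl = (a + b - c - d) / 2  # detail along the other direction
    hh = (a - b - c + d) / 2  # diagonal detail
    return ll, lh, hl, hh

if __name__ == "__main__":
    feats = torch.randn(1, 64, 128, 128)           # hypothetical encoder feature map
    ll, lh, hl, hh = haar_dwt2d(feats)
    print(ll.shape, lh.shape, hl.shape, hh.shape)  # each (1, 64, 64, 64)
```

In a frequency-aware design of the kind described, the high-frequency sub-bands would plausibly feed a boundary stream (edge detail) while the low-frequency sub-band feeds a region stream, before a fusion module such as BRFM recombines them; the actual module definitions are given in the paper.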
Source journal: Information Fusion (Engineering & Technology – Computer Science: Theory & Methods)
CiteScore: 33.20
Self-citation rate: 4.30%
Annual article count: 161
Average review time: 7.9 months
About the journal: Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses as well as those demonstrating their application to real-world problems are welcome.