Yan Liu, Yan Yang, Yongquan Jiang, Xiaole Zhao, Zhuyang Xie
{"title":"FABRF-Net: A frequency-aware boundary and region fusion network for breast ultrasound image segmentation","authors":"Yan Liu, Yan Yang, Yongquan Jiang, Xiaole Zhao, Zhuyang Xie","doi":"10.1016/j.inffus.2025.103299","DOIUrl":null,"url":null,"abstract":"<div><div>Breast ultrasound (BUS) image segmentation is crucial for tumor analysis and cancer diagnosis. However, the challenges of lesion segmentation in BUS images arise from inter-class indistinction caused by low contrast, high speckle noise, artifacts, and blurred boundaries, as well as intra-class inconsistency due to variations in lesion size, shape, and location. To address these challenges, we propose a novel frequency-aware boundary and region fusion network (FABRF-Net). The core of our FABRF-Net is the frequency domain-based Haar wavelet decomposition module (HWDM), which effectively captures multi-scale frequency feature information from global spatial contexts. This allows our network to integrate the advantages of CNNs and Transformers for more comprehensive frequency and spatial feature modeling, effectively addressing intra-class inconsistency. Moreover, the frequency awareness based on HWDM is used to separate features into boundary and region streams, enhancing detailed edges in boundary features and reducing the impact of noise on lesion region features. We further develop a boundary-region fusion module (BRFM) to enable adaptive fusion and mutual guidance of frequency-aware region and boundary features, effectively mitigating inter-class indistinction and achieving accurate breast lesion segmentation. 
Quantitative and qualitative experimental results demonstrate that FABRF-Net achieves state-of-the-art segmentation accuracy on six cross-domain ultrasound datasets and has obvious advantages in segmenting small breast tumors.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"123 ","pages":"Article 103299"},"PeriodicalIF":14.7000,"publicationDate":"2025-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253525003720","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Breast ultrasound (BUS) image segmentation is crucial for tumor analysis and cancer diagnosis. However, the challenges of lesion segmentation in BUS images arise from inter-class indistinction caused by low contrast, high speckle noise, artifacts, and blurred boundaries, as well as intra-class inconsistency due to variations in lesion size, shape, and location. To address these challenges, we propose a novel frequency-aware boundary and region fusion network (FABRF-Net). The core of our FABRF-Net is the frequency domain-based Haar wavelet decomposition module (HWDM), which effectively captures multi-scale frequency feature information from global spatial contexts. This allows our network to integrate the advantages of CNNs and Transformers for more comprehensive frequency and spatial feature modeling, effectively addressing intra-class inconsistency. Moreover, HWDM-based frequency awareness is used to separate features into boundary and region streams, enhancing detailed edges in boundary features and reducing the impact of noise on lesion region features. We further develop a boundary-region fusion module (BRFM) to enable adaptive fusion and mutual guidance of frequency-aware region and boundary features, effectively mitigating inter-class indistinction and achieving accurate breast lesion segmentation. Quantitative and qualitative experimental results demonstrate that FABRF-Net achieves state-of-the-art segmentation accuracy on six cross-domain ultrasound datasets and offers clear advantages in segmenting small breast tumors.
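The abstract does not detail how HWDM is implemented, but the standard single-level 2D Haar wavelet decomposition it builds on can be sketched as follows. This is an illustrative example only, not the paper's actual module: it splits a feature map into a low-frequency approximation subband (LL, smooth region content) and three high-frequency detail subbands (LH, HL, HH, edge- and noise-like content), which is the kind of frequency separation the boundary and region streams rely on. The function name `haar_dwt2` and the subband sign conventions are assumptions.

```python
import numpy as np

def haar_dwt2(x: np.ndarray):
    """Single-level 2D Haar decomposition of an even-sized 2D array.

    Returns four half-resolution subbands:
      ll - low-low approximation (smooth region content),
      lh, hl - horizontal/vertical detail (edge-like content),
      hh - diagonal detail (fine texture and noise).
    """
    # Gather the four corners of every non-overlapping 2x2 block.
    a = x[0::2, 0::2]  # top-left
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    # Orthonormal Haar combines them with +/- signs and a 1/2 factor.
    ll = (a + b + c + d) / 2.0
    lh = (a - b + c - d) / 2.0
    hl = (a + b - c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

# Toy 4x4 "feature map": values increase smoothly, so the diagonal
# detail subband is zero and the low-pass subband carries the ramp.
img = np.arange(16, dtype=float).reshape(4, 4)
ll, lh, hl, hh = haar_dwt2(img)
print(ll.shape)  # (2, 2)
```

In a network, the decomposition is typically applied per channel; the LL subband would feed a region stream and the detail subbands a boundary stream, with further levels obtained by decomposing LL recursively.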
Journal description:
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.