Multiparametric Ultrasound Breast Tumors Diagnosis Within BI-RADS Category 4 via Feature Disentanglement and Cross-Fusion

Authors: Zhikai Ruan; Canxu Song; Pengfei Xu; Chaoyu Wang; Jing Zhao; Meng Chen; Suoni Li; Qiang Su; Xiaozhen Zhuo; Yue Wu; Mingxi Wan; Diya Wang
Journal: IEEE Transactions on Medical Imaging, vol. 44, no. 7, pp. 3064-3075
Publication date: 2025-04-08
DOI: 10.1109/TMI.2025.3558786
URL: https://ieeexplore.ieee.org/document/10955401/
Code: https://github.com/rzk-code/MUBTD
Citations: 0
Abstract
BI-RADS category 4 is the diagnostic threshold between benign and malignant breast tumors and is critical in determining clinical breast cancer treatment options. However, breast tumors within BI-RADS category 4 tend to show subtle or contradictory differences between benign and malignant lesions on B-mode images, leading to uncertainty in clinical diagnosis. Recently, many deep learning studies have demonstrated the value of multimodal and multiparametric ultrasound in the diagnosis of breast tumors. However, owing to data heterogeneity, how to effectively represent and fuse the common and specific features drawn from multiple information sources remains an open question, one that is often overlooked by existing computer-aided diagnosis methods. To address these problems, we propose a novel framework that integrates multiparametric ultrasound information (B-mode images, Nakagami parametric images, and semantic attributes) to assist the diagnosis of BI-RADS 4 breast tumors. The framework extracts and disentangles common and specific features from B-mode and Nakagami parametric images using a dual-branch Transformer-CNN encoder. Meanwhile, we propose a novel feature disentanglement loss to further ensure the complementarity and consistency of multiparametric features. In addition, we construct a multiparameter cross-fusion module to integrate the high-level features extracted from the multiparametric images and semantic attributes. Extensive experiments on a multicenter multiparametric dataset demonstrate the superiority of the proposed framework over state-of-the-art methods in the diagnosis of BI-RADS 4 breast tumors. The code is available at https://github.com/rzk-code/MUBTD
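The abstract does not spell out the form of the feature disentanglement loss, but losses of this kind typically combine a consistency term (common features from the two image types should agree) with an orthogonality term (each branch's specific features should carry information complementary to the common ones). The sketch below is an illustrative, hypothetical formulation under those assumptions, not the authors' exact loss; the function names and the cosine-based terms are our own choices.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def disentanglement_loss(common_b, common_nak, spec_b, spec_nak):
    """Hypothetical disentanglement loss for two-branch features.

    common_b / common_nak: common features from the B-mode and Nakagami branches.
    spec_b / spec_nak: branch-specific features.
    """
    # Consistency: pull the two branches' common features together.
    consistency = 1.0 - cosine(common_b, common_nak)
    # Complementarity: push each branch's specific features to be
    # orthogonal to its common features (squared cosine -> 0 when orthogonal).
    orthogonality = cosine(common_b, spec_b) ** 2 + cosine(common_nak, spec_nak) ** 2
    return consistency + orthogonality
```

When the common features of the two branches align and the specific features are orthogonal to them, the loss approaches zero; any overlap between common and specific subspaces, or disagreement between branches, increases it.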