{"title":"增强乳腺癌筛查:揭示可解释的交叉视图贡献双视图乳房x线摄影与稀疏二部图注意网络","authors":"Guillaume Pelluet , Mira Rizkallah , Mickael Tardy , Diana Mateus","doi":"10.1016/j.compmedimag.2025.102620","DOIUrl":null,"url":null,"abstract":"<div><div>Medical imaging techniques like mammography enable early breast cancer detection and are part of regular screening programs. Typically, a mammogram exam involves two views of each breast, providing complementary information, but physicians rate the breast as a whole. Computer-Aided Diagnostic tools focus on detecting lesions in a single view, which is challenging due to high image resolution and varying scales of abnormalities. The projective nature of the two views and different acquisition protocols add complexity to dual-view analysis. To address these challenges, we propose a Graph Neural Network approach that models image information at multiple scales and the complementarity of the two views. To this end, we rely on a superpixel decomposition, assigning hierarchical features to superpixels, designing a dual-view graph to share information, and introducing a modified Sparse Graph Attention Layer to keep relevant dual-view relations. This improves interpretability of decisions and avoids the need to register pairs of views under strong deformations. Our model is trained with a fully supervised approach and evaluated on public and private datasets. Experiments demonstrate state-of-the-art classification and detection performance on Full Field Digital Mammographies, achieving a breast-wise AUC of 0.96 for the INbreast dataset, a sensitivity of 0.97 with few false positives per image (0.33), and a case-wise AUC of 0.92 for the VinDr dataset. This study presents a Sparse Graph Attention method for dual-view mammography analysis, generating meaningful explanations that radiologists can interpret. Extensive evaluation shows the relevance of our approach in breast cancer detection and classification.</div></div>","PeriodicalId":50631,"journal":{"name":"Computerized Medical Imaging and Graphics","volume":"125 ","pages":"Article 102620"},"PeriodicalIF":4.9000,"publicationDate":"2025-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Enhancing breast cancer screening: Unveiling explainable cross-view contributions in dual-view mammography with Sparse Bipartite Graphs Attention Networks\",\"authors\":\"Guillaume Pelluet , Mira Rizkallah , Mickael Tardy , Diana Mateus\",\"doi\":\"10.1016/j.compmedimag.2025.102620\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Medical imaging techniques like mammography enable early breast cancer detection and are part of regular screening programs. Typically, a mammogram exam involves two views of each breast, providing complementary information, but physicians rate the breast as a whole. Computer-Aided Diagnostic tools focus on detecting lesions in a single view, which is challenging due to high image resolution and varying scales of abnormalities. The projective nature of the two views and different acquisition protocols add complexity to dual-view analysis. To address these challenges, we propose a Graph Neural Network approach that models image information at multiple scales and the complementarity of the two views. 
To this end, we rely on a superpixel decomposition, assigning hierarchical features to superpixels, designing a dual-view graph to share information, and introducing a modified Sparse Graph Attention Layer to keep relevant dual-view relations. This improves interpretability of decisions and avoids the need to register pairs of views under strong deformations. Our model is trained with a fully supervised approach and evaluated on public and private datasets. Experiments demonstrate state-of-the-art classification and detection performance on Full Field Digital Mammographies, achieving a breast-wise AUC of 0.96 for the INbreast dataset, a sensitivity of 0.97 with few false positives per image (0.33), and a case-wise AUC of 0.92 for the VinDr dataset. This study presents a Sparse Graph Attention method for dual-view mammography analysis, generating meaningful explanations that radiologists can interpret. Extensive evaluation shows the relevance of our approach in breast cancer detection and classification.</div></div>\",\"PeriodicalId\":50631,\"journal\":{\"name\":\"Computerized Medical Imaging and Graphics\",\"volume\":\"125 \",\"pages\":\"Article 102620\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2025-08-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computerized Medical Imaging and Graphics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0895611125001296\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computerized Medical Imaging and Graphics","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0895611125001296","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Enhancing breast cancer screening: Unveiling explainable cross-view contributions in dual-view mammography with Sparse Bipartite Graphs Attention Networks
Medical imaging techniques like mammography enable early breast cancer detection and are part of regular screening programs. Typically, a mammography exam involves two views of each breast, providing complementary information, but physicians rate the breast as a whole. Computer-Aided Diagnosis tools focus on detecting lesions in a single view, which is challenging due to the high image resolution and the varying scales of abnormalities. The projective nature of the two views and different acquisition protocols add complexity to dual-view analysis. To address these challenges, we propose a Graph Neural Network approach that models image information at multiple scales and the complementarity of the two views. To this end, we rely on a superpixel decomposition, assigning hierarchical features to superpixels, designing a dual-view graph to share information, and introducing a modified Sparse Graph Attention Layer to keep relevant dual-view relations. This improves the interpretability of decisions and avoids the need to register pairs of views under strong deformations. Our model is trained with a fully supervised approach and evaluated on public and private datasets. Experiments demonstrate state-of-the-art classification and detection performance on Full-Field Digital Mammograms, achieving a breast-wise AUC of 0.96 on the INbreast dataset, a sensitivity of 0.97 with few false positives per image (0.33), and a case-wise AUC of 0.92 on the VinDr dataset. This study presents a Sparse Graph Attention method for dual-view mammography analysis, generating meaningful explanations that radiologists can interpret. Extensive evaluation shows the relevance of our approach to breast cancer detection and classification.
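As a rough illustration of the kind of cross-view mechanism the abstract describes, the sketch below shows a minimal sparse bipartite graph attention step between superpixel features of the two standard views (CC and MLO): each node of one view attends only to its top-k strongest counterparts in the other view. This is not the authors' implementation; the class name, feature dimensions, and the top-k sparsification rule are assumptions made only for illustration.

```python
# Minimal sketch (assumed, not the paper's code) of sparse bipartite graph
# attention between superpixel nodes of the MLO and CC mammography views.
import torch
import torch.nn.functional as F


class SparseBipartiteGraphAttention(torch.nn.Module):
    """Attend from MLO superpixel features to CC superpixel features,
    keeping only the top-k cross-view relations per node (sparsification)."""

    def __init__(self, in_dim: int, out_dim: int, top_k: int = 4):
        super().__init__()
        self.q = torch.nn.Linear(in_dim, out_dim, bias=False)
        self.k = torch.nn.Linear(in_dim, out_dim, bias=False)
        self.v = torch.nn.Linear(in_dim, out_dim, bias=False)
        self.top_k = top_k

    def forward(self, x_mlo: torch.Tensor, x_cc: torch.Tensor) -> torch.Tensor:
        # x_mlo: (N_mlo, in_dim) superpixel features from the MLO view
        # x_cc:  (N_cc,  in_dim) superpixel features from the CC view
        q, k, v = self.q(x_mlo), self.k(x_cc), self.v(x_cc)
        scores = q @ k.t() / k.shape[-1] ** 0.5           # (N_mlo, N_cc)

        # Sparsify: keep the k strongest cross-view edges per MLO node and
        # mask the rest before the softmax so they receive zero attention.
        k_eff = min(self.top_k, scores.shape[1])
        topk_idx = scores.topk(k_eff, dim=1).indices
        mask = torch.full_like(scores, float("-inf"))
        mask.scatter_(1, topk_idx, 0.0)
        attn = F.softmax(scores + mask, dim=1)            # sparse attention rows

        # Aggregate CC information into each MLO superpixel node.
        return attn @ v


# Toy usage: 50 MLO superpixels and 60 CC superpixels with 128-d features.
layer = SparseBipartiteGraphAttention(in_dim=128, out_dim=64, top_k=4)
out = layer(torch.randn(50, 128), torch.randn(60, 128))
print(out.shape)  # torch.Size([50, 64])
```

The surviving attention weights can be read as the "cross-view contributions" mentioned in the title: for each region of one view, they point to the few regions of the other view that influenced its representation, which is what makes such a layer amenable to radiologist-facing explanations.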
Journal introduction:
The purpose of the journal Computerized Medical Imaging and Graphics is to act as a source for the exchange of research results concerning algorithmic advances, development, and application of digital imaging in disease detection, diagnosis, intervention, prevention, precision medicine, and population health. Included in the journal will be articles on novel computerized imaging or visualization techniques, including artificial intelligence and machine learning, augmented reality for surgical planning and guidance, big biomedical data visualization, computer-aided diagnosis, computerized-robotic surgery, image-guided therapy, imaging scanning and reconstruction, mobile and tele-imaging, radiomics, and imaging integration and modeling with other information relevant to digital health. The types of biomedical imaging include: magnetic resonance, computed tomography, ultrasound, nuclear medicine, X-ray, microwave, optical and multi-photon microscopy, video and sensory imaging, and the convergence of biomedical images with other non-imaging datasets.