{"title":"Attention-based Interactions Network for Breast Tumor Classification with Multi-modality Images","authors":"Xiao Yang, Xiaoming Xi, Chuanzhen Xu, Liangyun Sun, Lingzhao Meng, Xiushan Nie","doi":"10.1109/HSI55341.2022.9869477","DOIUrl":null,"url":null,"abstract":"Benefiting from the development of medical imaging, the automatic breast image classification has been extensively studied in a variety of breast cancer diagnosis tasks recently. The multi-modality image fusion was helpful to further improve classification performance. However, existing multi-modality fusion methods focused on the fusion of modalities, ignoring the interactions between modalities, which caused the inefficient performance. To address the above issues, we proposed a novel attention-based interactions network for breast tumor classification by using diffusion-weighted imaging (DWI) and apparent dispersion coefficient (ADC) images. Specifically, we proposed a multi-modality interaction mechanism, including relational interaction, channel interaction, and discriminative interaction, to design an attention-based interaction module, which enhanced the abilities of inter-modal interactions. Extensive ablation studies have been carried out, which provably affirmed the advantages of each component. The area under the receiver operating characteristic curve (AUC), accuracy (ACC), specificity (SPC), and sensitivity (SEN) were 87.0%, 87.0%, 88.0%, and 86.0%, respectively, also verifying its effectiveness.","PeriodicalId":282607,"journal":{"name":"2022 15th International Conference on Human System Interaction (HSI)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 15th International Conference on Human System Interaction (HSI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HSI55341.2022.9869477","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Benefiting from advances in medical imaging, automatic breast image classification has recently been studied extensively across a variety of breast cancer diagnosis tasks. Multi-modality image fusion can further improve classification performance. However, existing multi-modality fusion methods focus on merging the modalities while ignoring the interactions between them, which limits performance. To address this issue, we propose a novel attention-based interactions network for breast tumor classification using diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) images. Specifically, we propose a multi-modality interaction mechanism, comprising relational interaction, channel interaction, and discriminative interaction, and use it to design an attention-based interaction module that enhances inter-modal interactions. Extensive ablation studies confirm the contribution of each component. The area under the receiver operating characteristic curve (AUC), accuracy (ACC), specificity (SPC), and sensitivity (SEN) were 87.0%, 87.0%, 88.0%, and 86.0%, respectively, further verifying the effectiveness of the proposed method.
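The abstract does not provide implementation details, so the following is only a minimal PyTorch sketch of how one of the described ideas, a channel-level interaction between DWI and ADC feature streams, might be wired into a two-stream classifier. All module names, layer sizes, and the SE-style cross-gating scheme are illustrative assumptions and do not reproduce the authors' actual attention-based interaction module.

```python
# Hypothetical sketch: cross-modal channel interaction between DWI and ADC streams.
# Architecture choices (layer widths, gating scheme) are assumptions, not the paper's design.
import torch
import torch.nn as nn


class ChannelInteraction(nn.Module):
    """Re-weight each modality's channels using a descriptor pooled from the other modality."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, feat_dwi: torch.Tensor, feat_adc: torch.Tensor):
        # Squeeze each modality's feature map to a channel descriptor of shape (B, C).
        desc_dwi = self.pool(feat_dwi).flatten(1)
        desc_adc = self.pool(feat_adc).flatten(1)
        # Cross-gating: DWI features are scaled by attention computed from ADC, and vice versa.
        gated_dwi = feat_dwi * self.mlp(desc_adc).unsqueeze(-1).unsqueeze(-1)
        gated_adc = feat_adc * self.mlp(desc_dwi).unsqueeze(-1).unsqueeze(-1)
        return gated_dwi, gated_adc


class TwoStreamClassifier(nn.Module):
    """Two small CNN streams (DWI, ADC) whose features interact before fusion and classification."""

    def __init__(self, channels: int = 64, num_classes: int = 2):
        super().__init__()

        def stream():
            return nn.Sequential(
                nn.Conv2d(1, channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            )

        self.dwi_stream, self.adc_stream = stream(), stream()
        self.interaction = ChannelInteraction(channels)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(2 * channels, num_classes)
        )

    def forward(self, dwi: torch.Tensor, adc: torch.Tensor):
        f_dwi, f_adc = self.dwi_stream(dwi), self.adc_stream(adc)
        f_dwi, f_adc = self.interaction(f_dwi, f_adc)
        # Concatenate the interacted features along the channel axis and classify.
        return self.head(torch.cat([f_dwi, f_adc], dim=1))


if __name__ == "__main__":
    model = TwoStreamClassifier()
    dwi = torch.randn(2, 1, 128, 128)  # dummy single-channel DWI slices
    adc = torch.randn(2, 1, 128, 128)  # dummy single-channel ADC slices
    print(model(dwi, adc).shape)       # -> torch.Size([2, 2])
```

The relational and discriminative interactions mentioned in the abstract would sit alongside such a block in the full model; they are omitted here because their formulations are not specified in this record.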