{"title":"基于组织病理图像的深度特征融合诊断乳腺癌","authors":"Hung Le Minh, Manh Mai Van, T. Lang","doi":"10.1109/KSE.2019.8919462","DOIUrl":null,"url":null,"abstract":"This paper presents a deep feature fusion method based on the concept of 'residual connection' of ResNet to effectively extract distinguishable features which help to improve the classification performance of the Breast cancer prediction on histopathology images. Specifically, we fuse the features extracted from different blocks of Inception-V3 to merge the features learned. The concatenated features are considered as rich information which could capture the deep features of the images. Three experiments were also conducted to investigate the three factors that may affect the classification performance: 1) Feature extractor or Fine-tuningƒ 2) Normalization vs. Non-normalization and 3) The effectiveness of our deep feature fusion method. The dataset used in this study includes 400 microscopy images collected from the ICIAR 2018 Grand Challenge on Breast Cancer histopathology images. The images are divided into 4 classes which indicate the aggressiveness levels of breast cancer, described as Normal (N), Benign (B), In Situ Carcinoma (IS) or Invasive Carcinoma (IV) according to the predominant cancer type in each image. Experimental results show that our proposed deep feature fusion method can achieve a very high classification accuracy with 95% in distinguishing 4 types of cancer classes and 97.5% for differentiating two combined groups of cancer, which are Carcinoma (N+B) and Non-carcinoma (IS+IV).","PeriodicalId":439841,"journal":{"name":"2019 11th International Conference on Knowledge and Systems Engineering (KSE)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Deep Feature Fusion for Breast Cancer Diagnosis on Histopathology Images\",\"authors\":\"Hung Le Minh, Manh Mai Van, T. Lang\",\"doi\":\"10.1109/KSE.2019.8919462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a deep feature fusion method based on the concept of 'residual connection' of ResNet to effectively extract distinguishable features which help to improve the classification performance of the Breast cancer prediction on histopathology images. Specifically, we fuse the features extracted from different blocks of Inception-V3 to merge the features learned. The concatenated features are considered as rich information which could capture the deep features of the images. Three experiments were also conducted to investigate the three factors that may affect the classification performance: 1) Feature extractor or Fine-tuningƒ 2) Normalization vs. Non-normalization and 3) The effectiveness of our deep feature fusion method. The dataset used in this study includes 400 microscopy images collected from the ICIAR 2018 Grand Challenge on Breast Cancer histopathology images. The images are divided into 4 classes which indicate the aggressiveness levels of breast cancer, described as Normal (N), Benign (B), In Situ Carcinoma (IS) or Invasive Carcinoma (IV) according to the predominant cancer type in each image. 
Experimental results show that our proposed deep feature fusion method can achieve a very high classification accuracy with 95% in distinguishing 4 types of cancer classes and 97.5% for differentiating two combined groups of cancer, which are Carcinoma (N+B) and Non-carcinoma (IS+IV).\",\"PeriodicalId\":439841,\"journal\":{\"name\":\"2019 11th International Conference on Knowledge and Systems Engineering (KSE)\",\"volume\":\"44 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 11th International Conference on Knowledge and Systems Engineering (KSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/KSE.2019.8919462\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 11th International Conference on Knowledge and Systems Engineering (KSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/KSE.2019.8919462","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Deep Feature Fusion for Breast Cancer Diagnosis on Histopathology Images
This paper presents a deep feature fusion method, inspired by the 'residual connection' concept of ResNet, for extracting distinguishable features that improve breast cancer classification on histopathology images. Specifically, features extracted from different blocks of Inception-V3 are fused by concatenation, merging what the individual blocks have learned into a single rich representation that captures the deep characteristics of the images. Three experiments were conducted to investigate three factors that may affect classification performance: 1) feature extraction vs. fine-tuning; 2) normalization vs. non-normalization; and 3) the effectiveness of the proposed deep feature fusion method. The dataset used in this study consists of 400 microscopy images collected from the ICIAR 2018 Grand Challenge on breast cancer histopathology images. The images are divided into four classes that reflect the aggressiveness of breast cancer, labeled Normal (N), Benign (B), In Situ Carcinoma (IS), or Invasive Carcinoma (IV) according to the predominant cancer type in each image. Experimental results show that the proposed deep feature fusion method achieves a classification accuracy of 95% in distinguishing the four cancer classes and 97.5% in differentiating the two combined groups, Non-carcinoma (N+B) and Carcinoma (IS+IV).
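To make the fusion idea concrete, the following is a minimal TensorFlow/Keras sketch, not the authors' code: it pools and concatenates activations from several intermediate Inception-V3 blocks and attaches a 4-class head. The particular block selection ("mixed5", "mixed7", "mixed10"), the dropout rate, and the classifier head are illustrative assumptions, not values reported in the paper.

import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import InceptionV3

def build_fusion_model(input_shape=(299, 299, 3), num_classes=4, fine_tune=False):
    # Pretrained Inception-V3 backbone; fine_tune toggles the
    # "feature extractor vs. fine-tuning" setting from the experiments.
    base = InceptionV3(include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = fine_tune

    # Pool activations from several intermediate Inception blocks
    # (assumed selection; the paper fuses features from different blocks).
    block_names = ["mixed5", "mixed7", "mixed10"]
    pooled = [layers.GlobalAveragePooling2D()(base.get_layer(name).output)
              for name in block_names]

    # Concatenate the pooled block features into one fused descriptor,
    # then classify into the four classes (N, B, IS, IV).
    fused = layers.Concatenate()(pooled)
    fused = layers.Dropout(0.5)(fused)
    outputs = layers.Dense(num_classes, activation="softmax")(fused)
    return Model(inputs=base.input, outputs=outputs)

model = build_fusion_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

Under this reading, the concatenation plays the role the abstract attributes to ResNet-style residual connections: earlier-block features are carried forward alongside later-block features rather than being discarded.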