Suhaib Chughtai;Zakaria Senousy;Ahmed Mahany;Nouh Sabri Elmitwally;Khalid N. Ismail;Mohamed Medhat Gaber;Mohammed M. Abdelsamea
{"title":"DeepCon:为结直肠癌分类释放分而治之深度学习的力量","authors":"Suhaib Chughtai;Zakaria Senousy;Ahmed Mahany;Nouh Sabri Elmitwally;Khalid N. Ismail;Mohamed Medhat Gaber;Mohammed M. Abdelsamea","doi":"10.1109/OJCS.2024.3428970","DOIUrl":null,"url":null,"abstract":"Colorectal cancer (CRC) is the second leading cause of cancer-related mortality. Precise diagnosis of CRC plays a crucial role in increasing patient survival rates and formulating effective treatment strategies. Deep learning algorithms have demonstrated remarkable proficiency in the precise categorization of histopathology images. In this article, we introduce a novel deep learning model, termed \n<italic>DeepCon</i>\n which incorporates the divide-and-conquer principle into the classification task. \n<italic>DeepCon</i>\n has been methodically conceived to scrutinize the influence of acquired composition on the learning process, with a specific application to the classification of histology images related to CRC. Our model harnesses pre-trained networks to extract features from both the source and target domains, employing a two-stage transfer learning approach encompassing multiple loss functions. Our transfer learning strategy exploits a learned composition of decomposed images to enhance the transferability of extracted features. The efficacy of the proposed model was assessed using a clinically valid dataset of 5000 CRC images. The experimental results reveal that \n<italic>DeepCon</i>\n when coupled with the Xception network as the backbone model and subjected to extensive fine-tuning, achieved a remarkable accuracy rate of 98.4% and an F1 score of 98.4%.","PeriodicalId":13205,"journal":{"name":"IEEE Open Journal of the Computer Society","volume":"5 ","pages":"380-388"},"PeriodicalIF":0.0000,"publicationDate":"2024-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10599835","citationCount":"0","resultStr":"{\"title\":\"DeepCon: Unleashing the Power of Divide and Conquer Deep Learning for Colorectal Cancer Classification\",\"authors\":\"Suhaib Chughtai;Zakaria Senousy;Ahmed Mahany;Nouh Sabri Elmitwally;Khalid N. Ismail;Mohamed Medhat Gaber;Mohammed M. Abdelsamea\",\"doi\":\"10.1109/OJCS.2024.3428970\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Colorectal cancer (CRC) is the second leading cause of cancer-related mortality. Precise diagnosis of CRC plays a crucial role in increasing patient survival rates and formulating effective treatment strategies. Deep learning algorithms have demonstrated remarkable proficiency in the precise categorization of histopathology images. In this article, we introduce a novel deep learning model, termed \\n<italic>DeepCon</i>\\n which incorporates the divide-and-conquer principle into the classification task. \\n<italic>DeepCon</i>\\n has been methodically conceived to scrutinize the influence of acquired composition on the learning process, with a specific application to the classification of histology images related to CRC. Our model harnesses pre-trained networks to extract features from both the source and target domains, employing a two-stage transfer learning approach encompassing multiple loss functions. Our transfer learning strategy exploits a learned composition of decomposed images to enhance the transferability of extracted features. The efficacy of the proposed model was assessed using a clinically valid dataset of 5000 CRC images. 
The experimental results reveal that \\n<italic>DeepCon</i>\\n when coupled with the Xception network as the backbone model and subjected to extensive fine-tuning, achieved a remarkable accuracy rate of 98.4% and an F1 score of 98.4%.\",\"PeriodicalId\":13205,\"journal\":{\"name\":\"IEEE Open Journal of the Computer Society\",\"volume\":\"5 \",\"pages\":\"380-388\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10599835\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Open Journal of the Computer Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10599835/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of the Computer Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10599835/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
DeepCon: Unleashing the Power of Divide and Conquer Deep Learning for Colorectal Cancer Classification
Colorectal cancer (CRC) is the second leading cause of cancer-related mortality. Precise diagnosis of CRC plays a crucial role in increasing patient survival rates and formulating effective treatment strategies. Deep learning algorithms have demonstrated remarkable proficiency in the precise categorization of histopathology images. In this article, we introduce a novel deep learning model, termed DeepCon, which incorporates the divide-and-conquer principle into the classification task. DeepCon has been methodically conceived to scrutinize the influence of acquired composition on the learning process, with a specific application to the classification of histology images related to CRC. Our model harnesses pre-trained networks to extract features from both the source and target domains, employing a two-stage transfer learning approach encompassing multiple loss functions. Our transfer learning strategy exploits a learned composition of decomposed images to enhance the transferability of extracted features. The efficacy of the proposed model was assessed using a clinically valid dataset of 5,000 CRC images. The experimental results reveal that DeepCon, when coupled with the Xception network as the backbone model and subjected to extensive fine-tuning, achieved a remarkable accuracy of 98.4% and an F1 score of 98.4%.
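The abstract outlines a two-stage transfer-learning pipeline built on a pre-trained Xception backbone: features learned on a source domain are first reused with a frozen backbone, then the whole network is fine-tuned on the target CRC histology data. The sketch below illustrates that general pattern in Keras; the class count, loss functions, learning rates, and training schedule are assumptions for illustration, and the paper's decomposition/composition stage is not reproduced here.

```python
# Minimal, hypothetical sketch of two-stage transfer learning with an Xception
# backbone, mirroring the high-level pipeline described in the abstract.
# Class count, losses, and learning rates are assumptions, not the authors' code.
import tensorflow as tf
from tensorflow.keras.applications import Xception

NUM_CLASSES = 8            # assumed number of CRC tissue classes
IMG_SHAPE = (299, 299, 3)  # Xception's default input resolution

# Pre-trained backbone; ImageNet weights stand in for the source-domain features.
backbone = Xception(weights="imagenet", include_top=False,
                    pooling="avg", input_shape=IMG_SHAPE)

# New classification head for the target (CRC histology) domain.
inputs = tf.keras.Input(shape=IMG_SHAPE)
features = backbone(inputs, training=False)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(features)
model = tf.keras.Model(inputs, outputs)

# Stage 1: freeze the backbone and train only the new head.
backbone.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Stage 2: unfreeze the backbone and fine-tune end-to-end at a lower learning rate.
backbone.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```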