Title: Deep Learning with Histogram of Oriented Gradients-based Computer-Aided Diagnosis for Breast Cancer Detection and Classification
Authors: A. Ponraj, R. Canessane
DOI: 10.1109/ICSMDI57622.2023.00099
Venue: 2023 3rd International Conference on Smart Data Intelligence (ICSMDI)
Published: 2023-03-01
Citations: 2
Abstract
Cancer is a major public health concern in the modern era, and breast cancer is among the leading causes of death in women worldwide. Early identification of breast cancer allows patients to receive proper treatment, improving their chances of survival. The proposed approach uses Generative Adversarial Networks (GANs) to aid in the detection and diagnosis of breast cancer. A GAN is a deep learning model that generates new data instances resembling its training data; it consists of two parts, a generator that learns to produce synthetic data and a discriminator that learns to distinguish that synthetic data from real data. In addition, the histogram of oriented gradients (HOG) is used as a feature descriptor: it characterizes the distribution of local gradient orientations within a detection window or region of interest, and is widely applied in image processing and other computer vision tasks. Using an image dataset and deep learning techniques, the proposed method (GAN-HOG) aims to improve the efficiency and performance of breast cancer diagnosis by segmenting and classifying the input images. Unlike many existing nonlinear classification models, the proposed method employs a conditional distribution over the outputs. On the evaluated dataset, GAN-HOG achieved an accuracy of 98.435%, compared with 95.546% for SVM, 92.547% for a DCNN, 89.453% for VGG16, and 87.826% for ResNet50.
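To make the HOG step concrete, the sketch below computes per-cell histograms of gradient orientations, the core of the descriptor the abstract names. This is a simplified illustration under stated assumptions (central-difference gradients, unsigned 0-180 degree orientations, nearest-bin voting, 8x8-pixel cells, no block normalization), not the authors' implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def hog_cell_histograms(image, cell=8, bins=9):
    """Simplified HOG: per-cell histograms of unsigned gradient orientation.

    Each pixel votes its gradient magnitude into one of `bins` orientation
    bins covering 0-180 degrees; votes are accumulated per `cell` x `cell`
    pixel cell. Block normalization (present in full HOG) is omitted.
    """
    img = image.astype(float)
    # Central-difference gradients; borders are left at zero.
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    # Unsigned orientation in [0, 180) degrees.
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0

    h, w = img.shape
    ch, cw = h // cell, w // cell
    hist = np.zeros((ch, cw, bins))
    bin_idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    # Accumulate magnitude-weighted votes cell by cell.
    for i in range(ch * cell):
        for j in range(cw * cell):
            hist[i // cell, j // cell, bin_idx[i, j]] += mag[i, j]
    return hist

# A 16x16 image with a vertical edge: all gradient energy is horizontal,
# so the 0-degree orientation bin dominates in every cell.
img = np.zeros((16, 16))
img[:, 8:] = 255.0
h = hog_cell_histograms(img)
```

The flattened, normalized version of such histograms is what would typically be fed to a downstream classifier alongside the learned deep features.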