{"title":"Intelligant Segmentation and Classification for Skin Cancer Prediction","authors":"S. Kavitha, R. Shalini, N. Harini Sree, J. Akash","doi":"10.1109/ICAECA56562.2023.10200284","DOIUrl":null,"url":null,"abstract":"Skin cancer is a rising health problem, with early detection being vital for effective treatment. Skin cancer diagnosis via image analysis is still challenging in the medical world. These problems become dangerous when they reach a malignant state. Since it is challenging and more expensive to diagnose skin cancer manually, automated computer-aided diagnostics procedures must be developed to support healthcare workers in timely identification of skin cancer. This research is to enhance the early identification, treatment, and prevention of skin cancer to save lives and reduce the burden on healthcare systems. We have proposed a skin cancer segmentation model using the deep learning algorithm called Feature Pyramid Network (FPN) with three popular backbone architectures ResNet34, DenseNet121, and MobileNet-v2 for segmentation and classification model using DenseNet121 for classification, on the HAM10000 dataset which includes the images of skin lesions. The FPN method is a deep learning strategy that integrates the advantages of convolutional neural networks (CNN) and multi-scale feature representation which is used to perform semantic segmentation of skin cancer. Classification using the DenseNet121 model is an effective method for solving classification issues in computer vision. Segmentation results are evaluated using IOU score and loss values. The study shows that the proposed methodology gives accuracy of above 80%, 70% and 75% by using the ResNet34, DenseNet121 and MobileNet-v2 as a backbone respectively in segmentation and 80% accuracy in classification.","PeriodicalId":401373,"journal":{"name":"2023 2nd International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA)","volume":"29 5","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 2nd International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAECA56562.2023.10200284","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Skin cancer is a growing health problem, and early detection is vital for effective treatment. Diagnosing skin cancer from images remains challenging in the medical field, and lesions become dangerous once they reach a malignant state. Because manual diagnosis of skin cancer is difficult and expensive, automated computer-aided diagnostic procedures are needed to help healthcare workers identify skin cancer in a timely manner. This research aims to improve the early identification, treatment, and prevention of skin cancer in order to save lives and reduce the burden on healthcare systems. We propose a skin cancer segmentation model based on the Feature Pyramid Network (FPN), a deep learning architecture, with three popular backbones (ResNet34, DenseNet121, and MobileNet-v2), together with a DenseNet121-based classification model, both trained on the HAM10000 dataset of skin lesion images. FPN is a deep learning strategy that combines the strengths of convolutional neural networks (CNNs) with multi-scale feature representation, and it is used here to perform semantic segmentation of skin cancer. DenseNet121 is an effective model for classification problems in computer vision. Segmentation results are evaluated with the IoU score and loss values. The study shows that the proposed methodology achieves segmentation accuracies above 80%, 70%, and 75% with the ResNet34, DenseNet121, and MobileNet-v2 backbones, respectively, and 80% accuracy in classification.
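
The abstract does not give implementation details, but the segmentation setup it describes (an FPN decoder over interchangeable ResNet34 / DenseNet121 / MobileNet-v2 encoders, evaluated with an IoU score) can be sketched as follows. This is a minimal sketch assuming PyTorch and the segmentation_models_pytorch package; the authors' actual framework, training schedule, and preprocessing are not stated and may differ.

```python
# Sketch only: FPN segmenter with a swappable backbone, plus a simple IoU metric.
# Assumes PyTorch and segmentation_models_pytorch; not the authors' original code.
import torch
import segmentation_models_pytorch as smp


def build_fpn(backbone: str = "resnet34") -> torch.nn.Module:
    """Build an FPN segmentation model with one of the three backbones named in the paper."""
    # "resnet34", "densenet121", and "mobilenet_v2" are all valid encoder names in smp.
    return smp.FPN(
        encoder_name=backbone,
        encoder_weights="imagenet",  # ImageNet pretraining is an assumption, not stated in the abstract
        in_channels=3,               # RGB dermoscopic images
        classes=1,                   # binary lesion mask
    )


def iou_score(logits: torch.Tensor, target: torch.Tensor, threshold: float = 0.5) -> float:
    """Intersection over Union between a predicted mask and the ground-truth mask."""
    pred = (torch.sigmoid(logits) > threshold).float()
    intersection = (pred * target).sum()
    union = pred.sum() + target.sum() - intersection
    return (intersection / (union + 1e-7)).item()


if __name__ == "__main__":
    model = build_fpn("resnet34")
    image = torch.randn(1, 3, 256, 256)                    # placeholder input size
    mask = torch.randint(0, 2, (1, 1, 256, 256)).float()   # placeholder ground truth
    with torch.no_grad():
        print("IoU:", iou_score(model(image), mask))
```

Swapping the backbone string ("densenet121", "mobilenet_v2") reproduces the three segmentation variants compared in the study.

The classification branch uses DenseNet121. A minimal sketch with torchvision is shown below; the seven-way output head reflects the seven lesion classes in HAM10000, while the fine-tuning details are assumptions.

```python
# Sketch only: DenseNet121 classifier for HAM10000-style skin-lesion classification.
# Assumes torchvision; the authors' exact head, optimizer, and augmentation are not given.
import torch
import torch.nn as nn
from torchvision import models


def build_densenet_classifier(num_classes: int = 7) -> nn.Module:
    """DenseNet121 with its final fully connected layer replaced for lesion classification."""
    model = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
    model.classifier = nn.Linear(model.classifier.in_features, num_classes)
    return model


if __name__ == "__main__":
    clf = build_densenet_classifier()
    logits = clf(torch.randn(2, 3, 224, 224))  # two dummy RGB images
    print(logits.shape)                        # torch.Size([2, 7])
```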