Performance analysis of Incremental boosting based Transfer Learning in Deep CNN
G. S, H. R
2022 3rd International Conference on Communication, Computing and Industry 4.0 (C2I4), 2022-12-15
DOI: 10.1109/C2I456876.2022.10051386
Imbalanced datasets are one of the important research constraints in image classification: they greatly reduce classifier performance and lead to overfitting and underfitting problems, since most classifiers are designed for well-balanced datasets. The AdaBoost classifier is one technique whose accuracy has been established both by margin theory and from a statistical point of view, and many novel approaches use boosting and bagging methods to improve the performance of classifier models. This research focuses on the effectiveness of boosting procedures in deep convolutional neural networks (deep CNNs) for classification, and the ensemble approaches are modified with transfer learning techniques. Because the computational complexity of a classifier affects its accuracy, the input data used for training the model is subsampled and reweighted for better efficiency and lower complexity. The performance of a simple AdaBoost classifier, boosted GMM, boosted SVM, and incremental-boosting-based transfer learning approaches using GMM and SVM, with and without subsampling, is analyzed in terms of accuracy, training time, prediction time at testing, model size, and the loss function. Along with these metrics, three more essential parameters are used: the Jaccard index, the Dice coefficient, and the Matthews correlation coefficient. In experiments on benign and malignant melanoma images from the ISIC database, the boosting-based transfer learning approach in a deep CNN gives an accuracy of 99.19%, and the confusion matrix computed over the classifier gives a sensitivity of 98.46%.
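The reweighting-with-subsampling idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a standard AdaBoost.M1-style weight update and a hypothetical `weighted_subsample` helper that draws the next learner's training set in proportion to the boosting weights, so later rounds train on fewer but harder examples:

```python
import numpy as np

def boost_round(weights, y_true, y_pred):
    """One AdaBoost.M1-style update: weighted error, learner vote
    alpha, and the reweighted (normalized) sample distribution."""
    miss = (y_true != y_pred).astype(float)
    err = np.sum(weights * miss) / np.sum(weights)
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
    # Up-weight misclassified samples, down-weight correct ones.
    weights = weights * np.exp(alpha * (2.0 * miss - 1.0))
    return weights / weights.sum(), alpha

def weighted_subsample(rng, weights, frac=0.5):
    """Hypothetical helper: draw a subsample proportional to the
    boosting weights, reducing the next round's training cost."""
    n = len(weights)
    k = max(1, int(frac * n))
    return rng.choice(n, size=k, replace=True, p=weights)

rng = np.random.default_rng(0)
y_true = np.array([0, 0, 1, 1, 1])
y_pred = np.array([0, 1, 1, 1, 0])   # weak learner makes two mistakes
w = np.full(5, 0.2)
w, alpha = boost_round(w, y_true, y_pred)
idx = weighted_subsample(rng, w, frac=0.6)
```

After the update, the two misclassified samples carry more weight than the three correct ones, so the subsample is biased toward the examples the current ensemble gets wrong.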
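The three extra parameters named in the abstract (Jaccard index, Dice coefficient, Matthews correlation coefficient), plus sensitivity, are all derived from the binary confusion matrix. A small sketch of the standard definitions, with illustrative counts that are not taken from the paper:

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Overlap and correlation metrics from a 2x2 confusion matrix."""
    jaccard = tp / (tp + fp + fn)            # intersection over union
    dice = 2 * tp / (2 * tp + fp + fn)       # equals F1 for binary labels
    sensitivity = tp / (tp + fn)             # true positive rate
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return jaccard, dice, sensitivity, mcc

# Illustrative counts only (not the paper's results).
j, d, s, m = binary_metrics(tp=64, fp=1, fn=1, tn=64)
```

Note that Dice and Jaccard are monotonically related (Dice = 2J / (1 + J)), while MCC also accounts for true negatives, which makes it the more informative summary on imbalanced data, the setting this paper targets.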