Performance analysis of convolutional neural networks for image classification with appropriate optimizers
Danish Sana, Ul Rahman Jamshaid, Haider Gulfam
i-manager's Journal on Mathematics
DOI: 10.26634/jmat.12.1.19398
Citations: 0
Abstract
Optimizers play an important role in many advanced deep learning models built on Convolutional Neural Networks. Studies proposing new optimizers, and modifications of existing ones, remain an active and significant area of machine learning research. A number of studies defend the selection of particular optimizers, and together they illustrate some of the challenges in assessing optimizer effectiveness. This paper offers a comprehensive analysis of common optimizers paired with the widely used Rectified Linear Unit (ReLU) activation function. Significance is assessed relative to a baseline configuration using Softmax and ReLU. Experiments were performed with Adam, Root Mean Squared Propagation (RMSprop), the Adaptive Learning Rate Method (Adadelta), the Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD) to examine the performance of Convolutional Neural Networks for image classification on the Canadian Institute for Advanced Research dataset (CIFAR-10).
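To make the comparison concrete, the update rules of three of the optimizers named above (SGD, RMSprop, Adam) can be sketched in plain NumPy on a toy quadratic loss f(w) = w². This is an illustrative sketch only: the step counts, learning rates, and hyperparameters below are common defaults and assumptions, not the paper's experimental settings, and the paper's actual experiments use full CNNs on CIFAR-10.

```python
import numpy as np

def grad(w):
    # Gradient of the toy loss f(w) = w**2.
    return 2.0 * w

def sgd(w, steps=100, lr=0.1):
    # Plain stochastic gradient descent: step against the gradient.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def rmsprop(w, steps=100, lr=0.1, rho=0.9, eps=1e-8):
    # RMSprop: scale the step by a running average of squared gradients.
    v = 0.0
    for _ in range(steps):
        g = grad(w)
        v = rho * v + (1.0 - rho) * g * g
        w -= lr * g / (np.sqrt(v) + eps)
    return w

def adam(w, steps=100, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected running averages of the gradient (m)
    # and of the squared gradient (v).
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1.0 - b1) * g
        v = b2 * v + (1.0 - b2) * g * g
        m_hat = m / (1.0 - b1 ** t)
        v_hat = v / (1.0 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

if __name__ == "__main__":
    # All three drive w toward the minimum at 0, but along different paths.
    for name, opt in [("SGD", sgd), ("RMSprop", rmsprop), ("Adam", adam)]:
        print(f"{name}: final w = {opt(5.0):.6f}")
```

Adagrad and Adadelta follow the same pattern: Adagrad accumulates the full sum of squared gradients instead of a decayed average, and Adadelta additionally maintains a running average of squared parameter updates to remove the global learning rate.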