{"title":"Revision of the LeNet algorithm——Construction of LeNet deformation algorithm based on multi-conditional hyperparameter adjustment","authors":"","doi":"10.25236/ajcis.2023.060803","DOIUrl":null,"url":null,"abstract":"This paper explores two main issues. First, this paper explores the optimal hyperparameters of the LeNet algorithm under the Fashion-MNIST dataset based on the grid method: where when the learning rate is 0.032, the regularization coefficient is 0.03, the momentum is 0.9, the weight decay parameter is 0.001, and the number of iterative rounds is 50, the model has the best results under the Fashion-MNIST dataset of 10% uniformly sampled samples has the relatively best results, i.e., the test accuracy converges to 85.8%. In addition, this paper improves the LeNet algorithm by constructing a LeNet deformation algorithm based on multi-conditional hyperparameter adjustment, specifically, the learning rate, momentum, and regularization coefficients change with the increase of the number of iteration rounds; in addition, in the construction of the model, the model introduces two blocks containing a convolutional layer, a batch normalization layer (BatchNorm), and a maximum pooling layer, and three linear neuron layers . After tuning, the tested accuracy of the algorithm is 91.5% under the full sample based on the Fashion-MNIST dataset.","PeriodicalId":387664,"journal":{"name":"Academic Journal of Computing & Information Science","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Academic Journal of Computing & Information Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.25236/ajcis.2023.060803","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This paper explores two main issues. First, it searches for the optimal hyperparameters of the LeNet algorithm on the Fashion-MNIST dataset using the grid method: with a learning rate of 0.032, a regularization coefficient of 0.03, a momentum of 0.9, a weight decay parameter of 0.001, and 50 training epochs, the model achieves its relatively best results on a 10% uniformly sampled subset of Fashion-MNIST, with the test accuracy converging to 85.8%. Second, this paper improves the LeNet algorithm by constructing a LeNet deformation algorithm based on multi-conditional hyperparameter adjustment; specifically, the learning rate, momentum, and regularization coefficient change as the number of training epochs increases. In addition, the model introduces two blocks, each containing a convolutional layer, a batch normalization layer (BatchNorm), and a max pooling layer, followed by three linear (fully connected) layers, as sketched below. After tuning, the algorithm reaches a test accuracy of 91.5% on the full Fashion-MNIST dataset.
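The following is a minimal sketch, not the authors' code, of the architecture and hyperparameter scheduling described above: two blocks of convolution, batch normalization, and max pooling, followed by three linear layers, with the learning rate, momentum, and regularization changing over epochs. Channel counts, kernel sizes, and the exact schedule shape are illustrative assumptions, since the abstract does not specify them.

```python
# Sketch of the described LeNet variant for 28x28 Fashion-MNIST images.
# All layer sizes and the schedule below are assumptions for illustration only.
import torch
import torch.nn as nn


class LeNetVariant(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            # Block 1: convolution -> BatchNorm -> max pooling
            nn.Conv2d(1, 6, kernel_size=5, padding=2),   # 28x28 -> 28x28
            nn.BatchNorm2d(6),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
            # Block 2: convolution -> BatchNorm -> max pooling
            nn.Conv2d(6, 16, kernel_size=5),             # 14x14 -> 10x10
            nn.BatchNorm2d(16),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 10x10 -> 5x5
        )
        # Three linear (fully connected) layers
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


def epoch_dependent_hyperparams(epoch: int, total_epochs: int = 50):
    """Illustrative schedule: learning rate, momentum, and weight decay as
    functions of the epoch. The paper's exact functional form is not given
    in the abstract; only the starting values come from the grid search."""
    progress = epoch / max(total_epochs - 1, 1)
    lr = 0.032 * (1.0 - 0.9 * progress)   # decay from the grid-search optimum
    momentum = 0.9 + 0.05 * progress      # mild increase over training
    weight_decay = 0.001                  # kept at the reported value
    return lr, momentum, weight_decay
```

As a usage example, one would rebuild (or update the parameter groups of) a `torch.optim.SGD` optimizer at the start of each epoch with the values returned by `epoch_dependent_hyperparams`, so that all three quantities track the epoch count as the abstract describes.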