{"title":"Multi-Objective Optimization for Neural Network Structure","authors":"M. Shokoohi, M. Teshnehlab","doi":"10.1109/CSICC58665.2023.10105405","DOIUrl":null,"url":null,"abstract":"This study presents a new algorithm for training flexible perceptron multilayer neural networks. This algorithm is based on the multi-objective evolutionary optimization and tries to find the smallest optimal structure simultaneously by reducing the network error. In this method, a compatibility is established between the mean squared error and the vector length of the parameters of the activation functions by using flexible neurons which cause a greater degree of freedom leading to a faster convergence of the neural network. Then, the network structure decreases and the problem of overfitting and local minimum is prevented based on the values of these parameters and the use of the integration method of neurons. Moreover, it increases the power of the generalizability of the neural network. This method was used for classification problems, and the results were compared with AMGA, BCPA, LASSO, and Early Stopping methods. Based on the results, the algorithm proposed in this study usually works better compared to the similar algorithms. In addition, the proposed algorithm is a systematic method for finding the optimal neural network structure.","PeriodicalId":127277,"journal":{"name":"2023 28th International Computer Conference, Computer Society of Iran (CSICC)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 28th International Computer Conference, Computer Society of Iran (CSICC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CSICC58665.2023.10105405","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This study presents a new algorithm for training flexible multilayer perceptron (MLP) neural networks. The algorithm is based on multi-objective evolutionary optimization and seeks the smallest optimal structure while simultaneously reducing the network error. In this method, a trade-off is established between the mean squared error and the length of the vector of activation-function parameters by using flexible neurons, which provide greater degrees of freedom and lead to faster convergence of the neural network. The network structure is then reduced, and overfitting and local minima are avoided, based on the values of these parameters and a neuron-merging method. Moreover, this increases the generalization ability of the neural network. The method was applied to classification problems, and the results were compared with the AMGA, BCPA, LASSO, and Early Stopping methods. Based on the results, the proposed algorithm generally outperforms similar algorithms. In addition, it provides a systematic method for finding the optimal neural network structure.
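
The abstract does not give the exact form of the flexible neuron or of the two objectives, so the following Python sketch is illustrative only. It assumes a flexible bipolar sigmoid f(x, a) = tanh(a·x)/a (one common choice for flexible neurons, not necessarily the authors') and evaluates the two objectives described above, the mean squared error and the length of the activation-parameter vector, for a single candidate network with one hidden layer. All function names and the architecture are assumptions for illustration.

```python
import numpy as np

def flexible_sigmoid(x, a):
    # Assumed flexible bipolar sigmoid: f(x, a) = tanh(a * x) / a.
    # The shape parameter a is trained alongside the weights;
    # it should be kept away from zero to avoid division by zero.
    return np.tanh(a * x) / a

def two_objectives(W1, b1, a, W2, b2, X, y):
    """Evaluate one candidate network on the two objectives:
    (1) mean squared error on the data, and
    (2) the length (L2 norm) of the activation-parameter vector a."""
    H = flexible_sigmoid(X @ W1 + b1, a)   # hidden layer of flexible neurons
    y_hat = H @ W2 + b2                    # linear output layer
    mse = np.mean((y - y_hat) ** 2)        # objective 1: network error
    a_norm = np.linalg.norm(a)             # objective 2: parameter-vector length
    return mse, a_norm

def dominates(f1, f2):
    # Pareto dominance: f1 dominates f2 if it is no worse on both
    # objectives and strictly better on at least one.
    return all(u <= v for u, v in zip(f1, f2)) and any(u < v for u, v in zip(f1, f2))
```

In a multi-objective evolutionary optimizer such as NSGA-II, a dominance check of this kind would rank candidate networks toward a Pareto front trading off error against activation-parameter length; neurons whose shape parameters shrink toward negligible values would then be candidates for merging, in the spirit of the neuron-merging step the abstract describes.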