Review and analysis of hidden neuron number effect of shallow backpropagation neural networks
B. Şekeroğlu, Kamil Dimililer
Neural Network World, vol. 30, no. 1, pp. 97-112, 2020. DOI: 10.14311/nnw.2020.30.008
Citations: 3
Abstract
Shallow neural network implementations remain popular for real-life classification problems that require rapid results with limited data. Parameter selection, such as the number of hidden neurons, the learning rate, and the momentum factor, is the main challenge that causes time loss during these implementations. Among these parameters, determining the number of hidden neurons is the main difficulty, as it affects both the training and generalization phases of any neural system in terms of learning efficiency and system accuracy. In this study, several experiments are performed to observe the effect of the hidden neuron number of a 3-layered backpropagation neural network on the generalization rate of classification problems, using both numerical datasets and image databases. Experiments are performed for an increasing number of total processing elements, and various numbers of hidden neurons are used during training. The results for each hidden neuron number are analyzed according to the accuracy rates and the number of iterations until convergence. The results show that the effect of the hidden neuron number depends mainly on the number of training patterns. The obtained results also suggest intervals of hidden neuron numbers for different numbers of total processing elements and training patterns.
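The experimental setup described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the XOR task, the weight initialization, and the hyperparameters (learning rate 0.5, momentum 0.8) are illustrative assumptions. The sketch shows the variables the study compares: the hidden neuron count is varied while training accuracy and the iteration count at convergence are recorded.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(a):
    # append a constant-1 column so the layer learns a bias term
    return np.hstack([a, np.ones((a.shape[0], 1))])

def train(X, y, n_hidden, lr=0.5, momentum=0.8, epochs=10000, seed=0):
    """Train a single-hidden-layer network; return (train accuracy, iterations used)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1] + 1, n_hidden))  # input (+bias) -> hidden
    W2 = rng.normal(0.0, 0.5, (n_hidden + 1, 1))           # hidden (+bias) -> output
    vW1, vW2 = np.zeros_like(W1), np.zeros_like(W2)
    Xb = add_bias(X)
    for epoch in range(epochs):
        h = sigmoid(Xb @ W1)                           # forward pass
        hb = add_bias(h)
        out = sigmoid(hb @ W2)
        d_out = (out - y) * out * (1.0 - out)          # output-layer delta
        d_hid = (d_out @ W2[:-1].T) * h * (1.0 - h)    # backpropagated hidden delta
        vW2 = momentum * vW2 - lr * (hb.T @ d_out)     # momentum-smoothed updates
        vW1 = momentum * vW1 - lr * (Xb.T @ d_hid)
        W2 += vW2
        W1 += vW1
        if np.array_equal(out > 0.5, y > 0.5):         # all patterns classified correctly
            break
    acc = float(np.mean((out > 0.5) == (y > 0.5)))
    return acc, epoch + 1

# XOR: a tiny pattern set where the hidden-layer size visibly matters.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
for n_hidden in (2, 4, 8):
    acc, iters = train(X, y, n_hidden)
    print(f"hidden={n_hidden}: train accuracy={acc:.2f}, iterations={iters}")
```

In the study's terms, sweeping `n_hidden` over a range and recording accuracy together with the iteration count is what allows intervals of suitable hidden neuron numbers to be suggested for a given number of training patterns.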
Journal description:
Neural Network World is a bimonthly journal presenting the latest developments in the field of informatics, with attention mainly devoted to the problems of:
brain science,
theory and applications of neural networks (both artificial and natural),
fuzzy-neural systems,
methods and applications of evolutionary algorithms,
methods of parallel and massively parallel computing,
problems of soft-computing,
methods of artificial intelligence.