Review and analysis of hidden neuron number effect of shallow backpropagation neural networks

IF 0.7 · JCR Q4 · Computer Science, Artificial Intelligence
B. Şekeroğlu, Kamil Dimililer
Citations: 3

Abstract

Shallow neural network implementations remain popular for real-life classification problems that require rapid results with limited data. Selecting parameters such as the number of hidden neurons, the learning rate, and the momentum factor is a main challenge in these implementations and a major source of lost time. Among these parameters, determining the number of hidden neurons is the principal difficulty, since it affects both the training and generalization phases of any neural system in terms of learning efficiency and accuracy. In this study, several experiments are performed to observe the effect of the hidden neuron count of a 3-layered backpropagation neural network on the generalization rate for classification problems, using both numerical datasets and image databases. The experiments consider an increasing total number of processing elements, and various hidden neuron counts are used during training. The results for each hidden neuron count are analyzed in terms of accuracy rates and the number of iterations to convergence. The results show that the effect of the hidden neuron count depends mainly on the number of training patterns, and they suggest intervals of hidden neuron counts for different totals of processing elements and training patterns.
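The kind of sweep the abstract describes can be sketched in a few lines. The code below is not the authors' implementation: it is a minimal illustration that trains a 3-layer (input/hidden/output) backpropagation network with several hidden-neuron counts on a toy two-class dataset, recording the final accuracy and the iteration count at convergence. The dataset, learning rate, momentum, and stopping tolerance are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class problem: label points by whether they fall outside a circle.
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 0.5).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(n_hidden, lr=0.5, momentum=0.9, max_iter=2000, tol=0.05):
    """Train a 2-n_hidden-1 backprop network; return (accuracy, iterations)."""
    W1 = rng.normal(scale=0.5, size=(2, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
    vW1 = np.zeros_like(W1); vW2 = np.zeros_like(W2)  # momentum terms
    for it in range(1, max_iter + 1):
        h = sigmoid(X @ W1 + b1)           # forward pass: hidden layer
        out = sigmoid(h @ W2 + b2)         # forward pass: output layer
        err = out - y
        if np.mean(err ** 2) < tol:        # converged to the MSE tolerance
            break
        # Backpropagate through the sigmoid nonlinearities.
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient step with momentum on the weights, plain step on biases.
        vW2 = momentum * vW2 - lr * (h.T @ d_out) / len(X)
        vW1 = momentum * vW1 - lr * (X.T @ d_h) / len(X)
        W2 += vW2; b2 -= lr * d_out.mean(axis=0)
        W1 += vW1; b1 -= lr * d_h.mean(axis=0)
    acc = float(np.mean((out > 0.5) == (y > 0.5)))
    return acc, it

# Sweep hidden neuron counts, as in the paper's experimental setup.
for n_hidden in (2, 4, 8, 16):
    acc, iters = train(n_hidden)
    print(f"hidden={n_hidden:2d}  accuracy={acc:.2f}  iterations={iters}")
```

Running such a sweep on datasets of different sizes is what lets the study relate useful hidden-neuron intervals to the number of training patterns and total processing elements.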
Source journal: Neural Network World (Engineering & Technology — Computer Science: Artificial Intelligence)
CiteScore: 1.80 · Self-citation rate: 0.00% · Review time: 12 months
About the journal: Neural Network World is a bimonthly journal covering the latest developments in informatics, with attention devoted mainly to: brain science; theory and applications of neural networks (both artificial and natural); fuzzy-neural systems; methods and applications of evolutionary algorithms; parallel and massively parallel computing; soft computing; and methods of artificial intelligence.