How to not get frustrated with neural networks

B. Wilamowski
{"title":"如何避免对神经网络感到沮丧","authors":"B. Wilamowski","doi":"10.1109/ICIT.2011.5754336","DOIUrl":null,"url":null,"abstract":"In the presentation major difficulties of designing neural networks are shown. It turn out that popular MLP (Multi Layer Perceptron) networks in most cases produces far from satisfactory results. Also, popular EBP (Error Back Propagation) algorithm is very slow and often is not capable to train best neural network architectures. Very powerful and fast LM (Levenberg- Marquardt) algorithm was unfortunately implemented only for MLP networks. Also, because a necessity of the inversion of the matrix, which size is proportional to number of patterns, the LM algorithm can be used only for small problems. However, the major frustration with neural networks occurs when too large neural networks are used and it is being trained with too small number of training patterns. Indeed, such networks, with excessive number of neurons, can be trained to very small errors, but these networks will respond very poorly for new patterns, which were not used for training. The most of frustrations with neural network can be eliminated when smaller, more effective, architectures are used and trained by newly developed NBN (Neuron-by-Neuron) algorithm.","PeriodicalId":356868,"journal":{"name":"2011 IEEE International Conference on Industrial Technology","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"23","resultStr":"{\"title\":\"How to not get frustrated with neural networks\",\"authors\":\"B. Wilamowski\",\"doi\":\"10.1109/ICIT.2011.5754336\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the presentation major difficulties of designing neural networks are shown. It turn out that popular MLP (Multi Layer Perceptron) networks in most cases produces far from satisfactory results. Also, popular EBP (Error Back Propagation) algorithm is very slow and often is not capable to train best neural network architectures. Very powerful and fast LM (Levenberg- Marquardt) algorithm was unfortunately implemented only for MLP networks. Also, because a necessity of the inversion of the matrix, which size is proportional to number of patterns, the LM algorithm can be used only for small problems. However, the major frustration with neural networks occurs when too large neural networks are used and it is being trained with too small number of training patterns. Indeed, such networks, with excessive number of neurons, can be trained to very small errors, but these networks will respond very poorly for new patterns, which were not used for training. 
The most of frustrations with neural network can be eliminated when smaller, more effective, architectures are used and trained by newly developed NBN (Neuron-by-Neuron) algorithm.\",\"PeriodicalId\":356868,\"journal\":{\"name\":\"2011 IEEE International Conference on Industrial Technology\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-03-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"23\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 IEEE International Conference on Industrial Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIT.2011.5754336\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE International Conference on Industrial Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIT.2011.5754336","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 23

Abstract

The presentation shows the major difficulties of designing neural networks. It turns out that the popular MLP (Multi-Layer Perceptron) networks in most cases produce far from satisfactory results. Also, the popular EBP (Error Back Propagation) algorithm is very slow and often incapable of training the best neural network architectures. The very powerful and fast LM (Levenberg-Marquardt) algorithm was unfortunately implemented only for MLP networks. Moreover, because it requires the inversion of a matrix whose size is proportional to the number of patterns, the LM algorithm can be used only for small problems. However, the major frustration with neural networks occurs when a network that is too large is trained with too few training patterns. Indeed, such networks, with an excessive number of neurons, can be trained to very small errors, but they respond very poorly to new patterns that were not used for training. Most frustrations with neural networks can be eliminated when smaller, more effective architectures are used and trained with the newly developed NBN (Neuron-by-Neuron) algorithm.
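
The abstract's scaling objection to LM can be made concrete with the standard Levenberg-Marquardt weight update; this is the textbook form, not necessarily the exact variant discussed in the presentation:

\[
\Delta \mathbf{w} = -\left(\mathbf{J}^{\top}\mathbf{J} + \mu\,\mathbf{I}\right)^{-1}\mathbf{J}^{\top}\mathbf{e}
\]

Here \(\mathbf{e}\) stacks the output errors for all training patterns, \(\mathbf{J}\) is its Jacobian with respect to the \(N\) weights (one row per pattern-output pair), and \(\mu\) is the damping parameter. Storing \(\mathbf{J}\) and forming \(\mathbf{J}^{\top}\mathbf{J}\) therefore grow with the number of training patterns, which appears to be the bottleneck the abstract points to for large training sets.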
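
The over-fitting scenario described above (too many neurons, too few patterns) is easy to reproduce. Below is a minimal sketch of my own, not code from the paper, using plain gradient descent as a stand-in for EBP; the target function, layer sizes, learning rate, and iteration count are all illustrative assumptions.

```python
# Minimal over-fitting sketch (illustrative, not from the paper): an oversized
# one-hidden-layer network trained on very few patterns by plain gradient
# descent reaches a tiny training error but generalizes poorly.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    return np.sin(2.0 * x)          # the function the network should learn

x_train = rng.uniform(-2.0, 2.0, size=(8, 1))   # only 8 training patterns
y_train = target(x_train)
x_test = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
y_test = target(x_test)

hidden = 50                          # far too many neurons for 8 patterns
W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)         # hidden activations
    return h, h @ W2 + b2            # network output

lr = 0.05
for _ in range(20_000):              # plain gradient descent (EBP-style)
    h, y_hat = forward(x_train)
    err = y_hat - y_train
    # Backpropagate the mean-squared-error gradient through both layers.
    dh = (err @ W2.T) * (1.0 - h**2)
    W2 -= lr * (h.T @ err) / len(x_train)
    b2 -= lr * err.mean(axis=0)
    W1 -= lr * (x_train.T @ dh) / len(x_train)
    b1 -= lr * dh.mean(axis=0)

train_mse = np.mean((forward(x_train)[1] - y_train) ** 2)
test_mse = np.mean((forward(x_test)[1] - y_test) ** 2)
# Expect the train MSE to fall well below the test MSE on unseen inputs.
print(f"train MSE: {train_mse:.2e}   test MSE: {test_mse:.2e}")
```

In line with the abstract's remedy, shrinking the hidden layer to a handful of neurons typically narrows the gap between training and test error.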