A Modified RBF Neural Network in Pattern Recognition

Min Han, Wei Guo, Yunfeng Mu
{"title":"一种用于模式识别的改进RBF神经网络","authors":"Min Han, Wei Guo, Yunfeng Mu","doi":"10.1109/IJCNN.2007.4371356","DOIUrl":null,"url":null,"abstract":"This paper presents a modified radial basis function (RBF) neural network for pattern recognition problems, which uses a hybrid learning algorithm to adaptively adjust the structure of the network. Two strategies are used to attain the compromise between the network complexity and accuracy, one is a modified \"novelty\" condition to create a new neuron in the hidden layer; the other is a pruning technique to remove redundant neurons and corresponding connections. To verify the performance of the modified network, two pattern recognition simulations are completed. One is a two-class pattern recognition problem, and the other is a real-world problem, internal component recognition in the field of architecture engineering. Simulation results including final hidden neurons, error, and accuracy using the method proposed in this paper are compared with performance of radial basis functional link network, resource allocating network and RBF neural network with generalized competitive learning algorithm. And it can be concluded that the proposed network has more concise architecture, higher classifier accuracy and fewer running time.","PeriodicalId":350091,"journal":{"name":"2007 International Joint Conference on Neural Networks","volume":"220 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"A Modified RBF Neural Network in Pattern Recognition\",\"authors\":\"Min Han, Wei Guo, Yunfeng Mu\",\"doi\":\"10.1109/IJCNN.2007.4371356\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a modified radial basis function (RBF) neural network for pattern recognition problems, which uses a hybrid learning algorithm to adaptively adjust the structure of the network. Two strategies are used to attain the compromise between the network complexity and accuracy, one is a modified \\\"novelty\\\" condition to create a new neuron in the hidden layer; the other is a pruning technique to remove redundant neurons and corresponding connections. To verify the performance of the modified network, two pattern recognition simulations are completed. One is a two-class pattern recognition problem, and the other is a real-world problem, internal component recognition in the field of architecture engineering. Simulation results including final hidden neurons, error, and accuracy using the method proposed in this paper are compared with performance of radial basis functional link network, resource allocating network and RBF neural network with generalized competitive learning algorithm. 
And it can be concluded that the proposed network has more concise architecture, higher classifier accuracy and fewer running time.\",\"PeriodicalId\":350091,\"journal\":{\"name\":\"2007 International Joint Conference on Neural Networks\",\"volume\":\"220 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-10-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2007 International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.2007.4371356\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2007.4371356","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10

Abstract

This paper presents a modified radial basis function (RBF) neural network for pattern recognition problems, which uses a hybrid learning algorithm to adaptively adjust the structure of the network. Two strategies are used to reach a compromise between network complexity and accuracy: one is a modified "novelty" condition for creating a new neuron in the hidden layer; the other is a pruning technique that removes redundant neurons and their corresponding connections. To verify the performance of the modified network, two pattern recognition simulations are carried out. One is a two-class pattern recognition problem; the other is a real-world problem, internal component recognition in the field of architectural engineering. Simulation results of the proposed method, including the final number of hidden neurons, error, and accuracy, are compared with the radial basis functional link network, the resource allocating network, and an RBF neural network with a generalized competitive learning algorithm. The comparison shows that the proposed network has a more concise architecture, higher classification accuracy, and shorter running time.
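The growing-and-pruning idea described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, hypothetical implementation of the general scheme: a resource-allocating-style novelty test (add a hidden neuron when an input is far from all existing centers and the output error is large) combined with a contribution-based pruning rule. The class name, thresholds, and update rules are assumptions made for illustration and do not reproduce the authors' exact hybrid learning algorithm.

```python
import numpy as np

class GrowingPruningRBF:
    """Minimal sketch of an RBF classifier that grows and prunes hidden
    neurons online.  The novelty test and pruning rule follow the general
    resource-allocating-network idea; the specific conditions and thresholds
    are illustrative assumptions, not the paper's implementation."""

    def __init__(self, n_outputs, dist_thresh=1.0, err_thresh=0.5,
                 prune_thresh=0.01, width=1.0, lr=0.05):
        self.n_outputs = n_outputs
        self.dist_thresh = dist_thresh    # novelty: distance to nearest center
        self.err_thresh = err_thresh      # novelty: output error magnitude
        self.prune_thresh = prune_thresh  # prune neurons with tiny contribution
        self.width = width                # shared Gaussian width
        self.lr = lr                      # learning rate for weight updates
        self.centers = []                 # list of center vectors
        self.weights = []                 # list of output-weight vectors

    def _hidden(self, x):
        # Gaussian activations of all hidden neurons for input x.
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))
                         for c in self.centers])

    def predict(self, x):
        if not self.centers:
            return np.zeros(self.n_outputs)
        return self._hidden(x) @ np.array(self.weights)

    def partial_fit(self, x, y):
        """One online update: grow if the sample is 'novel', otherwise adapt
        the output weights, then prune neurons that contribute little."""
        y_hat = self.predict(x)
        err = y - y_hat
        dist = (min(np.linalg.norm(x - c) for c in self.centers)
                if self.centers else np.inf)

        if dist > self.dist_thresh and np.linalg.norm(err) > self.err_thresh:
            # Novelty condition met: allocate a new hidden neuron at x whose
            # output weights cancel the current error.
            self.centers.append(np.asarray(x, dtype=float))
            self.weights.append(err.copy())
        elif self.centers:
            # Otherwise adapt existing output weights (LMS-style update).
            h = self._hidden(x)
            W = np.array(self.weights) + self.lr * np.outer(h, err)
            self.weights = list(W)
            # Prune neurons whose relative contribution is negligible.  A
            # practical implementation would require the contribution to stay
            # low over several samples before removing a neuron.
            contrib = h * np.linalg.norm(W, axis=1)
            keep = contrib / (contrib.max() + 1e-12) > self.prune_thresh
            self.centers = [c for c, k in zip(self.centers, keep) if k]
            self.weights = [w for w, k in zip(self.weights, keep) if k]
```

In an online training loop one would call partial_fit(x, one_hot_label) for each sample and take the argmax of predict(x) at test time; the trade-off between network size and accuracy is then controlled by the distance, error, and pruning thresholds.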