Strategies for constructive neural networks and its application to regression models

Jifu Nong
{"title":"Strategies for constructive neural networks and its application to regression models","authors":"Jifu Nong","doi":"10.1109/CSIP.2012.6308828","DOIUrl":null,"url":null,"abstract":"Regression problem is an important application area for neural networks (NNs). Among a large number of existing NN architectures, the feedforward NN (FNN) paradigm is one of the most widely used structures. Although one-hidden-layer feedforward neural networks (OHL-FNNs) have simple structures, they possess interesting representational and learning capabilities. In this paper, we are interested particularly in incremental constructive training of OHL-FNNs. In the proposed incremental constructive training schemes for an OHL-FNN, input-side training and output-side training may be separated in order to reduce the training time. A new technique is proposed to scale the error signal during the constructive learning process to improve the input-side training efficiency and to obtain better generalization performance. Two pruning methods for removing the input-side redundant connections have also been applied. Numerical simulations demonstrate the potential and advantages of the proposed strategies when compared to other existing techniques in the literature.","PeriodicalId":193335,"journal":{"name":"2012 International Conference on Computer Science and Information Processing (CSIP)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 International Conference on Computer Science and Information Processing (CSIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CSIP.2012.6308828","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

The regression problem is an important application area for neural networks (NNs). Among the many existing NN architectures, the feedforward NN (FNN) paradigm is one of the most widely used. Although one-hidden-layer feedforward neural networks (OHL-FNNs) have simple structures, they possess interesting representational and learning capabilities. In this paper, we are particularly interested in incremental constructive training of OHL-FNNs. In the proposed incremental constructive training schemes for an OHL-FNN, input-side training and output-side training may be separated in order to reduce the training time. A new technique is proposed to scale the error signal during the constructive learning process, improving input-side training efficiency and yielding better generalization performance. Two pruning methods for removing redundant input-side connections have also been applied. Numerical simulations demonstrate the potential and advantages of the proposed strategies when compared to other existing techniques in the literature.
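The abstract describes the constructive scheme only at a high level, so the following is a minimal sketch of the general idea: hidden units are added one at a time, each new unit's input-side weights are trained against the current residual (here with a cascade-correlation-style correlation criterion), and the linear output-side weights are then recomputed in closed form by least squares. The `scale_error`, `train_input_side`, and `prune_input_side` helpers are hypothetical stand-ins; the paper's actual error-scaling technique and its two pruning methods are not specified in the abstract.

```python
# Sketch of incremental constructive training for a one-hidden-layer
# feedforward network (OHL-FNN), assuming a cascade-correlation-style
# scheme. Hypothetical illustration only, not the paper's exact method.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def scale_error(residual):
    # Hypothetical error scaling: normalize the residual so input-side
    # training is not dominated by a few large errors (a stand-in for the
    # paper's proposed scaling technique).
    return residual / (np.max(np.abs(residual)) + 1e-12)

def train_input_side(X, residual, epochs=200, lr=0.1, rng=None):
    # Input-side training: fit one new hidden unit so that its output
    # correlates with the (scaled, centered) current residual, via gradient
    # ascent on the correlation. X may include a bias column if desired.
    rng = np.random.default_rng() if rng is None else rng
    w = rng.normal(scale=0.1, size=X.shape[1])
    e = scale_error(residual)
    e = e - e.mean()
    for _ in range(epochs):
        h = sigmoid(X @ w)
        corr = (h - h.mean()) @ e                      # correlation with residual
        grad = np.sign(corr) * (X.T @ (e * h * (1 - h)))  # maximize |corr|
        w += lr * grad / len(X)
    return w

def constructive_fit(X, y, max_units=10, tol=1e-3):
    n = len(X)
    H = np.ones((n, 1))        # bias column; hidden-unit outputs appended later
    W_in = []                  # input-side weight vectors, one per hidden unit
    for _ in range(max_units):
        # Output-side training: linear weights solved in closed form.
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        residual = y - H @ beta
        if np.mean(residual ** 2) < tol:
            break
        # Input-side training: only the new unit's weights are adapted.
        w = train_input_side(X, residual)
        W_in.append(w)
        H = np.hstack([H, sigmoid(X @ w)[:, None]])
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.array(W_in), beta

def prune_input_side(W_in, threshold=0.01):
    # Hypothetical magnitude-based pruning of redundant input-side
    # connections; the abstract does not specify the paper's two methods.
    W = W_in.copy()
    W[np.abs(W) < threshold] = 0.0
    return W
```

The benefit of separating the two stages is visible in the loop: only the new unit's nonlinear input-side weights require iterative training, while the output-side weights are obtained in a single least-squares solve, which is what keeps each growth step cheap.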