Robust initialization of a Jordan network with recurrent constrained learning.

IEEE Transactions on Neural Networks, Vol. 22, No. 12, pp. 2460–2473
Pub Date: 2011-12-01 · Epub Date: 2011-09-29 · DOI: 10.1109/TNN.2011.2168423

Qing Song

Citations: 16

Abstract

In this paper, we propose a robust initialization of a Jordan network with a recurrent constrained learning (RIJNRCL) algorithm for multilayered recurrent neural networks (RNNs). This novel algorithm is based on the constrained learning concept of the Jordan network with a recurrent sensitivity and weight convergence analysis, which is used to obtain a tradeoff between the training and testing errors. In addition to using classical techniques for the adaptive learning rate and the adaptive dead zone, RIJNRCL employs a recurrent constrained parameter matrix to switch off excessive contributions from the hidden layer neurons based on weight convergence and stability conditions of the multilayered RNNs. It is well known that a good response from the hidden layer neurons and proper initialization play a dominant role in avoiding local minima in multilayered RNNs. The new RIJNRCL algorithm solves the twin problems of weight initialization and selection of the hidden layer neurons via a novel recurrent sensitivity ratio analysis. We provide the detailed steps for using RIJNRCL in a few benchmark time-series prediction problems and show that the proposed algorithm achieves superior generalization performance.
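The abstract describes the mechanism only at a high level. As a rough illustration of the kind of architecture involved (not the paper's actual RIJNRCL procedure), the sketch below implements a plain Jordan network for one-step-ahead time-series prediction in NumPy, with a hypothetical "sensitivity ratio" used to switch off hidden neurons whose contribution is negligible. The ratio definition, the threshold, and the simplified training loop are assumptions made for illustration only; the paper's adaptive learning rate, adaptive dead zone, and recurrent constrained parameter matrix are not reproduced here.

```python
import numpy as np

# Minimal Jordan network sketch: the hidden layer is driven by the input and by
# the *previous output* fed back through a context unit (the defining Jordan
# trait). This is an illustrative baseline, not the RIJNRCL algorithm; the
# "sensitivity ratio" pruning below is a hypothetical stand-in for the paper's
# recurrent sensitivity ratio analysis.

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 1, 8, 1
W_in  = rng.normal(0, 0.5, (n_hidden, n_in))    # input -> hidden
W_ctx = rng.normal(0, 0.5, (n_hidden, n_out))   # previous output -> hidden (context feedback)
W_out = rng.normal(0, 0.5, (n_out, n_hidden))   # hidden -> output
gate  = np.ones(n_hidden)                       # 1 = neuron active, 0 = switched off

def step(x, y_prev):
    """One forward step of the Jordan network."""
    h = np.tanh(W_in @ x + W_ctx @ y_prev) * gate
    y = W_out @ h
    return h, y

def sensitivity_ratio(series):
    """Hypothetical per-neuron sensitivity ratio: each hidden unit's share of the
    squared output contribution accumulated over the series (an assumption for
    illustration, not the paper's formula)."""
    y_prev = np.zeros(n_out)
    contrib = np.zeros(n_hidden)
    for x in series:
        h, y_prev = step(np.atleast_1d(x), y_prev)
        contrib += (np.abs(W_out[0]) * np.abs(h)) ** 2
    return contrib / (contrib.sum() + 1e-12)

# Toy data: predict x[t+1] from x[t] on a noisy sine wave.
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.05 * rng.normal(size=t.size)

# Plain online gradient descent on the output weights with a fixed learning
# rate (a deliberate simplification of the constrained learning scheme).
lr = 0.05
for epoch in range(20):
    y_prev = np.zeros(n_out)
    for x, target in zip(series[:-1], series[1:]):
        h, y = step(np.atleast_1d(x), y_prev)
        err = y - target
        W_out -= lr * np.outer(err, h)   # gradient step on output layer only
        y_prev = y

# Switch off hidden neurons whose (hypothetical) sensitivity ratio is negligible,
# mimicking the idea of suppressing excessive hidden-layer contributions.
ratios = sensitivity_ratio(series)
gate[ratios < 0.01] = 0.0
print("sensitivity ratios:", np.round(ratios, 3))
print("active hidden neurons:", int(gate.sum()), "of", n_hidden)
```

Only the output weights are trained here to keep the sketch short; the published method trains the full multilayered RNN and ties both weight initialization and hidden-neuron selection to weight convergence and stability conditions, which this toy example does not attempt to reproduce.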

Source journal: IEEE Transactions on Neural Networks (Engineering: Electronic & Electrical)
Self-citation rate: 0.00% · Articles published: 2 · Review time: 8.7 months