Robust Jordan network for nonlinear time series prediction
Q. Song
The 2011 International Joint Conference on Neural Networks, published 2011-10-03
DOI: 10.1109/IJCNN.2011.6033550
Citations: 4
Abstract
We propose a robust initialization of the Jordan network with recurrent constrained learning (RIJNRCL) algorithm for multilayered recurrent neural networks (RNNs). This novel algorithm builds on the constrained learning concept for the Jordan network, using recurrent sensitivity and weight convergence analysis to obtain a tradeoff between training and testing errors. In addition to using the classical techniques of an adaptive learning rate and an adaptive dead zone, RIJNRCL employs a recurrent constrained parameter matrix to switch off excessive contributions from the hidden-layer neurons, based on the weight convergence and stability conditions of the multilayered RNNs.
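For readers unfamiliar with the architecture the paper builds on: a Jordan network differs from an Elman-style RNN in that the *output* of the previous time step, rather than the hidden state, is fed back as context input to the hidden layer. The sketch below shows only this standard forward pass, not the paper's RIJNRCL initialization or its constrained parameter matrix; all names (`jordan_forward`, `W_in`, `W_ctx`, `W_out`) are illustrative assumptions.

```python
import numpy as np

def jordan_forward(x_seq, W_in, W_ctx, W_out):
    """Forward pass of a simple Jordan network.

    At each step the hidden layer receives the current input x_t
    and the previous network output y_{t-1} (the Jordan context).
    Note: this is a generic sketch, not the RIJNRCL algorithm.
    """
    n_out = W_out.shape[0]
    y = np.zeros(n_out)          # context (previous output) starts at zero
    outputs = []
    for x in x_seq:
        # hidden layer sees the input plus the fed-back output
        h = np.tanh(W_in @ x + W_ctx @ y)
        y = W_out @ h            # linear output layer
        outputs.append(y)
    return np.array(outputs)

# Illustrative usage with random weights: 3 inputs, 4 hidden units, 2 outputs.
rng = np.random.default_rng(0)
W_in = rng.standard_normal((4, 3))
W_ctx = rng.standard_normal((4, 2))
W_out = rng.standard_normal((2, 4))
x_seq = rng.standard_normal((5, 3))   # sequence of 5 time steps
y_seq = jordan_forward(x_seq, W_in, W_ctx, W_out)
```

The paper's contribution concerns how such a network's weights are initialized and constrained during training; the recurrence through `y` is what makes the sensitivity and weight convergence analysis nontrivial.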