Title: Separable recursive training algorithms for feedforward neural networks
Authors: V. Asirvadam, S.F. McLoone, G. Irwin
Published in: Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN'02), Cat. No. 02CH37290
Publication date: 2002-08-07
DOI: 10.1109/IJCNN.2002.1007667
Citations: 8
Abstract
Novel separable recursive training strategies are derived for feedforward neural networks. These hybrid algorithms combine recursive nonlinear optimization of the hidden-layer weights with recursive least-squares estimation of the linear output-layer weights in one integrated routine. Experimental results on two benchmark problems demonstrate the superiority of the new hybrid training schemes over their conventional counterparts.
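The abstract's core idea — exploiting the separability of a single-hidden-layer network, whose output is linear in the output-layer weights, so those weights can be tracked by recursive least squares (RLS) while the nonlinear hidden-layer weights are updated by a recursive nonlinear optimizer — can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the network size, the synthetic target function, and the use of a plain stochastic-gradient step for the hidden layer are all illustrative assumptions standing in for the nonlinear recursive optimizers studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-hidden-layer network: y = w2 . tanh(W1 @ [x; 1])
n_in, n_hid = 2, 6
W1 = 0.5 * rng.standard_normal((n_hid, n_in + 1))  # nonlinear hidden-layer weights (+bias)
w2 = np.zeros(n_hid)                               # linear output-layer weights

# RLS state for the linear output layer
P = 1e3 * np.eye(n_hid)   # inverse-covariance estimate
lam = 0.99                # forgetting factor

def hidden(x):
    """Hidden-layer activations for one input sample."""
    return np.tanh(W1 @ np.append(x, 1.0))

def target(x):
    """Illustrative smooth 2-D target (stand-in for a benchmark problem)."""
    return np.sin(x[0]) + 0.5 * x[1]

eta = 0.05   # step size for the recursive hidden-layer update
errs = []
for t in range(3000):
    x = rng.uniform(-2.0, 2.0, n_in)
    d = target(x)
    h = hidden(x)
    e = d - w2 @ h
    errs.append(abs(e))

    # 1) RLS update of the linear output-layer weights
    k = P @ h / (lam + h @ P @ h)
    w2 = w2 + k * e
    P = (P - np.outer(k, h @ P)) / lam

    # 2) Recursive gradient step on the nonlinear hidden-layer weights
    #    (a simple stochastic-gradient stand-in for the paper's
    #    nonlinear recursive optimizer)
    dh = (1.0 - h**2) * w2 * e              # backprop through tanh
    W1 += eta * np.outer(dh, np.append(x, 1.0))

early_err = float(np.mean(errs[:100]))   # average |error| over first 100 samples
late_err = float(np.mean(errs[-100:]))   # average |error| over last 100 samples
```

The two updates run in one integrated per-sample routine, which is the structural point of the hybrid schemes: the linear sub-problem gets the fast, well-conditioned RLS treatment, while only the hidden layer requires nonlinear recursion.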