{"title":"Step acceleration based training algorithm for feedforward neural networks","authors":"Yanlai Li, Kuanquan Wang, David Zhang","doi":"10.1109/ICPR.2002.1048243","DOIUrl":null,"url":null,"abstract":"This paper presents a very fast step acceleration based training algorithm (SATA) for multilayer feedforward neural network training. The most outstanding virtue of this algorithm is that it does not need to calculate the gradient of the target function. In each iteration step, the computation only concentrates on the corresponding varied part. The proposed algorithm has attributes in simplicity, flexibility and feasibility, as well as high speed of convergence. Compared with the other methods, including the conventional backpropagation (BP), conjugate gradient, and weight extrapolation based BP, many simulations confirmed the superiority of this algorithm in terms of converging speed and computation time required.","PeriodicalId":159502,"journal":{"name":"Object recognition supported by user interaction for service robots","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Object recognition supported by user interaction for service robots","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPR.2002.1048243","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
This paper presents a very fast step acceleration based training algorithm (SATA) for multilayer feedforward neural networks. Its most notable advantage is that it does not require computing the gradient of the objective function. In each iteration, computation is confined to the part of the network affected by the current weight change. The proposed algorithm is simple, flexible, and easy to implement, and it converges quickly. Simulations comparing it with other methods, including conventional backpropagation (BP), the conjugate gradient method, and weight-extrapolation-based BP, confirm its superiority in convergence speed and required computation time.
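Since only the abstract is available here, the snippet below is a minimal illustrative sketch, not the authors' SATA implementation: it shows how a gradient-free, per-weight "step acceleration" update could look on a tiny 2-2-1 network learning XOR. The per-weight signed steps and the acceleration/reversal factors (accel, reverse) are hypothetical choices, and the sketch recomputes the full error rather than only the "varied part" that the paper exploits.

```python
# Illustrative sketch only (not the paper's SATA algorithm): a derivative-free,
# per-weight step-acceleration update on a tiny 2-2-1 feedforward network.
# The step sizes and acceleration/reversal factors are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(params):
    """Mean squared error of the 2-2-1 network on the toy data."""
    w1, b1, w2, b2 = params
    h = sigmoid(X @ w1 + b1)
    out = sigmoid(h @ w2 + b2)
    return np.mean((out - y) ** 2)

# Weights/biases, plus one signed step per parameter.
params = [rng.normal(scale=0.5, size=s) for s in [(2, 2), (2,), (2, 1), (1,)]]
steps = [np.full(p.shape, 0.1) for p in params]
accel, reverse = 1.2, -0.5  # hypothetical: grow step on success, reverse and shrink on failure

best = mse(params)
for epoch in range(2000):
    for p, s in zip(params, steps):
        for idx in np.ndindex(p.shape):
            p[idx] += s[idx]       # trial move of a single weight (no gradient used)
            e = mse(params)
            if e < best:           # improvement: keep the move, accelerate the step
                best = e
                s[idx] *= accel
            else:                  # no improvement: undo, then reverse and shrink the step
                p[idx] -= s[idx]
                s[idx] *= reverse

print("final MSE:", best)
```

The design intent this sketch tries to capture is the one stated in the abstract: weights are adjusted by trial steps whose sizes adapt over time, so no gradient of the objective function is ever computed.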