{"title":"A neural network to approximate nonlinear functions","authors":"A. Bernardini, S. de Fina","doi":"10.1109/MWSCAS.1991.252103","DOIUrl":null,"url":null,"abstract":"A neural network approach to the problem of approximating any nonlinear continuous function is provided. The results obtained are related to the single-variable case, but the main conclusions can be generalized for the multidimensional case. The net is a modified perceptron with one hidden layer of sigmoidal units and two intermediate output linear units that are linearly combined to provide the final mapping. In particular, the problem concerning the starting weight configuration and the conditions that guarantee the correct learning with a random setting is analyzed. Other neural computations providing similar solutions to the approximation problem suffer from convergence to a local minimum if the starting network configuration is arbitrarily chosen, thus requiring a previous computation of the interpolating parameters that provides a weights setting quite close to the global optimum. In the present approach, one of the intermediate outputs is somewhat related to the curve derivative so that the overall net behavior can be viewed as a curve derivative integrator in which the second output is related to the constant term to be added to the undefined integral calculation. Simulation results, obtained after randomly setting the starting weight configuration, show excellent performance for all the trained functions.<<ETX>>","PeriodicalId":6453,"journal":{"name":"[1991] Proceedings of the 34th Midwest Symposium on Circuits and Systems","volume":"1 1","pages":"545-548 vol.1"},"PeriodicalIF":0.0000,"publicationDate":"1991-05-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"[1991] Proceedings of the 34th Midwest Symposium on Circuits and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MWSCAS.1991.252103","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
A neural network approach to approximating arbitrary nonlinear continuous functions is presented. The results concern the single-variable case, but the main conclusions generalize to the multidimensional case. The network is a modified perceptron with one hidden layer of sigmoidal units and two intermediate linear output units that are linearly combined to produce the final mapping. In particular, the starting weight configuration is analyzed, together with the conditions that guarantee correct learning from a random initialization. Other neural computations offering similar solutions to the approximation problem converge to a local minimum when the starting configuration is chosen arbitrarily, and therefore require a preliminary computation of the interpolating parameters to obtain a weight setting close to the global optimum. In the present approach, one of the intermediate outputs is related to the derivative of the target curve, so the overall network behaves as a curve-derivative integrator in which the second output supplies the constant term added in the indefinite-integral calculation. Simulation results obtained with randomly initialized starting weights show excellent performance for all the trained functions.
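The architecture described in the abstract (one sigmoidal hidden layer feeding two intermediate linear units, whose linear combination forms the final output) can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the authors' implementation: the class name `DerivativeIntegratorNet`, the hidden-layer width, and the Gaussian random initialization are assumptions chosen for the example; only the layer structure and the random starting weights follow the paper's description.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DerivativeIntegratorNet:
    """Single-input network: one sigmoidal hidden layer feeding two
    intermediate linear units -- per the abstract, one is related to the
    curve derivative and the other to the integration constant -- which
    are then linearly combined into the final scalar mapping."""

    def __init__(self, n_hidden=12):
        # Random starting weights, matching the paper's random-setting scenario.
        self.W1 = rng.normal(size=(n_hidden, 1))   # input -> hidden
        self.b1 = rng.normal(size=n_hidden)
        self.V  = rng.normal(size=(2, n_hidden))   # hidden -> two intermediate linear units
        self.c  = rng.normal(size=2)
        self.a  = rng.normal(size=2)               # final linear combination

    def forward(self, x):
        # x: 1-D array of N scalar inputs.
        h = sigmoid(self.W1 @ x[None, :] + self.b1[:, None])  # (n_hidden, N)
        y = self.V @ h + self.c[:, None]                       # (2, N) intermediate outputs
        return self.a @ y                                      # (N,) final mapping

# Example: evaluate the randomly initialized net on one period of a test grid.
x = np.linspace(0.0, 2.0 * np.pi, 100)
net = DerivativeIntegratorNet()
print(net.forward(x).shape)  # (100,)
```

The weights would then be trained by gradient descent on a regression loss; the paper's claim is that, with this two-output structure, such training succeeds from a random starting configuration rather than requiring a precomputed near-optimal initialization.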