A neural network to approximate nonlinear functions

A. Bernardini, S. de Fina
{"title":"A neural network to approximate nonlinear functions","authors":"A. Bernardini, S. de Fina","doi":"10.1109/MWSCAS.1991.252103","DOIUrl":null,"url":null,"abstract":"A neural network approach to the problem of approximating any nonlinear continuous function is provided. The results obtained are related to the single-variable case, but the main conclusions can be generalized for the multidimensional case. The net is a modified perceptron with one hidden layer of sigmoidal units and two intermediate output linear units that are linearly combined to provide the final mapping. In particular, the problem concerning the starting weight configuration and the conditions that guarantee the correct learning with a random setting is analyzed. Other neural computations providing similar solutions to the approximation problem suffer from convergence to a local minimum if the starting network configuration is arbitrarily chosen, thus requiring a previous computation of the interpolating parameters that provides a weights setting quite close to the global optimum. In the present approach, one of the intermediate outputs is somewhat related to the curve derivative so that the overall net behavior can be viewed as a curve derivative integrator in which the second output is related to the constant term to be added to the undefined integral calculation. Simulation results, obtained after randomly setting the starting weight configuration, show excellent performance for all the trained functions.<<ETX>>","PeriodicalId":6453,"journal":{"name":"[1991] Proceedings of the 34th Midwest Symposium on Circuits and Systems","volume":"1 1","pages":"545-548 vol.1"},"PeriodicalIF":0.0000,"publicationDate":"1991-05-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"[1991] Proceedings of the 34th Midwest Symposium on Circuits and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MWSCAS.1991.252103","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

A neural network approach to the problem of approximating any nonlinear continuous function is provided. The results obtained concern the single-variable case, but the main conclusions can be generalized to the multidimensional case. The net is a modified perceptron with one hidden layer of sigmoidal units and two intermediate linear output units that are linearly combined to provide the final mapping. In particular, the problem of the starting weight configuration, and the conditions that guarantee correct learning with a random setting, is analyzed. Other neural computations providing similar solutions to the approximation problem suffer from convergence to a local minimum if the starting network configuration is chosen arbitrarily, and therefore require a prior computation of the interpolating parameters that yields a weight setting quite close to the global optimum. In the present approach, one of the intermediate outputs is related to the curve derivative, so that the overall net behavior can be viewed as a curve-derivative integrator in which the second output supplies the constant term to be added in the indefinite integral calculation. Simulation results, obtained after randomly setting the starting weight configuration, show excellent performance for all the trained functions.
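The architecture described in the abstract lends itself to a compact sketch. The following is a minimal, hypothetical Python/NumPy rendering of the forward pass only: a single input variable feeding one sigmoidal hidden layer, two intermediate linear output units, and a final linear combination of the two, starting from a randomly drawn weight configuration as the paper emphasizes. The hidden-layer size, initialization scale, and all names are illustrative assumptions, not taken from the paper, and no training procedure or derivative-integrator behavior is implemented.

```python
# Minimal sketch (not the authors' code) of the abstract's architecture:
# one sigmoidal hidden layer, two intermediate linear outputs, and a final
# linear combination, with a random starting weight configuration.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TwoOutputPerceptron:
    def __init__(self, n_hidden=20, scale=0.5):  # sizes are assumptions
        # Random starting configuration, as analyzed in the paper.
        self.W1 = rng.normal(0.0, scale, (n_hidden, 1))  # input -> hidden
        self.b1 = rng.normal(0.0, scale, (n_hidden, 1))
        self.W2 = rng.normal(0.0, scale, (2, n_hidden))   # hidden -> two linear units
        self.b2 = rng.normal(0.0, scale, (2, 1))
        self.c  = rng.normal(0.0, scale, (1, 2))          # combines the two outputs

    def forward(self, x):
        # x: array of shape (1, n_samples), the single input variable.
        h = sigmoid(self.W1 @ x + self.b1)   # sigmoidal hidden layer
        y = self.W2 @ h + self.b2            # two intermediate linear outputs
        return self.c @ y                    # final linear combination

net = TwoOutputPerceptron()
x = np.linspace(-1.0, 1.0, 5).reshape(1, -1)
print(net.forward(x))  # untrained forward pass on five sample points
```

In the paper's interpretation, one of the two intermediate outputs would track the derivative of the target curve and the other the constant of integration; the sketch above only fixes the wiring that makes such a decomposition possible.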