A feed forward neural network with resolution properties for function approximation and modeling

P. H. F. D. Silva, E. Fernandes, A. Neto
DOI: 10.1109/SBRN.2002.1181435
Published: 2002-11-11, in VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.
Citations: 4

Abstract

This paper presents the development of a novel feed-forward artificial neural network paradigm. In its formulation, the hidden neurons are defined by the use of sample activation functions, parameterized by amplitude, width, and translation. Further, the hidden neurons are classified as low- and high-resolution neurons, with global and local approximation properties, respectively. The gradient method was applied to obtain simple recursive relations for training the paradigm. The results of the applications showed interesting properties of the paradigm: (i) easy choice of the neural network size; (ii) fast training; (iii) a strong ability to perform complicated function approximation and nonlinear modeling.
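The abstract describes hidden neurons whose activation functions carry per-neuron amplitude, width, and translation parameters, all adapted by the gradient method. A minimal sketch of that idea, assuming a Gaussian-shaped basis function and a single-input, single-output network trained with plain gradient descent (the paper's exact "sample activation functions" and training recursions are not reproduced here):

```python
import numpy as np

# Hedged sketch: one hidden layer whose units compute a * phi((x - t) / w),
# with per-neuron amplitude a, width w, and translation t. The Gaussian
# basis phi is an assumption for illustration; the paper's own activation
# functions may differ.

rng = np.random.default_rng(0)

def phi(z):
    """Assumed bump-shaped basis function (Gaussian)."""
    return np.exp(-z**2)

def predict(x, a, w, t):
    """x: (N,); a, w, t: (H,) -> predictions (N,)."""
    z = (x[:, None] - t) / w          # (N, H) scaled, translated inputs
    return phi(z) @ a                 # amplitude-weighted sum of hidden outputs

# Toy target: approximate sin(x) on [0, 2*pi]
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x)

H = 8                                  # network size: number of hidden neurons
a = rng.normal(0, 0.1, H)              # amplitudes
w = np.full(H, 1.0)                    # widths
t = np.linspace(0, 2 * np.pi, H)       # translations spread over the domain

lr = 0.05
for _ in range(2000):
    z = (x[:, None] - t) / w
    g = phi(z)                         # (N, H) hidden activations
    err = g @ a - y                    # (N,) residuals
    # Gradients of the mean squared error w.r.t. a, t, w
    grad_a = g.T @ err / len(x)
    dphi = -2 * z * g                  # phi'(z) for the Gaussian basis
    grad_t = (dphi * (-1 / w)).T @ err * a / len(x)
    grad_w = (dphi * (-z / w)).T @ err * a / len(x)
    a -= lr * grad_a
    t -= lr * grad_t
    w -= lr * grad_w

mse = np.mean((predict(x, a, w, t) - y) ** 2)
```

Because the translation and width parameters are themselves trainable, a neuron can either spread out (small contribution everywhere, the "low-resolution" global behavior) or narrow onto a region (the "high-resolution" local behavior), which is the intuition the abstract attributes to the two neuron classes.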