{"title":"具有s型激活函数的概率神经网络的通用逼近","authors":"R. Murugadoss, M. Ramakrishnan","doi":"10.1109/ICAETR.2014.7012920","DOIUrl":null,"url":null,"abstract":"In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functional can uniformly approximate any continuous function of n real variables with support in the unit hypercube; only mild conditions are imposed on the univariate function. Our results settle an open question about representability in the class of single bidden layer neural networks. In particular, we show that arbitrary decision regions can be arbitrarily well approximated by continuous feedforward neural networks with only a single internal, hidden layer and any continuous sigmoidal nonlinearity. The paper discusses approximation properties of other possible types of nonlinearities that might be implemented by artificial neural networks. The daily registration has N cases that each of the well-known stimulus-answer couples represents. The objective of this work is to develop a function that allows finding the vector of entrance variables t to the vector of exit variables P. F is any function, in this case the electric power consumption. Their modeling with Artificial Neural Network (ANN) is Multi a Perceptron Layer (PMC). Another form of modeling it is using Interpolation Algorithms (AI).","PeriodicalId":196504,"journal":{"name":"2014 International Conference on Advances in Engineering & Technology Research (ICAETR - 2014)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Universal approximation using probabilistic neural networks with sigmoid activation functions\",\"authors\":\"R. Murugadoss, M. Ramakrishnan\",\"doi\":\"10.1109/ICAETR.2014.7012920\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functional can uniformly approximate any continuous function of n real variables with support in the unit hypercube; only mild conditions are imposed on the univariate function. Our results settle an open question about representability in the class of single bidden layer neural networks. In particular, we show that arbitrary decision regions can be arbitrarily well approximated by continuous feedforward neural networks with only a single internal, hidden layer and any continuous sigmoidal nonlinearity. The paper discusses approximation properties of other possible types of nonlinearities that might be implemented by artificial neural networks. The daily registration has N cases that each of the well-known stimulus-answer couples represents. The objective of this work is to develop a function that allows finding the vector of entrance variables t to the vector of exit variables P. F is any function, in this case the electric power consumption. Their modeling with Artificial Neural Network (ANN) is Multi a Perceptron Layer (PMC). 
Another form of modeling it is using Interpolation Algorithms (AI).\",\"PeriodicalId\":196504,\"journal\":{\"name\":\"2014 International Conference on Advances in Engineering & Technology Research (ICAETR - 2014)\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2014 International Conference on Advances in Engineering & Technology Research (ICAETR - 2014)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAETR.2014.7012920\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 International Conference on Advances in Engineering & Technology Research (ICAETR - 2014)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAETR.2014.7012920","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Universal approximation using probabilistic neural networks with sigmoid activation functions
In this paper we demonstrate that finite linear combinations of compositions of a fixed univariate function and a set of affine functionals can uniformly approximate any continuous function of n real variables with support in the unit hypercube; only mild conditions are imposed on the univariate function. Our results settle an open question about representability in the class of single-hidden-layer neural networks. In particular, we show that arbitrary decision regions can be arbitrarily well approximated by continuous feedforward neural networks with only a single internal hidden layer and any continuous sigmoidal nonlinearity. The paper also discusses the approximation properties of other types of nonlinearities that might be implemented by artificial neural networks. The daily record contains N cases, each corresponding to a known stimulus-response pair. The objective of this work is to develop a function F that maps the vector of input variables t to the vector of output variables P; F may be any function, in this case the electric power consumption. These data are modeled with an Artificial Neural Network (ANN), specifically a Multilayer Perceptron (MLP). An alternative form of modeling uses Interpolation Algorithms.
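To make the approximation form concrete, the following is a minimal numerical sketch of a single-hidden-layer sigmoid network G(x) = sum_j alpha_j * sigmoid(w_j . x + b_j) fitted to a continuous function on the unit hypercube. It is written in Python with numpy; the target function, hidden width, sampling, and the least-squares fit of the output weights are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

# Sketch of the approximation form discussed above:
#   G(x) = sum_j alpha_j * sigmoid(w_j . x + b_j)
# The target function, hidden width, and the least-squares fit of the
# output weights are illustrative choices only, not the paper's method.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Continuous target on the unit hypercube [0, 1]^2 (illustrative choice).
def f(x):
    return np.sin(2 * np.pi * x[:, 0]) * x[:, 1]

n_hidden = 200                        # number of sigmoidal units
X = rng.uniform(0.0, 1.0, (2000, 2))  # samples from the unit hypercube
y = f(X)

# Fixed random affine maps w_j . x + b_j feeding the sigmoid units.
W = rng.normal(0.0, 4.0, (2, n_hidden))
b = rng.normal(0.0, 4.0, n_hidden)
H = sigmoid(X @ W + b)                # hidden-layer activations

# Output weights alpha_j chosen by least squares.
alpha, *_ = np.linalg.lstsq(H, y, rcond=None)

# Measure the worst-case error on held-out points in the hypercube.
X_test = rng.uniform(0.0, 1.0, (5000, 2))
err = np.max(np.abs(sigmoid(X_test @ W + b) @ alpha - f(X_test)))
print(f"max abs error on test points: {err:.4f}")
```

Increasing the number of sigmoidal units drives the worst-case error down, which is the behavior the uniform-approximation result describes; the sketch only illustrates the functional form, not a convergence proof.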