{"title":"复杂递归神经网络的综合与分析方法","authors":"K. Nikolic, B. Abramović, I. Šćepanović","doi":"10.1109/NEUREL.2006.341180","DOIUrl":null,"url":null,"abstract":"This paper presents an approach to optimization of recurrent artificial neural networks (RNN) that leans on the appliance of stochastic search (SS). Favor algorithm SS with information accumulation (SSAI) is simple in numerical sense, and does not require a lot of computing time in the optimization process i.e. RNN training, and gives suboptimal results in comparison to gradient methods. In certain sense, suggested approach more appropriate for engineering practice than back propagation error (BPE) method, because it does not condition the differentiability of activation neuron function, as well as transformation of RNN in corresponding multi-layered network with forward propagation signal, and after that gave the problem with a great deal of dimensions. Behind the corresponding theoretical analysis, SSAI is applied on optimization of structure and RNN parameters (supervised learning algorithm), for creation of predictive model which serves for content of useful component in input raw material in technological process of flotation in real time","PeriodicalId":231606,"journal":{"name":"2006 8th Seminar on Neural Network Applications in Electrical Engineering","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"An approach to Synthesis and Analysis of Complex Recurrent Neural Network\",\"authors\":\"K. Nikolic, B. Abramović, I. Šćepanović\",\"doi\":\"10.1109/NEUREL.2006.341180\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an approach to optimization of recurrent artificial neural networks (RNN) that leans on the appliance of stochastic search (SS). Favor algorithm SS with information accumulation (SSAI) is simple in numerical sense, and does not require a lot of computing time in the optimization process i.e. RNN training, and gives suboptimal results in comparison to gradient methods. In certain sense, suggested approach more appropriate for engineering practice than back propagation error (BPE) method, because it does not condition the differentiability of activation neuron function, as well as transformation of RNN in corresponding multi-layered network with forward propagation signal, and after that gave the problem with a great deal of dimensions. 
Behind the corresponding theoretical analysis, SSAI is applied on optimization of structure and RNN parameters (supervised learning algorithm), for creation of predictive model which serves for content of useful component in input raw material in technological process of flotation in real time\",\"PeriodicalId\":231606,\"journal\":{\"name\":\"2006 8th Seminar on Neural Network Applications in Electrical Engineering\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2006-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2006 8th Seminar on Neural Network Applications in Electrical Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NEUREL.2006.341180\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 8th Seminar on Neural Network Applications in Electrical Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NEUREL.2006.341180","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An approach to Synthesis and Analysis of Complex Recurrent Neural Network
This paper presents an approach to the optimization of recurrent artificial neural networks (RNN) that relies on the application of stochastic search (SS). The favored algorithm, SS with accumulation of information (SSAI), is numerically simple, does not require much computing time during the optimization process, i.e. RNN training, and gives suboptimal results in comparison with gradient methods. In a certain sense, the suggested approach is more appropriate for engineering practice than the back-propagation of error (BPE) method, because it requires neither the differentiability of the neuron activation function nor the transformation of the RNN into a corresponding multi-layered feed-forward network, which leads to a problem of very high dimensionality. Following the corresponding theoretical analysis, SSAI is applied to the optimization of the structure and parameters of an RNN (a supervised learning algorithm) in order to build a predictive model that estimates, in real time, the content of the useful component in the input raw material of the flotation technological process.
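The abstract describes SSAI only at a high level, so the following Python fragment is a minimal, illustrative sketch of the general idea: training an RNN by derivative-free stochastic search, keeping only weight perturbations that reduce the training error, with an assumed "accumulation of information" rule implemented as a drift vector that remembers previously successful perturbation directions. The toy sine-prediction task, network sizes, step size, and the drift rule are all assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' exact SSAI algorithm): training a small
# recurrent network by stochastic search instead of back-propagation.
# Only the forward pass is evaluated, so no differentiability of the
# activation function is required.
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised task (assumed): predict the next value of a noisy sine sequence.
T = 60
x = np.sin(np.linspace(0, 6 * np.pi, T + 1)) + 0.05 * rng.standard_normal(T + 1)
inputs, targets = x[:-1], x[1:]

n_in, n_hid, n_out = 1, 8, 1  # assumed network sizes

def init_params():
    return {
        "W_in": 0.5 * rng.standard_normal((n_hid, n_in)),
        "W_rec": 0.5 * rng.standard_normal((n_hid, n_hid)),
        "W_out": 0.5 * rng.standard_normal((n_out, n_hid)),
    }

def rnn_loss(p):
    """Mean squared prediction error of the recurrent network over the sequence."""
    h = np.zeros(n_hid)
    err = 0.0
    for u, d in zip(inputs, targets):
        h = np.tanh(p["W_in"] @ np.array([u]) + p["W_rec"] @ h)
        y = p["W_out"] @ h
        err += float((y[0] - d) ** 2)
    return err / len(inputs)

# Stochastic search with an assumed "accumulation of information" rule:
# a running drift vector remembers successful perturbation directions and
# biases new trial steps toward them.
params = init_params()
drift = {k: np.zeros_like(v) for k, v in params.items()}
best = rnn_loss(params)
sigma, beta = 0.1, 0.5  # step size and drift memory factor (assumed values)

for it in range(3000):
    step = {k: sigma * rng.standard_normal(v.shape) + beta * drift[k]
            for k, v in params.items()}
    trial = {k: params[k] + step[k] for k in params}
    loss = rnn_loss(trial)
    if loss < best:
        # Keep only improving steps and reinforce their direction.
        params, best = trial, loss
        drift = {k: beta * drift[k] + (1 - beta) * step[k] for k in params}
    else:
        # Gradually forget directions that stopped helping.
        drift = {k: beta * drift[k] for k in params}

print(f"final training MSE: {best:.4f}")
```

Accepting only improving steps makes the search monotone in training error; the drift term is one plausible reading of "accumulation of information", and the paper's actual update rule, as well as its simultaneous optimization of the network structure, may differ.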