{"title":"A more powerful random neural network model in supervised learning applications","authors":"Sebastián Basterrech, G. Rubino","doi":"10.1109/SOCPAR.2013.7054127","DOIUrl":null,"url":null,"abstract":"Since the early 1990s, Random Neural Networks (RNNs) have gained importance in the Neural Networks and Queueing Networks communities. RNNs are inspired by biological neural networks and they are also an extension of open Jackson's networks in Queueing Theory. In 1993, a learning algorithm of gradient type was introduced in order to use RNNs in supervised learning tasks. This method considers only the weight connections among the neurons as adjustable parameters. All other parameters are deemed fixed during the training process. The RNN model has been successfully utilized in several types of applications such as: supervised learning problems, pattern recognition, optimization, image processing, associative memory. In this contribution we present a modification of the classic model obtained by extending the set of adjustable parameters. The modification increases the potential of the RNN model in supervised learning tasks keeping the same network topology and the same time complexity of the algorithm. We describe the new equations implementing a gradient descent learning technique for the model.","PeriodicalId":315126,"journal":{"name":"2013 International Conference on Soft Computing and Pattern Recognition (SoCPaR)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 International Conference on Soft Computing and Pattern Recognition (SoCPaR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SOCPAR.2013.7054127","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Since the early 1990s, Random Neural Networks (RNNs) have gained importance in the Neural Networks and Queueing Networks communities. RNNs are inspired by biological neural networks, and they are also an extension of open Jackson networks in Queueing Theory. In 1993, a gradient-based learning algorithm was introduced to apply RNNs to supervised learning tasks. This method considers only the weighted connections among the neurons as adjustable parameters; all other parameters are held fixed during the training process. The RNN model has been successfully applied in several types of applications, such as supervised learning problems, pattern recognition, optimization, image processing, and associative memory. In this contribution we present a modification of the classic model, obtained by extending the set of adjustable parameters. The modification increases the potential of the RNN model in supervised learning tasks while keeping the same network topology and the same time complexity of the algorithm. We describe the new equations implementing a gradient descent learning technique for the model.
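For context, a minimal sketch of the classic (Gelenbe) RNN signal-flow equations that this kind of gradient learning builds on: each neuron i has a steady-state activation q_i = lambda_plus_i / (r_i + lambda_minus_i), where the excitatory and inhibitory arrival rates depend on the other activations through the network weights. The sketch below solves this nonlinear system by fixed-point iteration; the function and variable names are illustrative assumptions, not taken from the paper, and the paper's extended parameter set is not modeled here.

```python
import numpy as np

def rnn_activations(W_plus, W_minus, Lambda, lam, r, tol=1e-9, max_iter=1000):
    """Fixed-point iteration for the steady-state activations q_i of a
    classic Gelenbe Random Neural Network (illustrative sketch).

    q_i = lambda_plus_i / (r_i + lambda_minus_i), with
        lambda_plus_i  = Lambda_i + sum_j q_j * W_plus[j, i]
        lambda_minus_i = lam_i    + sum_j q_j * W_minus[j, i]

    W_plus[j, i]  -- excitatory weight from neuron j to i (r_j * p+_{ji})
    W_minus[j, i] -- inhibitory weight from neuron j to i (r_j * p-_{ji})
    Lambda, lam   -- external excitatory / inhibitory arrival rate vectors
    r             -- neuron firing-rate vector
    """
    n = len(r)
    q = np.zeros(n)
    for _ in range(max_iter):
        lam_plus = Lambda + q @ W_plus    # total excitatory input to each neuron
        lam_minus = lam + q @ W_minus     # total inhibitory input to each neuron
        q_new = np.clip(lam_plus / (r + lam_minus), 0.0, 1.0)
        if np.max(np.abs(q_new - q)) < tol:
            break
        q = q_new
    return q
```

In the 1993 gradient algorithm referenced in the abstract, only the weights W_plus and W_minus are adjusted by gradient descent on a quadratic output error; the contribution of this paper is to also treat parameters that are fixed above (e.g., the firing rates r) as trainable, without changing the topology or the per-iteration cost.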