Activation of connections to accelerate the learning in recurrent back-propagation
R. Kamimura
CompEuro 1992 Proceedings Computer Systems and Software Engineering, 1992-05-04
DOI: 10.1109/CMPEUR.1992.218512
Abstract
A method of accelerating learning in recurrent neural networks is described. To activate and exploit connections, a complexity term, defined by an equation in the paper, was added to the standard quadratic error function. In the experiments, the derivative of the complexity term acted mainly on positive connections, while negative connections were pushed toward smaller values. Some connections were therefore expected to be activated and to grow large enough to speed up learning. It was confirmed that the complexity term increased the variance of the connections, especially the hidden connections, and that some connections, in particular hidden connections, were eventually activated and grew large enough to speed up learning.
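The abstract does not reproduce the equation defining the complexity term, but its qualitative behaviour can be sketched: a penalty whose gradient encourages positive connections to grow (activating them) while decaying negative connections toward zero, added on top of the usual error gradient. The following is a minimal, hypothetical sketch of one such update step; the specific penalty form, learning rate, and coefficient `lam` are assumptions, not the paper's actual definitions.

```python
import numpy as np

def update_weights(w, grad_error, lr=0.1, lam=0.1):
    """One gradient step on E = quadratic error + lam * complexity term.

    Hypothetical asymmetric complexity term: its derivative is -1 for
    positive connections (so the update pushes them to grow, i.e. they
    are "activated") and equals w for negative connections (so the
    update decays them toward zero, i.e. smaller values).
    """
    complexity_grad = np.where(w > 0, -1.0, w)
    return w - lr * (grad_error + lam * complexity_grad)

# With a zero error gradient, the penalty alone grows the positive
# connection and shrinks the magnitude of the negative one.
w = np.array([0.5, -0.5])
w_new = update_weights(w, grad_error=np.zeros(2))
```

Such an asymmetric penalty would increase the spread (variance) of the connection weights, consistent with the effect the abstract reports for the hidden connections.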