Title: Introduction of a Hebbian unsupervised learning algorithm to boost the encoding capacity of Hopfield networks
Authors: C. Molter, U. Salihoglu, H. Bersini
DOI: 10.1109/IJCNN.2005.1556109
Published in: Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
Publication date: 2005-12-27
Citations: 18
Abstract
The impact of an iterative supervised Hebbian learning algorithm on a recurrent neural network's underlying dynamics was discussed in a previous paper. It was argued that those results are in line with the observations made by Freeman in the olfactory bulb of the rabbit: cycles are used to store information, and chaotic dynamics appears as the background regime composed of those cyclic "memory bags". To get closer to a biological point of view, this paper introduces an unsupervised version of this Hebbian algorithm. As a direct result, both the storage capacity and the content addressability of the learned networks are greatly enhanced. Furthermore, striking dynamical results are observed: although the learning process increases the dimension of the potential attractors, less chaoticity is found than with the supervised learning process. Moreover, the chaos obtained looks more structured, consisting of brief itinerancy among the learned cycles.
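For context on the storage capacity and content addressability the abstract refers to, the sketch below shows the classic one-shot Hebbian rule for a Hopfield network, which is the textbook baseline the paper's iterative unsupervised algorithm improves upon. This is not the authors' algorithm; the function names, network size, and corruption level are illustrative assumptions.

```python
import numpy as np

def hebbian_weights(patterns):
    """Classic one-shot Hebbian storage rule for a Hopfield network.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    Returns a symmetric N x N weight matrix with zero diagonal.
    """
    P, N = patterns.shape
    W = patterns.T @ patterns / N   # sum of outer products, scaled by N
    np.fill_diagonal(W, 0.0)        # no self-connections
    return W

def recall(W, state, steps=20):
    """Synchronous sign updates until a fixed point (or `steps` iterations)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1           # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
N, P = 64, 3                        # load P/N well below the ~0.138 capacity limit
patterns = rng.choice([-1, 1], size=(P, N))
W = hebbian_weights(patterns)

# Content addressability: corrupt a stored pattern, then let the
# network relax back toward the nearest stored attractor.
probe = patterns[0].copy()
flip = rng.choice(N, size=8, replace=False)
probe[flip] *= -1
out = recall(W, probe)
```

At this low load the corrupted probe is pulled back to the stored pattern; the unsupervised algorithm of the paper is reported to push both the number of storable (cyclic) memories and this basin-of-attraction behavior well beyond what the one-shot rule achieves.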