Continuous attractors in recurrent neural networks and phase space learning
Rogério de Oliveira, L. Monteiro
Proceedings. Vol.1. Sixth Brazilian Symposium on Neural Networks, 2000-01-22
DOI: 10.1109/SBRN.2000.889763
Recurrent networks can be used as associative memories in which the stored memories are fixed points to which the network dynamics converges. These networks, however, can also exhibit continuous attractors, such as limit cycles and chaotic attractors. We argue for the use of such attractors in recurrent networks for the construction of associative memories. We provide a training algorithm for continuous attractors and present numerical results of the learning method, which involves genetic algorithms.
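The abstract describes training a recurrent network so that a continuous attractor, such as a limit cycle, stores a memory, with the training driven by a genetic algorithm over points sampled in phase space. The paper's own algorithm is not reproduced here; the following is only a minimal illustrative sketch of that general idea, with all network sizes, mutation scales, and population parameters chosen arbitrarily rather than taken from the paper. It evolves the weight matrix of a tiny recurrent network so that its orbit tracks a target limit cycle (points on the unit circle):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target continuous attractor: T points sampled on the unit circle (a limit cycle).
T = 60
theta = np.linspace(0, 2 * np.pi, T, endpoint=False)
target = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # shape (T, 2)

def simulate(W, x0, steps):
    """Iterate a two-unit recurrent network x_{t+1} = tanh(W @ x_t)."""
    xs = [x0]
    for _ in range(steps - 1):
        xs.append(np.tanh(W @ xs[-1]))
    return np.array(xs)

def fitness(W):
    """Negative mean squared distance between the network orbit and the target cycle
    (phase-space flavored: the orbit is launched from a point of the target cycle)."""
    traj = simulate(W, target[0], T)
    return -np.mean((traj - target) ** 2)

# Minimal elitist genetic algorithm over the entries of the 2x2 weight matrix:
# keep the best individuals each generation and add Gaussian-mutated copies.
pop = [rng.normal(scale=0.5, size=(2, 2)) for _ in range(40)]
init_best = max(fitness(W) for W in pop)

for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = [p + rng.normal(scale=0.05, size=(2, 2)) for p in parents for _ in range(3)]
    pop = parents + children

best = max(pop, key=fitness)
```

Because the parents survive unchanged each generation, the best fitness is non-decreasing; a real phase space learning setup would sample many initial conditions around the target cycle rather than a single one, so that the cycle also attracts nearby trajectories.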