Performance enhancement of Willshaw type networks through the use of limit cycles
G. Kohring
Journal de Physique 51, 2387-2393 (1990). DOI: 10.1051/JPHYS:0199000510210238700
Simulation results for a Willshaw-type model storing sparsely coded patterns are presented. It is suggested that random patterns can be stored in Willshaw-type models by transforming them into a set of sparsely coded patterns and retrieving this set as a limit cycle. In this way, the number of steps needed to recall a pattern becomes a function of the amount of information the pattern contains. A general algorithm for simulating neural networks with sparsely coded patterns is also discussed; on a fully connected network of N = 36864 neurons (1.4×10^9 couplings), it achieves effective updating speeds as high as 1.6×10^11 coupling evaluations per second on a single Cray-YMP processor.
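The limit-cycle idea can be illustrated with a small sketch: clipped-Hebbian (Willshaw) couplings wire each sparse pattern to its successor, so that iterating the parallel dynamics steps through the whole stored set and returns to the start. This is only a toy illustration under assumed parameters (N = 256, k = 8 active bits, P = 40 patterns), not the paper's N = 36864 Cray-YMP implementation.

```python
import numpy as np

# Toy Willshaw-type network storing a set of sparsely coded patterns
# as a limit cycle (illustrative parameters, not from the paper).
rng = np.random.default_rng(0)

N = 256   # neurons
k = 8     # active bits per sparse pattern (k << N)
P = 40    # number of patterns in the cycle

# Sparse binary patterns with exactly k active bits each.
patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, size=k, replace=False)] = 1

# Clipped-Hebbian couplings: W[i, j] = 1 iff neuron j in some pattern
# co-occurs with neuron i in that pattern's successor. Pattern P-1
# maps back to pattern 0, closing the limit cycle.
succ = np.roll(patterns, -1, axis=0)
W = (succ.T @ patterns > 0).astype(int)

def step(x, theta=k):
    """One parallel update: a neuron fires iff it receives input from
    at least theta active neurons of the current state."""
    return (W @ x >= theta).astype(int)

# Starting from pattern 0, repeated updates traverse the stored set:
# each state contains (at least) the active bits of the next pattern.
x = patterns[0].copy()
trajectory = [x]
for _ in range(P):
    x = step(x)
    trajectory.append(x)
```

The abstract's point that recall time tracks information content shows up here directly: a pattern encoded as a cycle of sparse patterns needs as many update steps to retrieve as the cycle is long.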