Performance enhancement of Willshaw type networks through the use of limit cycles

G. Kohring
{"title":"Performance enhancement of Willshaw type networks through the use of limit cycles","authors":"G. Kohring","doi":"10.1051/JPHYS:0199000510210238700","DOIUrl":null,"url":null,"abstract":"Simulation results of a Willshaw type model for storing sparsely coded patterns are presented. It is suggested that random patterns can be stored in Willshaw type models by transforming them into a set of sparsely coded patterns and retrieving this set as a limit cycle. In this way, the number of steps needed to recall a pattern will be a function of the amount of information the pattern contains. A general algorithm for simulating neural networks with sparsely coded patterns is also discussed, and, on a fully connected network of N=36864 neurons (1.4×10 9 couplings), it is shown to achieve effective updaping speeds as high as 1.6×10 11 coupling evaluations per second on one Cray-YMP processor","PeriodicalId":14747,"journal":{"name":"Journal De Physique","volume":"21 1","pages":"2387-2393"},"PeriodicalIF":0.0000,"publicationDate":"1990-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal De Physique","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/JPHYS:0199000510210238700","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Simulation results of a Willshaw-type model for storing sparsely coded patterns are presented. It is suggested that random patterns can be stored in Willshaw-type models by transforming them into a set of sparsely coded patterns and retrieving this set as a limit cycle. In this way, the number of steps needed to recall a pattern is a function of the amount of information the pattern contains. A general algorithm for simulating neural networks with sparsely coded patterns is also discussed; on a fully connected network of N = 36864 neurons (1.4×10^9 couplings), it achieves effective updating speeds as high as 1.6×10^11 coupling evaluations per second on one Cray Y-MP processor.
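As a concrete illustration of the storage scheme sketched in the abstract, the snippet below stores a short cycle of sparse binary patterns with a clipped-Hebbian (Willshaw-type) rule applied to the transitions between consecutive patterns, then retrieves the cycle by iterating a thresholded parallel update. This is a minimal sketch under assumed parameters: the network size, sparseness, threshold choice, and the helper name `recall_step` are illustrative assumptions, not the paper's exact prescription.

```python
# Minimal sketch: Willshaw-type (clipped-Hebbian) storage of a limit cycle of
# sparse patterns.  Parameters, the threshold rule, and the function name are
# illustrative assumptions, not the paper's exact prescription.
import numpy as np

rng = np.random.default_rng(0)

N = 256   # neurons
L = 4     # sparse patterns forming one cycle
k = 8     # active units per sparse pattern (k << N)

# A cycle of L sparse binary patterns sigma^1, ..., sigma^L.
cycle = np.zeros((L, N), dtype=np.uint8)
for mu in range(L):
    cycle[mu, rng.choice(N, size=k, replace=False)] = 1

# Clipped-Hebbian storage of the transitions sigma^mu -> sigma^(mu+1):
# J_ij = 1 if sigma_i^(mu+1) * sigma_j^mu = 1 for any mu, else 0.
J = np.zeros((N, N), dtype=np.uint8)
for mu in range(L):
    J |= np.outer(cycle[(mu + 1) % L], cycle[mu])

def recall_step(state, theta=k):
    """One parallel update: unit i fires iff its input sum reaches theta."""
    h = J.astype(int) @ state.astype(int)
    return (h >= theta).astype(np.uint8)

# Start from a corrupted copy of the first pattern (two extra active units)
# and iterate; the trajectory should settle onto the stored cycle.
state = cycle[0].copy()
state[rng.choice(np.flatnonzero(state == 0), size=2, replace=False)] = 1
for t in range(2 * L):
    state = recall_step(state)
    hits = [mu + 1 for mu in range(L) if np.array_equal(state, cycle[mu])]
    print(f"step {t + 1}: matches stored pattern(s) {hits}")
```

Because the couplings are binary and only a few units are active at any step, each input sum involves only the active columns of J; this sparseness is broadly the property a bit-coded simulation algorithm can exploit to reach high coupling-evaluation rates on vector hardware.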