Robust Exponential Memory in Hopfield Networks.

IF 2.3 · JCR Q1 (Neuroscience) · CAS Region 4 (Medicine)
Christopher J Hillar, Ngoc M Tran
DOI: 10.1186/s13408-017-0056-2
Journal of Mathematical Neuroscience, vol. 8, no. 1, p. 1. Published 2018-01-16.
Citations: 20

Abstract



The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon's noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.
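The abstract's core recipe — symmetric binary Hopfield dynamics whose weights are learned by descending the convex probability-flow objective — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact construction: the energy parametrization, learning rate, and number of steps are arbitrary choices, and the probability flow is taken over single-bit-flip neighborhoods as in standard minimum probability flow for discrete models.

```python
import numpy as np

# Minimal sketch (NOT the paper's exact construction): a Hopfield network of
# McCulloch-Pitts binary {0,1} neurons with symmetric, zero-diagonal weights W
# and thresholds theta, trained by gradient descent on the probability-flow
# objective over one-bit-flip neighborhoods of the training patterns.

def mpf_objective_grads(W, theta, X):
    """Probability flow K and its gradients for data X (rows are patterns).

    With Hopfield energy E(x) = -x.W.x/2 + theta.x, flipping bit i changes
    the energy by dE_i = s_i * (theta_i - (W x)_i), where s_i = 1 - 2*x_i,
    and K = mean over data of sum_i exp(-dE_i / 2).  Minimizing K pushes
    every single-bit flip of a training pattern uphill in energy.
    """
    m = len(X)
    S = 1 - 2 * X                        # sign of each candidate bit flip
    dE = S * (theta - X @ W)             # (m, n) energy changes; W symmetric
    F = np.exp(-0.5 * dE)                # flow to each one-bit neighbor
    K = F.sum() / m
    g_theta = -0.5 * (F * S).mean(axis=0)
    gW = 0.5 * (F * S).T @ X / m
    gW = 0.5 * (gW + gW.T)               # keep W symmetric
    np.fill_diagonal(gW, 0.0)            # and zero-diagonal
    return K, gW, g_theta

def train(X, steps=300, lr=0.2):
    """Descend the gradient of the convex probability flow."""
    n = X.shape[1]
    W, theta = np.zeros((n, n)), np.zeros(n)
    for _ in range(steps):
        _, gW, g_theta = mpf_objective_grads(W, theta, X)
        W -= lr * gW
        theta -= lr * g_theta
    return W, theta

def recall(W, theta, x, sweeps=5):
    """Asynchronous McCulloch-Pitts dynamics; each update never raises E."""
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = 1 if W[i] @ x > theta[i] else 0
    return x
```

After training, each stored pattern is a strict local energy minimum and hence a fixed point of the recall dynamics; because every one-bit neighbor has higher energy, small amounts of bit-flip noise tend to be corrected, which is the noise tolerance the abstract refers to.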

Source journal: Journal of Mathematical Neuroscience — Neuroscience (miscellaneous)
Self-citation rate: 0.00% · Review time: 13 weeks
About the journal: The Journal of Mathematical Neuroscience (JMN) publishes research articles on the mathematical modeling and analysis of all areas of neuroscience, i.e., the study of the nervous system and its dysfunctions. The focus is on using mathematics as the primary tool for elucidating the fundamental mechanisms responsible for experimentally observed behaviours in neuroscience at all relevant scales, from the molecular world to that of cognition. The aim is to publish work that uses advanced mathematical techniques to illuminate these questions. It publishes full-length original papers, rapid communications, and review articles. Papers that combine theoretical results with convincing numerical experiments are especially encouraged. Papers that introduce and help develop new mathematical theory likely to be relevant to future studies of the nervous system in general, and the human brain in particular, are also welcome.