Robust Exponential Memory in Hopfield Networks
Christopher J. Hillar, Ngoc M. Tran
Journal of Mathematical Neuroscience, vol. 8, no. 1, p. 1. Published 2018-01-16.
DOI: 10.1186/s13408-017-0056-2
Citations: 20
Abstract
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store recurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon's noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.
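The learning rule the abstract describes — gradient descent on the convex minimum-probability-flow objective, which lowers the energy of training patterns relative to their single-bit-flip neighbours until they become stable fixed points of the dynamics — can be sketched for a small 0/1 Hopfield network. Everything concrete here (the energy convention, the bit-flip neighbourhood, network size, learning rate, and the two demo patterns) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def energy(J, theta, x):
    """Hopfield energy E(x) = -1/2 x^T J x + theta^T x (J symmetric, zero diagonal)."""
    return -0.5 * x @ J @ x + theta @ x

def mpf_step(J, theta, data, lr=0.1):
    """One descent step on the minimum-probability-flow objective
    K = mean over data x and bit flips i of exp((E(x) - E(x^i)) / 2),
    where x^i is x with bit i flipped. Minimizing K makes each stored
    pattern a strict energy minimum among its one-bit neighbours."""
    gJ = np.zeros_like(J)
    gt = np.zeros_like(theta)
    for x in data:
        for i in range(len(x)):
            xi = x.copy()
            xi[i] = 1 - xi[i]                      # single-bit-flip neighbour
            w = np.exp((energy(J, theta, x) - energy(J, theta, xi)) / 2)
            # dE/dJ = -1/2 outer(x, x), dE/dtheta = x; chain rule through exp:
            gJ += 0.25 * w * (np.outer(xi, xi) - np.outer(x, x))
            gt += 0.5 * w * (x - xi)
    J_new = J - lr * gJ / len(data)
    np.fill_diagonal(J_new, 0.0)                   # keep self-couplings at zero
    theta_new = theta - lr * gt / len(data)
    return J_new, theta_new

def fixed_point(J, theta, x, max_iters=100):
    """Run asynchronous McCulloch-Pitts updates until the state stops changing."""
    x = x.copy()
    for _ in range(max_iters):
        x_old = x.copy()
        for i in range(len(x)):
            x[i] = 1.0 if J[i] @ x > theta[i] else 0.0
        if np.array_equal(x, x_old):
            break
    return x

# Demo: store two patterns in an 8-neuron network by probability-flow descent.
n = 8
patterns = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
                     [1, 0, 1, 0, 1, 0, 1, 0]], dtype=float)
J, theta = np.zeros((n, n)), np.zeros(n)
for _ in range(1000):
    J, theta = mpf_step(J, theta, patterns, lr=0.1)
```

After training, each stored pattern is a fixed point of the dynamics: flipping any single bit raises the energy, which is exactly the condition the probability-flow objective drives toward. The exponential-capacity construction in the paper itself concerns a structured family of networks, not this generic dense sketch.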
About the journal
The Journal of Mathematical Neuroscience (JMN) publishes research articles on the mathematical modeling and analysis of all areas of neuroscience, i.e., the study of the nervous system and its dysfunctions. The focus is on using mathematics as the primary tool for elucidating the fundamental mechanisms responsible for experimentally observed behaviours in neuroscience at all relevant scales, from the molecular world to that of cognition. The aim is to publish work that uses advanced mathematical techniques to illuminate these questions.
It publishes full-length original papers, rapid communications, and review articles. Papers that combine theoretical results with convincing numerical experiments are especially encouraged.
Papers that introduce and help develop new mathematical theory likely to be relevant to future studies of the nervous system in general, and the human brain in particular, are also welcome.