Improved Hopfield networks by training with noisy data

F. Clift, T. Martinez
{"title":"Improved Hopfield networks by training with noisy data","authors":"F. Clift, T. Martinez","doi":"10.1109/IJCNN.2001.939521","DOIUrl":null,"url":null,"abstract":"An approach to training a generalized Hopfield network is developed and evaluated. Both the weight symmetricity constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns. Training in this way is referred to as noisy associative training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. Additionally, these networks have 13% fewer spurious memories on average than networks trained with pseudoinverse or Hebbian training. Typically, networks memorizing over 2N (where N is the number of nodes in the network) patterns are produced. Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or pseudo-inverse training-an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.","PeriodicalId":346955,"journal":{"name":"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2001.939521","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

An approach to training a generalized Hopfield network is developed and evaluated. Both the weight-symmetry constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns; training in this way is referred to as noisy associative training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. Additionally, these networks have 13% fewer spurious memories on average than networks trained with pseudo-inverse or Hebbian training. Typically, the networks produced memorize over 2N patterns (where N is the number of nodes in the network). Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or pseudo-inverse training: an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.
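The abstract describes NAT only at a high level, so the following is a minimal sketch of the idea: a Hopfield-style network with an unconstrained weight matrix (asymmetric, self-connections allowed) is unrolled for a fixed number of synchronous update steps and trained by backpropagation through time to map noise-corrupted patterns back to the stored patterns. The network size, noise level, step count, tanh relaxation of the hard threshold, mean-squared-error loss, and optimizer here are all illustrative assumptions, not values or choices taken from the paper.

```python
import torch

N, P = 32, 64            # nodes, patterns to store (paper reports > 2N is typical)
T = 5                    # assumed number of unrolled update steps for BPTT
FLIP_PROB = 0.1          # assumed probability of corrupting each bit

# Random +/-1 patterns, as in the paper's uniformly distributed random data.
patterns = torch.randint(0, 2, (P, N)).float() * 2 - 1

W = torch.nn.Parameter(torch.randn(N, N) * 0.01)  # no symmetry constraint
b = torch.nn.Parameter(torch.zeros(N))            # nonzero diagonal permitted in W
opt = torch.optim.Adam([W, b], lr=1e-2)

def forward(x):
    # Synchronous, differentiable state updates: tanh stands in for the
    # hard threshold so gradients can flow back through the unrolled steps.
    for _ in range(T):
        x = torch.tanh(x @ W.T + b)
    return x

for step in range(2000):
    # Noisy associative training: corrupt each stored pattern, then train
    # the unrolled dynamics to settle back onto the clean pattern.
    flips = (torch.rand_like(patterns) < FLIP_PROB).float()
    noisy = patterns * (1 - 2 * flips)            # flip a fraction of the bits
    loss = ((forward(noisy) - patterns) ** 2).mean()
    opt.zero_grad()
    loss.backward()                               # backpropagation through time
    opt.step()
```

After training, recall can be tested by probing with freshly corrupted patterns and checking whether the thresholded final state matches the stored memory.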
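For contrast, here are minimal sketches of the two classical training rules the paper benchmarks against. Both produce a symmetric weight matrix with a zeroed diagonal, exactly the constraints NAT removes; the formulas are the standard Hebbian outer-product and pseudo-inverse (projection) rules, not code from the paper.

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebbian (outer-product) rule: W = (1/N) * sum_p x_p x_p^T, zero diagonal.
    _, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def pseudo_inverse_weights(patterns):
    # Pseudo-inverse (projection) rule: W = (1/N) X^T (X X^T / N)^+ X,
    # where X stacks the patterns as rows; the diagonal is zeroed to match
    # the standard Hopfield constraint.
    _, N = patterns.shape
    Q = patterns @ patterns.T / N                 # pattern-overlap matrix
    W = patterns.T @ np.linalg.pinv(Q) @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=20):
    # Synchronous sign updates from a (possibly noisy) probe state.
    x = probe.copy()
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1.0, -1.0)
    return x
```

The Hebbian rule degrades sharply as stored patterns become correlated, and the pseudo-inverse rule is capacity-limited to fewer than N patterns, which is the context for the paper's reported gains of NAT on correlated data and its above-2N storage.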