{"title":"Hopfield联想记忆的最佳学习","authors":"X. Zhuang, Y. Huang","doi":"10.1109/ICPR.1992.201801","DOIUrl":null,"url":null,"abstract":"Designs the optimal learning rule for the Hopfield associative memories (HAM) based on three well recognized criteria, that is, all desired attractors must be made not only isolately stable but also asymptotically stable, and the spurious stable states should be the fewest possible. To construct a satisfactory HAM, those criteria are crucial. The paper first analyzes the real cause of the unsatisfactory performance of the Hebb rule and many other existing learning rules designed for HAMs and then show that three criteria actually amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand basins of attraction of all desired attractors is to appropriately dig their respective steep kernal basin of attraction. For this, the authors introduce a concept called the Hamming-stability. The Hamming-stability for all desired attractors can be reduced to a moderately expansive linear separability condition at each neuron and thus the well known Rosenblatt's perceptron learning rule is the right one for learning the Hamming-stability. Extensive and systematic experiments were conducted, convincingly showing that the proposed perceptron. Hamming-stability learning rule did take a good care of three optimal criteria.<<ETX>>","PeriodicalId":34917,"journal":{"name":"模式识别与人工智能","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"1992-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Optimal learning for Hopfield associative memory\",\"authors\":\"X. Zhuang, Y. Huang\",\"doi\":\"10.1109/ICPR.1992.201801\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Designs the optimal learning rule for the Hopfield associative memories (HAM) based on three well recognized criteria, that is, all desired attractors must be made not only isolately stable but also asymptotically stable, and the spurious stable states should be the fewest possible. To construct a satisfactory HAM, those criteria are crucial. The paper first analyzes the real cause of the unsatisfactory performance of the Hebb rule and many other existing learning rules designed for HAMs and then show that three criteria actually amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand basins of attraction of all desired attractors is to appropriately dig their respective steep kernal basin of attraction. For this, the authors introduce a concept called the Hamming-stability. The Hamming-stability for all desired attractors can be reduced to a moderately expansive linear separability condition at each neuron and thus the well known Rosenblatt's perceptron learning rule is the right one for learning the Hamming-stability. Extensive and systematic experiments were conducted, convincingly showing that the proposed perceptron. 
Hamming-stability learning rule did take a good care of three optimal criteria.<<ETX>>\",\"PeriodicalId\":34917,\"journal\":{\"name\":\"模式识别与人工智能\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1992-08-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"模式识别与人工智能\",\"FirstCategoryId\":\"1093\",\"ListUrlMain\":\"https://doi.org/10.1109/ICPR.1992.201801\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"模式识别与人工智能","FirstCategoryId":"1093","ListUrlMain":"https://doi.org/10.1109/ICPR.1992.201801","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Computer Science","Score":null,"Total":0}
This paper designs the optimal learning rule for Hopfield associative memories (HAMs) based on three well-recognized criteria: all desired attractors must be made not only isolatedly stable but also asymptotically stable, and the number of spurious stable states should be as small as possible. These criteria are crucial for constructing a satisfactory HAM. The paper first analyzes the real cause of the unsatisfactory performance of the Hebb rule and many other existing learning rules designed for HAMs, and then shows that the three criteria actually amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand the basins of attraction of all desired attractors is to appropriately dig their respective steep kernel basins of attraction. For this, the authors introduce a concept called Hamming-stability. Hamming-stability for all desired attractors can be reduced to a moderately expansive linear separability condition at each neuron, and thus the well-known Rosenblatt perceptron learning rule is the right one for learning Hamming-stability. Extensive and systematic experiments were conducted, convincingly showing that the proposed perceptron Hamming-stability learning rule satisfies the three optimality criteria well.
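To make the reduction concrete, the following is a minimal sketch of perceptron-style, per-neuron learning for a Hopfield associative memory: the requirement that each stored pattern be a stable state becomes a linear separability condition at every neuron, which a Rosenblatt-style update can enforce. The function names, the margin value, the learning rate, and the use of a fixed positive margin as a stand-in for the paper's "moderately expansive" condition are assumptions made for illustration, not the authors' exact formulation of Hamming-stability.

```python
import numpy as np

def perceptron_train_hopfield(patterns, margin=1.0, lr=0.1, epochs=100, seed=0):
    """Per-neuron perceptron learning for a Hopfield associative memory.

    patterns : array of shape (P, N) with entries in {-1, +1}.
    Each neuron i is trained so that sign(w_i . x) reproduces x_i with a
    positive margin for every stored pattern -- a linear separability
    condition of the kind the abstract reduces Hamming-stability to.
    The margin and update schedule here are illustrative choices.
    """
    patterns = np.asarray(patterns, dtype=float)
    P, N = patterns.shape
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.01, size=(N, N))
    np.fill_diagonal(W, 0.0)                  # zero self-connections

    for _ in range(epochs):
        updated = False
        for x in patterns:
            h = W @ x                         # local fields for all neurons
            bad = (x * h) <= margin           # neurons violating the margin
            if np.any(bad):
                # perceptron-style correction on the offending rows
                W[bad] += lr * np.outer(x[bad], x)
                np.fill_diagonal(W, 0.0)
                updated = True
        if not updated:                       # all patterns stable with margin
            break
    return W

def recall(W, probe, steps=50):
    """Synchronous sign-update recall from a (possibly noisy) probe."""
    s = np.sign(np.asarray(probe, dtype=float))
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

As a usage sketch, one would train on a set of bipolar patterns and then call recall on a corrupted copy of a stored pattern; enforcing a positive margin at each neuron tends to widen the basins of attraction relative to the plain Hebb rule, which is the qualitative effect the abstract attributes to the Hamming-stability criterion.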