{"title":"同时摄动双向联想存储器的FPGA实现","authors":"Y. Maeda, M. Wakamura","doi":"10.1109/NNSP.2003.1318029","DOIUrl":null,"url":null,"abstract":"Recurrent neural networks have interesting properties and can handle dynamic information processing unlike the ordinary feedforward neural networks. Bidirectional associative memory (BAM) is a typical recurrent network. Ordinarily, weights of the BAM are determined by the Hebb's learning. In this paper, a recursive learning scheme for BAM is proposed and its hardware implementation is described. The learning scheme is applicable to analogue BAM as well. A simulation result and details of the implementation are shown.","PeriodicalId":315958,"journal":{"name":"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FPGA implementation of bidirectional associative memory using simultaneous perturbation\",\"authors\":\"Y. Maeda, M. Wakamura\",\"doi\":\"10.1109/NNSP.2003.1318029\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recurrent neural networks have interesting properties and can handle dynamic information processing unlike the ordinary feedforward neural networks. Bidirectional associative memory (BAM) is a typical recurrent network. Ordinarily, weights of the BAM are determined by the Hebb's learning. In this paper, a recursive learning scheme for BAM is proposed and its hardware implementation is described. The learning scheme is applicable to analogue BAM as well. A simulation result and details of the implementation are shown.\",\"PeriodicalId\":315958,\"journal\":{\"name\":\"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NNSP.2003.1318029\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.2003.1318029","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
FPGA implementation of bidirectional associative memory using simultaneous perturbation
Recurrent neural networks have interesting properties and, unlike ordinary feedforward networks, can handle dynamic information processing. The bidirectional associative memory (BAM) is a typical recurrent network. Ordinarily, the weights of a BAM are determined by Hebbian learning. In this paper, a recursive learning scheme for the BAM based on simultaneous perturbation is proposed, and its FPGA implementation is described. The learning scheme is applicable to analogue BAM as well. A simulation result and details of the implementation are presented.
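The abstract itself contains no equations or code. As a rough illustration of the two ideas it mentions, the Python sketch below stores bipolar pattern pairs in a BAM with Hebbian (correlation) weights and then refines the weights with simultaneous-perturbation steps on a recall-error objective. The objective, gain constants, and function names are illustrative assumptions, not the paper's exact algorithm or hardware scheme.

import numpy as np

def sign(v):
    # Bipolar threshold; zeros are mapped to +1 for determinism.
    return np.where(v >= 0, 1.0, -1.0)

def hebbian_bam_weights(X, Y):
    # Standard Hebbian (correlation) encoding: W = sum_k x_k y_k^T.
    return X.T @ Y

def bam_recall(W, x, steps=10):
    # Iterate x -> y -> x until the pair stabilizes.
    for _ in range(steps):
        y = sign(x @ W)
        x_new = sign(y @ W.T)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y

def recall_error(W, X, Y):
    # Fraction of mismatched bits when recalling Y from X in one pass.
    return np.mean(sign(X @ W) != Y)

def sp_update(W, X, Y, rng, c=0.05, a=0.1):
    # One simultaneous-perturbation step: all weights are perturbed at
    # once by +/- c, and the difference of only two error evaluations
    # gives a gradient estimate (assumed objective: recall error).
    delta = rng.choice([-1.0, 1.0], size=W.shape)
    e_plus = recall_error(W + c * delta, X, Y)
    e_base = recall_error(W, X, Y)
    grad_est = (e_plus - e_base) / (c * delta)
    return W - a * grad_est

# Usage: store two bipolar pattern pairs, then refine with SP steps.
X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]], dtype=float)
Y = np.array([[1, -1, -1], [-1, 1, -1]], dtype=float)
W = hebbian_bam_weights(X, Y)
rng = np.random.default_rng(0)
for _ in range(50):
    W = sp_update(W, X, Y, rng)
print(recall_error(W, X, Y))

Because the update needs only forward evaluations of the network (no backpropagated gradients), this style of learning maps naturally onto digital hardware such as an FPGA, which is the motivation the paper points to.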