Author: I. Kumazawa
DOI: 10.1109/IJCNN.1991.170584
Published in: [Proceedings] 1991 IEEE International Joint Conference on Neural Networks, 18 November 1991
A learning scheme of neural networks which improves accuracy and speed of convergence using redundant and diversified network structures
The author proposes a learning scheme that compensates for incomplete learning results by using redundant internal coding of the required input-output relation, together with several measures to diversify the inner subnetwork structures. He applies this scheme to a character recognition problem and shows experimentally that it yields more accurate learning results, faster convergence, and more efficient hardware configurations than the traditional approach. Specifically, computer simulations are presented which show that the proposed approach is superior to the traditional approach based on the so-called grandmother-cell representation scheme.
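The abstract contrasts redundant internal coding with the grandmother-cell (one output unit per class) representation. The paper's own coding is not given here, so the following is only an illustrative sketch of the general idea: if each class is assigned a longer code word with pairwise Hamming distance of at least 3 (rather than a one-hot vector), nearest-code decoding can absorb a single erroneous output bit that would misclassify a one-hot output. The specific code words below are hypothetical examples, not taken from the paper.

```python
import numpy as np

# Grandmother-cell representation: one output unit per class (one-hot).
def one_hot_targets(n_classes):
    return np.eye(n_classes, dtype=int)

# Redundant representation: each class mapped to a 6-bit code word.
# Pairwise Hamming distance is >= 3, so one flipped output bit is
# still closer to the correct code word than to any other.
# (Illustrative code words only, not the paper's actual coding.)
CODES = np.array([
    [0, 0, 0, 0, 0, 0],
    [1, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
    [1, 1, 0, 1, 1, 0],
])

def decode(output_bits):
    """Return the class whose code word is nearest in Hamming distance."""
    return int(np.argmin(np.sum(CODES != output_bits, axis=1)))

# Corrupt one output bit of class 2's code word; redundant decoding
# still recovers the correct class, whereas a one-hot output with a
# flipped bit becomes ambiguous between two classes.
corrupted = CODES[2].copy()
corrupted[0] ^= 1
recovered = decode(corrupted)  # → 2
```

The same redundancy that tolerates noisy output units is what the abstract credits with compensating for "incomplete results of learning": a network whose outputs are only approximately correct can still be decoded to the right class.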