Authors: H. Drucker, Y. Le Cun
DOI: 10.1109/NNSP.1991.239522
Published in: Neural Networks for Signal Processing — Proceedings of the 1991 IEEE Workshop
Publication date: 1991-09-30
Improving generalization performance in character recognition
One test of a new training algorithm is how well the algorithm generalizes from the training data to the test data. A new neural net training algorithm termed double backpropagation improves generalization in character recognition by minimizing the change in the output due to small changes in the input. This is accomplished by minimizing the normal energy term found in backpropagation and an additional energy term that is a function of the Jacobian.
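The combined objective described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy single-layer network, the squared-error energy, and the weighting coefficient `lam` are illustrative assumptions; the key idea shown is that the penalty is the squared gradient of the energy with respect to the *input*, so updating the weights requires differentiating through a gradient ("double" backpropagation).

```python
import jax
import jax.numpy as jnp

def energy(params, x, y):
    """Normal backpropagation energy: squared error of a toy one-layer net."""
    W, b = params
    pred = jnp.tanh(W @ x + b)
    return jnp.sum((pred - y) ** 2)

def double_backprop_loss(params, x, y, lam=0.1):
    """Normal energy plus a penalty on the energy's sensitivity to the input."""
    e1 = energy(params, x, y)
    # Gradient of the energy with respect to the input x (argnums=1):
    # small input perturbations should not change the output much.
    g = jax.grad(energy, argnums=1)(params, x, y)
    e2 = jnp.sum(g ** 2)
    return e1 + lam * e2

# Taking the gradient of this loss w.r.t. the parameters differentiates
# through jax.grad itself -- the second backward pass of the method's name.
key = jax.random.PRNGKey(0)
W = jax.random.normal(key, (3, 5)) * 0.1
b = jnp.zeros(3)
x = jnp.ones(5)
y = jnp.array([1.0, 0.0, 0.0])
grads = jax.grad(double_backprop_loss)((W, b), x, y)
```

In a training loop, `grads` would be used for an ordinary gradient-descent update; the only change from standard backpropagation is the extra sensitivity term inside the loss.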