Minimizing validation error with respect to network size and number of training epochs
Rohit Rawat, Jignesh K. Patel, M. Manry
The 2013 International Joint Conference on Neural Networks (IJCNN), August 2013
DOI: 10.1109/IJCNN.2013.6706919
A batch training algorithm for the multilayer perceptron is developed that optimizes validation error with respect to two parameters: the number of hidden units and the number of training epochs. At the end of each training epoch, the method temporarily prunes the network and computes the curve of validation error versus number of hidden units in a single pass through the validation data. Because pruning is performed at every epoch and the best networks are saved, validation error is optimized over the number of hidden units and the number of epochs simultaneously. The number of multiplies required by the algorithm is analyzed, and in simulations the method compares favorably with existing approaches.
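The abstract's central computational trick is obtaining the whole validation-error-versus-hidden-units curve in one pass. The sketch below illustrates how that can work for a one-hidden-layer MLP with a linear output layer: because each hidden unit's contribution to the output is additive, a cumulative sum over hidden-unit contributions yields the predictions of every truncated network at once. This is an illustrative assumption, not the paper's actual algorithm; in particular, the function name and the assumption that hidden units are already ordered by importance (as the paper's pruning step would arrange) are hypothetical.

```python
import numpy as np

def validation_error_curve(W_in, b_in, W_out, b_out, X_val, y_val):
    """Validation MSE of every truncated one-hidden-layer MLP that keeps
    only the first n hidden units, n = 0..Nh, in one pass over the data.
    Assumes hidden units are pre-sorted by importance (hypothetical)."""
    H = np.tanh(X_val @ W_in + b_in)                  # (N, Nh) hidden activations
    # Per-unit contribution to each output: contrib[i, h, o] = H[i, h] * W_out[h, o]
    contrib = np.einsum('nh,ho->nho', H, W_out)       # (N, Nh, N_out)
    # Cumulative sum over units gives predictions of every truncated network.
    partial = np.cumsum(contrib, axis=1)              # (N, Nh, N_out)
    errs = [np.mean((b_out - y_val) ** 2)]            # n = 0: output bias only
    for n in range(H.shape[1]):
        pred = b_out + partial[:, n, :]               # keep first n+1 hidden units
        errs.append(np.mean((pred - y_val) ** 2))
    return np.array(errs)                             # errs[n] = MSE with n units
```

Each validation sample's hidden activations are computed once, so the cost is one forward pass plus an O(Nh) accumulation per sample, rather than Nh separate evaluations of the validation set.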