{"title":"神经网络设计与训练的全局优化方法","authors":"A. Yamazaki, Teresa B Ludermir, M. D. Souto","doi":"10.1109/SBRN.2002.1181455","DOIUrl":null,"url":null,"abstract":"This paper shows results of two approaches for the optimization of neural networks: one uses simulated annealing for optimizing both architectures and weights combined with backpropagation for fine tuning, while the other uses tabu search for the same purpose. Both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (mean number of connections of 11.15 out of 36 for simulated annealing and 11.62 out of 36 for tabu search) for an odor recognition task in an artificial nose.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Global optimization methods for designing and training neural networks\",\"authors\":\"A. Yamazaki, Teresa B Ludermir, M. D. Souto\",\"doi\":\"10.1109/SBRN.2002.1181455\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper shows results of two approaches for the optimization of neural networks: one uses simulated annealing for optimizing both architectures and weights combined with backpropagation for fine tuning, while the other uses tabu search for the same purpose. Both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (mean number of connections of 11.15 out of 36 for simulated annealing and 11.62 out of 36 for tabu search) for an odor recognition task in an artificial nose.\",\"PeriodicalId\":157186,\"journal\":{\"name\":\"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.\",\"volume\":\"39 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2002-11-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SBRN.2002.1181455\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SBRN.2002.1181455","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Global optimization methods for designing and training neural networks
This paper presents results from two approaches to neural network optimization: one uses simulated annealing to optimize both architectures and weights, combined with backpropagation for fine-tuning, while the other uses tabu search for the same purpose. Both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (mean number of connections of 11.15 out of 36 for simulated annealing and 11.62 out of 36 for tabu search) on an odor recognition task for an artificial nose.
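The abstract gives no implementation detail, but the core idea it describes, simulated annealing over a joint state of network connectivity and weights, can be sketched. The following is a minimal, hypothetical Python sketch: the toy data, network sizes, perturbation scales, and cooling schedule are all assumptions, not the paper's actual settings, and the final backpropagation fine-tuning stage is only noted in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: the paper's task is odor recognition with an
# artificial nose; here we just use random 6-feature, 3-class samples.
X = rng.normal(size=(60, 6))
y = rng.integers(0, 3, size=60)

N_IN, N_HID, N_OUT = 6, 4, 3  # hypothetical layer sizes, not from the paper

def forward(weights, mask, x):
    """One hidden layer; mask zeroes out pruned input-to-hidden connections."""
    w1, w2 = weights
    h = np.tanh(x @ (w1 * mask))
    return h @ w2

def cost(weights, mask):
    """Classification error on the toy set (the paper reports mean error %)."""
    preds = np.argmax(forward(weights, mask, X), axis=1)
    return np.mean(preds != y)

def neighbor(weights, mask):
    """Perturb weights with Gaussian noise and flip one connection at random,
    so architecture and weights are searched jointly."""
    w1, w2 = weights
    new_w = (w1 + rng.normal(scale=0.1, size=w1.shape),
             w2 + rng.normal(scale=0.1, size=w2.shape))
    new_mask = mask.copy()
    i, j = rng.integers(N_IN), rng.integers(N_HID)
    new_mask[i, j] = 1 - new_mask[i, j]
    return new_w, new_mask

weights = (rng.normal(scale=0.5, size=(N_IN, N_HID)),
           rng.normal(scale=0.5, size=(N_HID, N_OUT)))
mask = np.ones((N_IN, N_HID))

T = 1.0  # initial temperature; geometric cooling below is an assumed schedule
for step in range(2000):
    cand_w, cand_m = neighbor(weights, mask)
    delta = cost(cand_w, cand_m) - cost(weights, mask)
    # Accept improvements always; accept worse moves with Boltzmann probability.
    if delta < 0 or rng.random() < np.exp(-delta / T):
        weights, mask = cand_w, cand_m
    T *= 0.995

print(f"final error: {cost(weights, mask):.2%}, "
      f"connections kept: {int(mask.sum())} of {mask.size}")
# The paper then fine-tunes the surviving weights with backpropagation;
# that gradient-descent stage is omitted from this sketch.
```

The tabu search variant described in the abstract would explore the same joint architecture-and-weights space but replace the temperature-based acceptance rule with a memory structure (a tabu list of recently visited solutions) to avoid cycling.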