A comparative study of optimization methods for improving artificial neural network performance

Jesada Kajornrit
{"title":"提高人工神经网络性能的优化方法比较研究","authors":"Jesada Kajornrit","doi":"10.1109/ICITEED.2015.7408908","DOIUrl":null,"url":null,"abstract":"This paper proposes a comparative study of commonly-used global optimization methods to improve training performance of back-propagation neural networks. The optimization methods adopted herein include Simulated annealing, Direct search, and Genetic algorithm. These methods are used to optimize neural networks' weights and biases before using back-propagation algorithm in order to prevent the networks from local minima. Four benchmark datasets of prediction (regression) task were used to evaluate the established models. The experimental results indicated that optimizing neural network's parameters is a complicated problem due to its high dimension of variables to be optimized. And only genetic algorithm was able to solve this difficult optimization problem. In addition, this paper also applied this success method to predict monthly rainfall time series data in the northeast region of Thailand. The results indicated that using of genetic algorithm with back-propagation neural network is a recommended combination.","PeriodicalId":207985,"journal":{"name":"2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE)","volume":"126 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"A comparative study of optimization methods for improving artificial neural network performance\",\"authors\":\"Jesada Kajornrit\",\"doi\":\"10.1109/ICITEED.2015.7408908\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper proposes a comparative study of commonly-used global optimization methods to improve training performance of back-propagation neural networks. The optimization methods adopted herein include Simulated annealing, Direct search, and Genetic algorithm. These methods are used to optimize neural networks' weights and biases before using back-propagation algorithm in order to prevent the networks from local minima. Four benchmark datasets of prediction (regression) task were used to evaluate the established models. The experimental results indicated that optimizing neural network's parameters is a complicated problem due to its high dimension of variables to be optimized. And only genetic algorithm was able to solve this difficult optimization problem. In addition, this paper also applied this success method to predict monthly rainfall time series data in the northeast region of Thailand. 
The results indicated that using of genetic algorithm with back-propagation neural network is a recommended combination.\",\"PeriodicalId\":207985,\"journal\":{\"name\":\"2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE)\",\"volume\":\"126 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICITEED.2015.7408908\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICITEED.2015.7408908","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8

Abstract

This paper presents a comparative study of commonly used global optimization methods for improving the training performance of back-propagation neural networks. The optimization methods adopted here are simulated annealing, direct search, and the genetic algorithm. These methods are used to optimize the networks' weights and biases before the back-propagation algorithm is applied, in order to keep the networks from becoming trapped in local minima. Four benchmark prediction (regression) datasets were used to evaluate the established models. The experimental results indicate that optimizing a neural network's parameters is a difficult problem because of the high dimensionality of the variables to be optimized, and only the genetic algorithm was able to solve it. In addition, the successful method was applied to predict monthly rainfall time series in the northeast region of Thailand. The results indicate that combining the genetic algorithm with a back-propagation neural network is a recommended approach.
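The two-stage idea described in the abstract, where a global optimizer finds starting weights that back-propagation then refines, can be illustrated with a short sketch. The following Python example is a minimal illustration, not the paper's code or experimental setup: a simple genetic algorithm searches the flattened weight vector of a one-hidden-layer network on a toy sine-regression task, and the best individual is then fine-tuned with plain back-propagation gradient descent. The network size, GA settings, and data are assumptions made for this example.

```python
# Minimal sketch (illustrative assumptions): GA initialization of network
# weights followed by back-propagation fine-tuning, as in the combination
# the abstract recommends. Not the paper's actual setup.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

H = 8                                   # hidden units (assumption)
N_PARAMS = 1 * H + H + H * 1 + 1        # w1, b1, w2, b2 flattened

def unpack(theta):
    """Split a flat parameter vector into the two layers' weights/biases."""
    w1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    w2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H:]
    return w1, b1, w2, b2

def forward(theta):
    w1, b1, w2, b2 = unpack(theta)
    h = np.tanh(X @ w1 + b1)            # hidden activations
    return h, h @ w2 + b2               # (hidden, network output)

def mse(theta):
    _, out = forward(theta)
    return float(np.mean((out - y) ** 2))

# --- Stage 1: genetic algorithm over the flattened weight vector ----------
POP, GENS, ELITE = 60, 80, 6
pop = rng.normal(0.0, 1.0, size=(POP, N_PARAMS))
for _ in range(GENS):
    order = np.argsort([mse(ind) for ind in pop])   # lower MSE = fitter
    elite = pop[order[:ELITE]]
    children = []
    while len(children) < POP - ELITE:
        pa, pb = elite[rng.integers(ELITE, size=2)]
        mask = rng.random(N_PARAMS) < 0.5           # uniform crossover
        child = np.where(mask, pa, pb)
        mutate = rng.random(N_PARAMS) < 0.1         # sparse Gaussian mutation
        child = child + mutate * rng.normal(0.0, 0.3, N_PARAMS)
        children.append(child)
    pop = np.vstack([elite, np.array(children)])

theta = min(pop, key=mse).copy()
print("MSE after GA initialization:", mse(theta))

# --- Stage 2: back-propagation fine-tuning from the GA solution -----------
LR = 0.05
for _ in range(2000):
    w1, b1, w2, b2 = unpack(theta)
    h, out = forward(theta)
    d_out = 2.0 * (out - y) / len(X)                # dLoss/d_output
    g_w2 = h.T @ d_out
    g_b2 = d_out.sum(axis=0)
    d_h = (d_out @ w2.T) * (1.0 - h ** 2)           # back-prop through tanh
    g_w1 = X.T @ d_h
    g_b1 = d_h.sum(axis=0)
    grad = np.concatenate([g_w1.ravel(), g_b1, g_w2.ravel(), g_b2])
    theta -= LR * grad
print("MSE after back-propagation fine-tuning:", mse(theta))
```

The GA stage treats the whole weight vector as one individual, which reflects the high-dimensional search problem the abstract mentions; the gradient stage then does the local refinement that back-propagation is good at.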