Training neural networks using modified differential evolution algorithm for classification problems

Behrouz Ahadzadeh, M. Menhaj
DOI: 10.1109/ICCKE.2014.6993451
Published in: 2014 4th International Conference on Computer and Knowledge Engineering (ICCKE), 2014-12-22
Citations: 2

Abstract

In recent years, progress in the field of artificial neural networks has provided a very important tool for complex problems in pattern recognition, data mining and medical diagnosis. The training algorithms of neural networks play an important role in adjusting the network parameters. Different algorithms have been presented for training neural networks; the most common is the use of gradient-descent-based algorithms such as the back-propagation algorithm. Getting trapped in local minima and converging very slowly make gradient-based methods problematic. To resolve this, many evolutionary algorithms have been adopted for the training of neural networks. In this paper, a modified differential evolution algorithm, abbreviated 2sDE, is employed as a new training algorithm for feedforward neural networks in order to resolve the problems of local optimization training algorithms, such as trapping in local minima and slow convergence. The effectiveness and efficiency of the proposed method are compared with those of other training algorithms on various classification problems.
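The abstract does not detail the 2sDE variant itself, but the general idea of evolutionary training is to treat all network weights and biases as one parameter vector and let a population-based optimizer minimize the classification error, with no gradients involved. A minimal sketch using classic DE/rand/1/bin (not the paper's 2sDE) to train a tiny feedforward network on a hypothetical two-blob toy problem might look as follows; the network size, DE parameters F and CR, and the data are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class problem: two Gaussian blobs in 2-D (illustrative data).
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Small feedforward net: 2 inputs -> 4 hidden (tanh) -> 1 output (sigmoid).
N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1  # all weights and biases, flattened

def forward(w, X):
    # Unpack the flat parameter vector into layer weights and biases.
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID]; i += N_HID
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    # Mean squared error between network output and labels.
    return np.mean((forward(w, X) - y) ** 2)

# Classic DE/rand/1/bin; population size, F, CR and generation count
# are conventional illustrative values, not taken from the paper.
NP, F, CR, GENS = 30, 0.5, 0.9, 200
pop = rng.uniform(-1, 1, (NP, DIM))
cost = np.array([fitness(p) for p in pop])

for _ in range(GENS):
    for i in range(NP):
        # Mutation: combine three distinct individuals other than i.
        a, b, c = pop[rng.choice([j for j in range(NP) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)
        # Binomial crossover, forcing at least one gene from the mutant.
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial if it is no worse.
        tc = fitness(trial)
        if tc <= cost[i]:
            pop[i], cost[i] = trial, tc

best = pop[np.argmin(cost)]
acc = np.mean((forward(best, X) > 0.5) == y)
```

Because selection is greedy and no gradient is used, the search cannot stall at a point where the gradient vanishes, which is the motivation the abstract gives for replacing back-propagation with an evolutionary optimizer.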