Simulated Annealing in Convolutional Neural Networks

Chenqi Zhou
{"title":"Simulated Annulling in Convolutional Neural Network","authors":"Chenqi Zhou","doi":"10.1109/ISAIAM55748.2022.00015","DOIUrl":null,"url":null,"abstract":"Deep learning is a new branch of machine learning research with the goal of bringing us closer to artificial intelligence. This approach can learn several layers to abstract and represent in order to generate a shared understanding of dataset like text, music, and image. Even though DL is effective in wide range, it is difficult to train. Stochastic Gradient Descent and Conjugate Gradient have been offered as approaches for training DL to make it effective. This paper aims to propose Simulated Annealing (SA) as an alternative way for optimum DL employing a current optimization technique, namely a metaheuristic algorithm, to enhance the effectiveness of Convolution Neural Network (CNN). Two classical CNN models AlexNet and ResNet are used for the experiment. The suggested method is tested by the CIFAR-10 dataset to confirm its correctness and efficiency. In addition, we compare our proposed solution to CNN's original at different standards, such as model accuracy, test error rate and learning efficiency. After the experiment, it can be concluded that despite the longer computation time, the results of the experiments suggest that the proposed approach of this paper can enhance the effectiveness of several models of CNN, such as AlexNet and ResNet34.","PeriodicalId":382895,"journal":{"name":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISAIAM55748.2022.00015","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep learning is a new branch of machine learning research with the goal of bringing us closer to artificial intelligence. This approach learns multiple layers of abstraction and representation in order to build a shared understanding of data such as text, music, and images. Although deep learning (DL) is effective across a wide range of tasks, it is difficult to train. Stochastic Gradient Descent and Conjugate Gradient have been proposed as methods for training DL models effectively. This paper proposes Simulated Annealing (SA), a metaheuristic optimization algorithm, as an alternative method for optimizing DL, with the aim of enhancing the effectiveness of Convolutional Neural Networks (CNNs). Two classical CNN models, AlexNet and ResNet, are used in the experiments. The proposed method is evaluated on the CIFAR-10 dataset to confirm its correctness and efficiency. In addition, we compare the proposed solution against the original CNNs on several criteria, including model accuracy, test error rate, and learning efficiency. The experimental results suggest that, despite longer computation time, the proposed approach can enhance the effectiveness of several CNN models, including AlexNet and ResNet34.
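The abstract does not spell out the paper's exact training procedure, so the sketch below only illustrates the generic simulated-annealing loop it refers to: propose a random perturbation of the parameters, accept it with the Metropolis criterion, and cool the temperature geometrically. The toy quadratic loss stands in for a CNN's validation loss, and the function name `simulated_annealing` along with every hyperparameter value is an illustrative assumption, not the paper's setting.

```python
import numpy as np

def simulated_annealing(loss_fn, w0, t_start=1.0, t_end=1e-3,
                        cooling=0.95, steps_per_temp=20, step_size=0.1,
                        rng=None):
    """Textbook simulated annealing over a flat parameter vector.

    loss_fn: objective to minimize (stand-in for a CNN's loss).
    w0: initial parameters (e.g. weights of a partially trained net).
    """
    rng = rng or np.random.default_rng(0)
    w, loss = w0.copy(), loss_fn(w0)
    best_w, best_loss = w.copy(), loss
    t = t_start
    while t > t_end:
        for _ in range(steps_per_temp):
            # Propose a random Gaussian perturbation of the weights.
            cand = w + rng.normal(scale=step_size, size=w.shape)
            cand_loss = loss_fn(cand)
            # Metropolis criterion: always accept improvements; accept
            # worse candidates with probability exp(-delta / t).
            delta = cand_loss - loss
            if delta < 0 or rng.random() < np.exp(-delta / t):
                w, loss = cand, cand_loss
                if loss < best_loss:
                    best_w, best_loss = w.copy(), loss
        t *= cooling  # geometric cooling schedule
    return best_w, best_loss

if __name__ == "__main__":
    # Toy objective standing in for a CNN's loss surface.
    target = np.array([1.0, -2.0, 0.5])
    loss = lambda w: float(np.sum((w - target) ** 2))
    w, l = simulated_annealing(loss, np.zeros(3))
    print(f"best loss {l:.4f} at {w}")
```

In a real CNN setting, `loss_fn` would evaluate the network on a held-out batch and the perturbation would act on the model's weight tensors, which is far more expensive per step; this matches the abstract's observation that the approach trades longer computation time for improved effectiveness.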