{"title":"卷积神经网络中的模拟环化","authors":"Chenqi Zhou","doi":"10.1109/ISAIAM55748.2022.00015","DOIUrl":null,"url":null,"abstract":"Deep learning is a new branch of machine learning research with the goal of bringing us closer to artificial intelligence. This approach can learn several layers to abstract and represent in order to generate a shared understanding of dataset like text, music, and image. Even though DL is effective in wide range, it is difficult to train. Stochastic Gradient Descent and Conjugate Gradient have been offered as approaches for training DL to make it effective. This paper aims to propose Simulated Annealing (SA) as an alternative way for optimum DL employing a current optimization technique, namely a metaheuristic algorithm, to enhance the effectiveness of Convolution Neural Network (CNN). Two classical CNN models AlexNet and ResNet are used for the experiment. The suggested method is tested by the CIFAR-10 dataset to confirm its correctness and efficiency. In addition, we compare our proposed solution to CNN's original at different standards, such as model accuracy, test error rate and learning efficiency. After the experiment, it can be concluded that despite the longer computation time, the results of the experiments suggest that the proposed approach of this paper can enhance the effectiveness of several models of CNN, such as AlexNet and ResNet34.","PeriodicalId":382895,"journal":{"name":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Simulated Annulling in Convolutional Neural Network\",\"authors\":\"Chenqi Zhou\",\"doi\":\"10.1109/ISAIAM55748.2022.00015\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep learning is a new branch of machine learning research with the goal of bringing us closer to artificial intelligence. This approach can learn several layers to abstract and represent in order to generate a shared understanding of dataset like text, music, and image. Even though DL is effective in wide range, it is difficult to train. Stochastic Gradient Descent and Conjugate Gradient have been offered as approaches for training DL to make it effective. This paper aims to propose Simulated Annealing (SA) as an alternative way for optimum DL employing a current optimization technique, namely a metaheuristic algorithm, to enhance the effectiveness of Convolution Neural Network (CNN). Two classical CNN models AlexNet and ResNet are used for the experiment. The suggested method is tested by the CIFAR-10 dataset to confirm its correctness and efficiency. In addition, we compare our proposed solution to CNN's original at different standards, such as model accuracy, test error rate and learning efficiency. 
After the experiment, it can be concluded that despite the longer computation time, the results of the experiments suggest that the proposed approach of this paper can enhance the effectiveness of several models of CNN, such as AlexNet and ResNet34.\",\"PeriodicalId\":382895,\"journal\":{\"name\":\"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISAIAM55748.2022.00015\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISAIAM55748.2022.00015","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Simulated Annealing in Convolutional Neural Networks
Chenqi Zhou
2022 2nd International Symposium on Artificial Intelligence and its Application on Media (ISAIAM), June 2022
DOI: 10.1109/ISAIAM55748.2022.00015
Deep learning is a new branch of machine learning research whose goal is to bring us closer to artificial intelligence. It learns several layers of abstraction and representation in order to build a shared understanding of data such as text, music, and images. Although deep learning is effective across a wide range of tasks, deep networks are difficult to train, and methods such as Stochastic Gradient Descent and Conjugate Gradient have been proposed to make training effective. This paper proposes Simulated Annealing (SA), a metaheuristic optimization algorithm, as an alternative way to optimize deep learning models and to enhance the effectiveness of Convolutional Neural Networks (CNNs). Two classical CNN models, AlexNet and ResNet, are used in the experiments. The proposed method is evaluated on the CIFAR-10 dataset to confirm its correctness and efficiency. In addition, we compare the proposed approach with the original CNN training procedure on several criteria, such as model accuracy, test error rate, and learning efficiency. The experimental results suggest that, despite the longer computation time, the proposed approach can enhance the effectiveness of several CNN models, such as AlexNet and ResNet34.
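Since the abstract only outlines the method, the following is a minimal sketch of how simulated annealing might be used to refine the weights of a trained CNN. It is not the authors' implementation: the Gaussian perturbation scheme, the geometric cooling schedule, and all function names and hyperparameters below are illustrative assumptions.

```python
# Illustrative sketch of simulated annealing (SA) over a CNN's weights.
# NOT the paper's exact procedure; perturbation scheme, cooling schedule,
# and hyperparameters are assumptions chosen for clarity.
import copy
import math
import random

import torch
import torch.nn as nn


def evaluate_loss(model, loader, device="cpu"):
    """Average cross-entropy loss of `model` over an evaluation loader."""
    criterion = nn.CrossEntropyLoss()
    model.eval()
    total, n = 0.0, 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            total += criterion(model(x), y).item() * y.size(0)
            n += y.size(0)
    return total / max(n, 1)


def simulated_annealing(model, loader, t0=1.0, t_min=1e-3, alpha=0.95,
                        steps_per_temp=5, sigma=0.01, device="cpu"):
    """Refine a (pre-trained) model by randomly perturbing its weights.

    A worse candidate is still accepted with probability exp(-delta / T),
    which lets the search escape shallow local minima; the temperature T
    is cooled geometrically by the factor `alpha`.
    """
    best = copy.deepcopy(model)
    best_loss = current_loss = evaluate_loss(model, loader, device)
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            candidate = copy.deepcopy(model)
            with torch.no_grad():
                for p in candidate.parameters():
                    # Small Gaussian perturbation of every weight tensor.
                    p.add_(sigma * torch.randn_like(p))
            cand_loss = evaluate_loss(candidate, loader, device)
            delta = cand_loss - current_loss
            # Always accept improvements; accept worse moves with
            # Metropolis probability exp(-delta / t).
            if delta < 0 or random.random() < math.exp(-delta / t):
                model, current_loss = candidate, cand_loss
                if current_loss < best_loss:
                    best, best_loss = copy.deepcopy(model), current_loss
        t *= alpha  # geometric cooling
    return best, best_loss


# Usage (illustrative): refine a pre-trained CNN on a held-out loader, e.g.
# refined, loss = simulated_annealing(pretrained_alexnet, val_loader)
```

A refinement pass of this kind would typically run after standard gradient-based training, which is consistent with the abstract's observation that the approach improves accuracy at the cost of additional computation time.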