Evolutionary Neural Architecture Search Based on Variational Inference Bayesian Convolutional Neural Network
Jialiang Yu, Song Gao, Jie Tian, H. Bian, Hui Liu, Junqing Li
2022 4th International Conference on Data-driven Optimization of Complex Systems (DOCS), 2022-10-28. DOI: 10.1109/DOCS55193.2022.9967744
In the past decades, Bayesian neural networks have attracted much attention because they are less prone to over-fitting and can provide uncertainty estimates alongside their predictions. However, compared with traditional neural networks, Bayesian neural networks have a large number of hyper-parameters to optimize, so their performance on classification and regression problems over large-scale datasets is often not much better than that of ordinary neural networks. To design a Bayesian network with superior performance, we propose VIBCNN-EvoNAS, a Bayesian convolutional neural network architecture search framework based on variational inference. It constructs the search space through a fixed-length integer encoding scheme and uses an evolutionary algorithm as the search strategy to explore in depth how the convolution kernel size and other related parameters affect the network architecture. In addition, to reduce the time cost of evaluating individuals, we adopt an early-stopping mechanism in the performance evaluation stage. The proposed method is evaluated on the CIFAR-10 and CIFAR-100 datasets, and the experimental results demonstrate its effectiveness.
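As a rough illustration of the search-space construction described above, the sketch below encodes each candidate architecture as a fixed-length list of integers and evolves a small population by mutation and selection. The gene layout (one kernel-size gene and one filter-count gene per block), the option tables, and the truncation-selection scheme are illustrative assumptions rather than the paper's exact design, and the fitness function is a placeholder for training the decoded variational-inference Bayesian CNN with early stopping.

```python
import random

# Hypothetical fixed-length integer encoding: each candidate architecture is a
# list of genes, one pair per convolutional block. Gene values index into
# discrete option tables (kernel size, number of filters). The concrete gene
# layout used by VIBCNN-EvoNAS is not specified here; this is an assumption.
KERNEL_SIZES = [3, 5, 7]
FILTER_COUNTS = [16, 32, 64, 128]
NUM_BLOCKS = 4                       # fixed encoding length: 2 genes per block
GENOME_LENGTH = 2 * NUM_BLOCKS


def random_genome():
    """Sample a random fixed-length integer genome."""
    genome = []
    for _ in range(NUM_BLOCKS):
        genome.append(random.randrange(len(KERNEL_SIZES)))   # kernel-size gene
        genome.append(random.randrange(len(FILTER_COUNTS)))  # filter-count gene
    return genome


def decode(genome):
    """Translate a genome into a readable per-block description."""
    blocks = []
    for b in range(NUM_BLOCKS):
        k = KERNEL_SIZES[genome[2 * b]]
        f = FILTER_COUNTS[genome[2 * b + 1]]
        blocks.append({"kernel_size": k, "filters": f})
    return blocks


def mutate(genome, rate=0.2):
    """Point mutation: resample each gene with probability `rate`."""
    child = list(genome)
    for i in range(GENOME_LENGTH):
        if random.random() < rate:
            limit = len(KERNEL_SIZES) if i % 2 == 0 else len(FILTER_COUNTS)
            child[i] = random.randrange(limit)
    return child


def evaluate(genome):
    """Placeholder fitness. In the paper this step would train the decoded
    variational-inference Bayesian CNN for a few epochs with early stopping
    and return its validation accuracy; here we simply favour larger genes
    so the loop runs end to end."""
    return sum(genome)


def evolve(pop_size=10, generations=5):
    """Minimal evolutionary loop: truncation selection plus mutation."""
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=evaluate)


if __name__ == "__main__":
    best = evolve()
    print("best genome:", best)
    print("decoded architecture:", decode(best))
```

Running the script prints the best genome found and its decoded block description; replacing the placeholder `evaluate` with a real train-and-validate routine (with early stopping, as in the performance evaluation stage above) would turn the loop into an actual architecture search.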