{"title":"Neural Architecture Search Based on Particle Swarm Optimization","authors":"Ruicheng Niu, Hao Li, Yachuan Zhang, Yan Kang","doi":"10.1109/ICDSBA48748.2019.00073","DOIUrl":null,"url":null,"abstract":"Neural architecture search can help researchers design excellent neural network structure. But it takes a lot of time, such as using a neural architecture search method based on reinforcement learning, which requires more than 3000 GPU hours to find an excellent architecture on the CIFAR-10. And in order to be able to use the back-propagation method during training, the architecture will be continuous. Therefore, we propose a neural network architecture search algorithm based on Particle Swarm Optimization (PSO) – PNAS. First, we need to train a super-net. Through random sampling during the super-net training process, only one path training is activated at a time, which greatly reduces the coupling between the super-net nodes. After the super-net training, we use the PSO algorithm to search the architecture of the neural network to find optimal architecture.Our PSO-based neural architecture search can achieve competitive speed compared to state-of-the-art models. Our PNAS search time is faster than GDAS 28% and the parameters are also less than GDAS.","PeriodicalId":382429,"journal":{"name":"2019 3rd International Conference on Data Science and Business Analytics (ICDSBA)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 3rd International Conference on Data Science and Business Analytics (ICDSBA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDSBA48748.2019.00073","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Neural architecture search can help researchers design excellent neural network structures, but it is very time-consuming; for example, a neural architecture search method based on reinforcement learning requires more than 3000 GPU hours to find an excellent architecture on CIFAR-10. Moreover, methods that rely on back-propagation during the search require the architecture to be relaxed into a continuous form. Therefore, we propose a neural architecture search algorithm based on Particle Swarm Optimization (PSO), called PNAS. First, we train a super-net. Through random sampling during super-net training, only one path is activated at a time, which greatly reduces the coupling between super-net nodes. After the super-net is trained, we use the PSO algorithm to search the space of neural network architectures and find the optimal one. Our PSO-based neural architecture search achieves competitive speed compared with state-of-the-art models: the PNAS search is 28% faster than GDAS, and the resulting model also has fewer parameters than GDAS.
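To make the search procedure described in the abstract concrete, here is a minimal sketch of a PSO loop over architecture encodings. It assumes an architecture is encoded as a fixed-length vector of operation indices (one per edge of a cell) and that fitness would come from evaluating the corresponding single path of the trained super-net; the edge count, operation list, and `fitness` stub are illustrative assumptions, not the authors' exact design.

```python
import random

# Assumed encoding (not from the paper): one operation index per edge of a cell.
NUM_EDGES = 8
OPS = ["skip", "sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "avg_pool_3x3"]

def fitness(arch):
    """Placeholder for evaluating one sampled path of the trained super-net
    (e.g. its validation accuracy). A random score stands in for it here."""
    return random.random()

def decode(position):
    """Map a continuous particle position to a discrete architecture by
    taking the argmax over operation scores for each edge."""
    arch = []
    for e in range(NUM_EDGES):
        scores = position[e * len(OPS):(e + 1) * len(OPS)]
        arch.append(scores.index(max(scores)))
    return arch

def pso_search(num_particles=20, iterations=50, w=0.5, c1=1.5, c2=1.5):
    dims = NUM_EDGES * len(OPS)
    pos = [[random.uniform(0.0, 1.0) for _ in range(dims)] for _ in range(num_particles)]
    vel = [[0.0] * dims for _ in range(num_particles)]

    # Personal bests and global best.
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(decode(p)) for p in pos]
    g = pbest_fit.index(max(pbest_fit))
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(iterations):
        for i in range(num_particles):
            # Standard PSO velocity and position update.
            for d in range(dims):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fit = fitness(decode(pos[i]))
            if fit > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], fit
                if fit > gbest_fit:
                    gbest, gbest_fit = pos[i][:], fit
    return decode(gbest), gbest_fit

best_arch, best_fit = pso_search()
print("best architecture (op index per edge):", best_arch)
```

In this sketch the expensive step is `fitness`, which in the paper's setting would only run inference on a path of the already-trained super-net rather than training a network from scratch; that decoupling is what keeps the PSO search fast.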