Generative neural architecture search
Xiaotong Zhai, Shu Li, Guoqiang Zhong, Tao Li, Fuchang Zhang, Rachid Hedjam
Neurocomputing, Volume 642, Article 130360. Published 2025-05-13. DOI: 10.1016/j.neucom.2025.130360
https://www.sciencedirect.com/science/article/pii/S092523122501032X
Abstract
Neural architecture search (NAS) is an important approach for automatic neural architecture design and has been applied to many tasks, such as image classification and object detection. However, most conventional NAS algorithms focus mainly on reducing the prohibitive computational cost, while adopting commonly used reinforcement learning (RL), evolutionary algorithms (EAs), or gradient-based methods as their search strategy. In this paper, we propose a novel search strategy for NAS, called Generative NAS (GNAS). Specifically, we assume that high-performing convolutional neural networks follow a latent distribution, and design a generator to learn this distribution for generating neural architectures. Furthermore, to update the generator so that it better learns the latent distribution, we use the policy gradient, taking the performance of the generated CNNs on the validation datasets as the reward signal. To evaluate GNAS, we conducted extensive experiments on the CIFAR-10, SVHN, MNIST, Fashion-MNIST and ImageNet datasets. The results demonstrate the effectiveness of GNAS compared to previous NAS strategies.
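To make the described search strategy concrete, below is a minimal sketch of a generator updated by policy gradient (REINFORCE) with validation accuracy as the reward, as the abstract outlines. This is an illustrative assumption, not the paper's actual implementation: the search space (a fixed-length chain of NUM_LAYERS choices over NUM_OPS operations), the Generator class, and the train_and_evaluate helper are all hypothetical.

```python
# Hedged sketch of a GNAS-style update loop (assumptions: chain-structured
# search space, hypothetical Generator and train_and_evaluate).
import torch
import torch.nn as nn
from torch.distributions import Categorical

NUM_LAYERS, NUM_OPS = 8, 5  # assumed search-space size

class Generator(nn.Module):
    """Parameterizes a distribution over architectures: one categorical
    choice of operation per layer."""
    def __init__(self):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(NUM_LAYERS, NUM_OPS))

    def sample(self):
        dist = Categorical(logits=self.logits)
        ops = dist.sample()                  # one op index per layer
        log_prob = dist.log_prob(ops).sum()  # joint log-probability
        return ops, log_prob

generator = Generator()
optimizer = torch.optim.Adam(generator.parameters(), lr=3e-4)
baseline = 0.0  # moving-average baseline to reduce gradient variance

for step in range(1000):
    ops, log_prob = generator.sample()
    # Hypothetical helper: builds the CNN encoded by `ops`, trains it
    # briefly, and returns validation accuracy as the reward signal.
    reward = train_and_evaluate(ops)
    baseline = 0.9 * baseline + 0.1 * reward
    loss = -(reward - baseline) * log_prob  # REINFORCE objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The moving-average baseline is a standard variance-reduction choice in RL-based NAS controllers; the key difference the abstract emphasizes is that here the sampler is an explicit generator learning the latent distribution of high-performing architectures.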
Journal overview
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its coverage spans neurocomputing theory, practice, and applications.