LSBO-NAS: Latent Space Bayesian Optimization for Neural Architecture Search

Xuan Rao, Songyi Xiao, Jiaxin Li, Qiuye Wu, Bo Zhao, Derong Liu
{"title":"LSBO-NAS: Latent Space Bayesian Optimization for Neural Architecture Search","authors":"Xuan Rao, Songyi Xiao, Jiaxin Li, Qiuye Wu, Bo Zhao, Derong Liu","doi":"10.1109/ICCR55715.2022.10053904","DOIUrl":null,"url":null,"abstract":"From the perspective of data stream, neural architecture search (NAS) can be formulated as a graph optimization problem. However, many state-of-the-art black-box optimization algorithms, such as Bayesian optimization and simulated annealing, operate in continuous space primarily, which does not match the NAS optimization due to the discreteness of graph structures. To tackle this problem, the latent space Bayesian optimization NAS (LSBO-NAS) algorithm is developed in this paper. In LSBO-NAS, the neural architectures are represented as sequences, and a variational auto-encoder (VAE) is trained to convert the discrete search space of NAS into a continuous latent space by learning the continuous representation of neural architectures. Hereafter, a Bayesian optimization (BO) algorithm, i.e., the tree-structure parzen estimator (TPE) algorithm, is developed to obtain admirable neural architectures. The optimization loop of LSBO-NAS consists of two stages. In the first stage, the BO algorithm generates a preferable architecture representation according to its search strategy. In the second stage, the decoder of VAE decodes the representation into a discrete neural architecture, whose performance evaluation is regarded as the feedback signal for the BO algorithm. The effectiveness of the developed LSBO-NAS is demonstrated on the NAS-Bench-301 benchmark, where the LSBO-NAS achieves a better performance than several NAS baselines.","PeriodicalId":441511,"journal":{"name":"2022 4th International Conference on Control and Robotics (ICCR)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 4th International Conference on Control and Robotics (ICCR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCR55715.2022.10053904","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

From the data-flow perspective, neural architecture search (NAS) can be formulated as a graph optimization problem. However, many state-of-the-art black-box optimization algorithms, such as Bayesian optimization and simulated annealing, operate primarily in continuous spaces, which is a mismatch for NAS because graph structures are discrete. To tackle this problem, the latent space Bayesian optimization NAS (LSBO-NAS) algorithm is developed in this paper. In LSBO-NAS, neural architectures are represented as sequences, and a variational auto-encoder (VAE) is trained to convert the discrete NAS search space into a continuous latent space by learning continuous representations of neural architectures. A Bayesian optimization (BO) algorithm, the tree-structured Parzen estimator (TPE), is then applied in this latent space to find high-performing architectures. The optimization loop of LSBO-NAS consists of two stages. In the first stage, the BO algorithm generates a promising architecture representation according to its search strategy. In the second stage, the VAE decoder decodes this representation into a discrete neural architecture, whose evaluated performance serves as the feedback signal for the BO algorithm. The effectiveness of LSBO-NAS is demonstrated on the NAS-Bench-301 benchmark, where it outperforms several NAS baselines.
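To make the two-stage loop concrete, below is a minimal Python sketch of it. This is not the paper's implementation: Optuna's TPE sampler stands in for the paper's TPE optimizer, and decode_architecture, evaluate_architecture, and the dimensions LATENT_DIM, SEQ_LEN, and NUM_OPS are all hypothetical stand-ins. The paper trains a real VAE on architecture sequences and scores decoded architectures with NAS-Bench-301.

# Minimal sketch of the LSBO-NAS optimization loop, under the assumptions above.
import numpy as np
import optuna

LATENT_DIM = 8   # dimensionality of the VAE latent space (assumed)
SEQ_LEN = 6      # length of the architecture sequence (assumed)
NUM_OPS = 5      # size of the operation vocabulary (assumed)

# Fixed random projection standing in for the trained VAE decoder weights,
# so that nearby latent points decode to similar sequences.
_rng = np.random.default_rng(0)
_W = _rng.normal(size=(SEQ_LEN * NUM_OPS, LATENT_DIM))

def decode_architecture(z: np.ndarray) -> list:
    """Stand-in for the VAE decoder: map a continuous latent vector to a
    discrete operation sequence via argmax over per-position logits."""
    logits = (_W @ z).reshape(SEQ_LEN, NUM_OPS)
    return logits.argmax(axis=1).tolist()

def evaluate_architecture(arch: list) -> float:
    """Stand-in for the performance evaluator: a synthetic score in [0, 1].
    LSBO-NAS queries NAS-Bench-301 for the decoded architecture here."""
    return float(np.mean(arch)) / (NUM_OPS - 1)

def objective(trial: optuna.Trial) -> float:
    # Stage 1: the BO algorithm (TPE) proposes a point in the latent space.
    z = np.array([trial.suggest_float(f"z{i}", -3.0, 3.0)
                  for i in range(LATENT_DIM)])
    # Stage 2: decode the point into a discrete architecture and evaluate it;
    # the returned score is the feedback signal for TPE.
    return evaluate_architecture(decode_architecture(z))

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=50)
print("best latent point:", study.best_params)
print("best (synthetic) score:", study.best_value)

The key design point this sketch reflects is that the optimizer never touches the discrete graph directly: TPE only proposes continuous latent vectors, and the decoder's smoothness is what lets a continuous black-box optimizer make progress on an inherently discrete search problem.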