Jianwei Zhang, Lei Zhang, Yan Wang, Junyou Wang, Xin Wei, Wenjie Liu
{"title":"一种高效的多目标进化零射击神经结构图像分类搜索框架。","authors":"Jianwei Zhang, Lei Zhang, Yan Wang, Junyou Wang, Xin Wei, Wenjie Liu","doi":"10.1142/S0129065723500168","DOIUrl":null,"url":null,"abstract":"<p><p>Neural Architecture Search (NAS) has recently shown a powerful ability to engineer networks automatically on various tasks. Most current approaches navigate the search direction with the validation performance-based architecture evaluation methodology, which estimates an architecture's quality by training and validating on a specific large dataset. However, for small-scale datasets, the model's performance on the validation set cannot precisely estimate that on the test set. The imprecise architecture evaluation can mislead the search to sub-optima. To address the above problem, we propose an efficient multi-objective evolutionary zero-shot NAS framework by evaluating architectures with zero-cost metrics, which can be calculated with randomly initialized models in a training-free manner. Specifically, a general zero-cost metric design principle is proposed to unify the current metrics and help develop several new metrics. Then, we offer an efficient computational method for multi-zero-cost metrics by calculating them in one forward and backward pass. Finally, comprehensive experiments have been conducted on NAS-Bench-201 and MedMNIST. The results have shown that the proposed method can achieve sufficiently accurate, high-throughput performance on MedMNIST and 20[Formula: see text]faster than the previous best method.</p>","PeriodicalId":50305,"journal":{"name":"International Journal of Neural Systems","volume":null,"pages":null},"PeriodicalIF":6.6000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Efficient Multi-Objective Evolutionary Zero-Shot Neural Architecture Search Framework for Image Classification.\",\"authors\":\"Jianwei Zhang, Lei Zhang, Yan Wang, Junyou Wang, Xin Wei, Wenjie Liu\",\"doi\":\"10.1142/S0129065723500168\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Neural Architecture Search (NAS) has recently shown a powerful ability to engineer networks automatically on various tasks. Most current approaches navigate the search direction with the validation performance-based architecture evaluation methodology, which estimates an architecture's quality by training and validating on a specific large dataset. However, for small-scale datasets, the model's performance on the validation set cannot precisely estimate that on the test set. The imprecise architecture evaluation can mislead the search to sub-optima. To address the above problem, we propose an efficient multi-objective evolutionary zero-shot NAS framework by evaluating architectures with zero-cost metrics, which can be calculated with randomly initialized models in a training-free manner. Specifically, a general zero-cost metric design principle is proposed to unify the current metrics and help develop several new metrics. Then, we offer an efficient computational method for multi-zero-cost metrics by calculating them in one forward and backward pass. Finally, comprehensive experiments have been conducted on NAS-Bench-201 and MedMNIST. 
The results have shown that the proposed method can achieve sufficiently accurate, high-throughput performance on MedMNIST and 20[Formula: see text]faster than the previous best method.</p>\",\"PeriodicalId\":50305,\"journal\":{\"name\":\"International Journal of Neural Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":6.6000,\"publicationDate\":\"2023-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Neural Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1142/S0129065723500168\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Neural Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1142/S0129065723500168","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
An Efficient Multi-Objective Evolutionary Zero-Shot Neural Architecture Search Framework for Image Classification.
Neural Architecture Search (NAS) has recently shown a powerful ability to engineer networks automatically for various tasks. Most current approaches steer the search with validation-performance-based architecture evaluation, which estimates an architecture's quality by training and validating it on a specific large dataset. However, for small-scale datasets, a model's performance on the validation set does not reliably predict its performance on the test set, and this imprecise architecture evaluation can mislead the search toward sub-optimal architectures. To address this problem, we propose an efficient multi-objective evolutionary zero-shot NAS framework that evaluates architectures with zero-cost metrics, which can be calculated on randomly initialized models in a training-free manner. Specifically, a general zero-cost metric design principle is proposed to unify current metrics and to help develop several new ones. We then offer an efficient computational method that obtains multiple zero-cost metrics in a single forward and backward pass. Finally, comprehensive experiments have been conducted on NAS-Bench-201 and MedMNIST. The results show that the proposed method achieves sufficiently accurate, high-throughput performance on MedMNIST while running 20× faster than the previous best method.
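As a rough illustration of the training-free evaluation idea described above, the sketch below (PyTorch; not the authors' implementation) computes two commonly used zero-cost proxies, a gradient-norm score and a SNIP-style saliency, from a single forward and backward pass on a randomly initialized model. The metric choices, the cross-entropy loss, and the helper names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: score a randomly initialized candidate architecture with
# several zero-cost metrics using one forward and one backward pass.
import torch
import torch.nn as nn


def zero_cost_metrics(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> dict:
    """Return a dict of training-free proxy scores for `model` on one mini-batch."""
    model.train()
    model.zero_grad()

    # Single forward pass with randomly initialized weights.
    outputs = model(inputs)
    loss = nn.functional.cross_entropy(outputs, targets)

    # Single backward pass populates gradients for every parameter;
    # all metrics below reuse these gradients, so no extra passes are needed.
    loss.backward()

    grad_norm, snip = 0.0, 0.0
    for param in model.parameters():
        if param.grad is None:
            continue
        grad_norm += param.grad.norm(p=2).item()                    # gradient-norm proxy
        snip += (param.grad * param.detach()).abs().sum().item()    # SNIP-style saliency

    return {"grad_norm": grad_norm, "snip": snip}


# Usage (hypothetical NAS candidate and mini-batch):
# model = build_candidate_architecture()
# scores = zero_cost_metrics(model, images, labels)
```

A multi-objective evolutionary search could then rank candidates on such scores directly, without training any network, which is the source of the speed-up the abstract reports.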
Journal Introduction:
The International Journal of Neural Systems is a monthly, rigorously peer-reviewed transdisciplinary journal focusing on information processing in both natural and artificial neural systems. Special interests include machine learning, computational neuroscience, and neurology. The journal prioritizes innovative, high-impact articles spanning multiple fields, including neuroscience, computer science, and engineering. It adopts an open-minded approach to this multidisciplinary field, serving as a platform for novel ideas and enhanced understanding of collective and cooperative phenomena in computationally capable systems.