Libin Hou, Linyuan Wang, Senbao Hou, Tianyuan Liu, Shuxiao Ma, Jian Chen, Bin Yan
{"title":"Low redundancy cell-based Neural Architecture Search for large convolutional neural networks","authors":"Libin Hou , Linyuan Wang , Senbao Hou, Tianyuan Liu, Shuxiao Ma, Jian Chen, Bin Yan","doi":"10.1016/j.neucom.2025.130644","DOIUrl":null,"url":null,"abstract":"<div><div>The cell-based search space is one of the main paradigms in Neural Architecture Search (NAS). However, the current research on this search space tends to optimize on small-size models, and the performance improvement of NAS might be stuck in a bottleneck. This situation has led to a growing performance gap between NAS and hand-designed models in recent years. In this paper, we focus on how to effectively expand the cell-based search space and proposes <em>Low redundancy Cell-based Neural Architecture Search for Large Convolutional neural networks</em> (<span><math><mrow><mi>L</mi><msup><mrow><mi>C</mi></mrow><mrow><mn>2</mn></mrow></msup><mi>N</mi><mi>A</mi><mi>S</mi></mrow></math></span>), a gradient-based NAS method to search large-scale convolutional models with better performance based on low redundant cell search space. Specifically, a cell-based search space with low redundancy and large kernel is designed. Then train and sample a super network under computational constraints. Finally the network structure is optimized by gradient-based search. Experimental results show that the performance of the proposed search method is comparable to the popular hand-designed models in recent years at different scales. Moreover, LC-NASNet-B achieves an 83.7% classification accuracy on the ImageNet-1k dataset with 86.2M parameters, surpassing previous NAS methods and comparable to the most prominent hand-designed models.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"649 ","pages":"Article 130644"},"PeriodicalIF":5.5000,"publicationDate":"2025-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225013165","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Low redundancy cell-based Neural Architecture Search for large convolutional neural networks
The cell-based search space is one of the main paradigms in Neural Architecture Search (NAS). However, current research on this search space tends to optimize small-scale models, and the performance gains from NAS may be stuck in a bottleneck. This situation has led to a growing performance gap between NAS and hand-designed models in recent years. In this paper, we focus on how to effectively expand the cell-based search space and propose Low redundancy Cell-based Neural Architecture Search for Large Convolutional neural networks (LC²NAS), a gradient-based NAS method that searches for large-scale convolutional models with better performance over a low-redundancy cell search space. Specifically, a cell-based search space with low redundancy and large kernels is designed. A super network is then trained and sampled under computational constraints. Finally, the network structure is optimized by gradient-based search. Experimental results show that the proposed search method performs comparably, at different scales, to popular hand-designed models from recent years. Moreover, LC-NASNet-B achieves 83.7% classification accuracy on the ImageNet-1k dataset with 86.2M parameters, surpassing previous NAS methods and matching the most prominent hand-designed models.
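The pipeline the abstract describes (a reduced candidate-operation set with large kernels, supernet training, then gradient-based architecture optimization) follows the general recipe of gradient-based cell search. Below is a minimal, self-contained PyTorch sketch of that recipe, assuming a DARTS-style continuous relaxation; the candidate set (including a hypothetical 31x31 large-kernel op), the chain-shaped cell topology, and all hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of gradient-based cell search with a low-redundancy,
# large-kernel candidate set. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_bn(c, k):
    # depthwise conv of kernel size k + pointwise conv, a common way to make
    # large-kernel candidates affordable; padding keeps spatial size fixed
    return nn.Sequential(
        nn.Conv2d(c, c, k, padding=k // 2, groups=c, bias=False),
        nn.Conv2d(c, c, 1, bias=False),
        nn.BatchNorm2d(c),
    )

# A deliberately small candidate set: dropping near-duplicate operations
# (e.g. several pooling variants) is one way to reduce search-space redundancy.
CANDIDATES = {
    "skip":   lambda c: nn.Identity(),
    "conv3":  lambda c: conv_bn(c, 3),
    "conv7":  lambda c: conv_bn(c, 7),
    "conv31": lambda c: conv_bn(c, 31),  # illustrative large-kernel candidate
}

class MixedOp(nn.Module):
    """Softmax-weighted sum of candidate ops; weights are architecture params."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(f(channels) for f in CANDIDATES.values())

    def forward(self, x, alpha):
        w = F.softmax(alpha, dim=0)  # relax the discrete op choice
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

class Cell(nn.Module):
    """One searchable cell: a chain of mixed edges (simplified topology)."""
    def __init__(self, channels, n_edges=3):
        super().__init__()
        self.edges = nn.ModuleList(MixedOp(channels) for _ in range(n_edges))
        # architecture parameters, optimized by gradient descent
        self.alpha = nn.Parameter(1e-3 * torch.randn(n_edges, len(CANDIDATES)))

    def forward(self, x):
        for edge, a in zip(self.edges, self.alpha):
            x = edge(x, a)
        return x

# Weights and architecture parameters get separate optimizers; a single-level
# update is used here for brevity (DARTS itself alternates on two data splits).
cell = Cell(channels=16)
w_opt = torch.optim.SGD(
    (p for n, p in cell.named_parameters() if n != "alpha"), lr=0.025)
a_opt = torch.optim.Adam([cell.alpha], lr=3e-4)

x = torch.randn(8, 16, 32, 32)
target = torch.randn(8, 16, 32, 32)  # stand-in objective for illustration
for step in range(10):
    loss = F.mse_loss(cell(x), target)
    w_opt.zero_grad(); a_opt.zero_grad()
    loss.backward()
    w_opt.step(); a_opt.step()

# After search, each edge keeps its argmax candidate ("discretization").
print(cell.alpha.argmax(dim=1))
```

The sketch compresses the abstract's three stages into one loop; in practice the supernet would be trained and sampled under an explicit compute budget before the architecture parameters are discretized.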
About the journal:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The journal covers neurocomputing theory, practice, and applications.