Low redundancy cell-based Neural Architecture Search for large convolutional neural networks

Libin Hou, Linyuan Wang, Senbao Hou, Tianyuan Liu, Shuxiao Ma, Jian Chen, Bin Yan

Neurocomputing, Volume 649, Article 130644 (published 2025-06-13). DOI: 10.1016/j.neucom.2025.130644
URL: https://www.sciencedirect.com/science/article/pii/S0925231225013165
Impact Factor 5.5 · JCR Q1, Computer Science, Artificial Intelligence · CAS Tier 2, Computer Science
Citations: 0

Abstract

The cell-based search space is one of the main paradigms in Neural Architecture Search (NAS). However, current research on this search space tends to optimize small-size models, and the performance improvement of NAS may be stuck in a bottleneck. This situation has led to a growing performance gap between NAS and hand-designed models in recent years. In this paper, we focus on how to effectively expand the cell-based search space and propose Low redundancy Cell-based Neural Architecture Search for Large Convolutional neural networks (LC²NAS), a gradient-based NAS method that searches large-scale convolutional models with better performance over a low-redundancy cell search space. Specifically, a cell-based search space with low redundancy and large kernels is designed. A super network is then trained and sampled under computational constraints. Finally, the network structure is optimized by gradient-based search. Experimental results show that the performance of the proposed search method is comparable to popular hand-designed models of recent years at different scales. Moreover, LC-NASNet-B achieves 83.7% classification accuracy on the ImageNet-1k dataset with 86.2M parameters, surpassing previous NAS methods and comparable to the most prominent hand-designed models.
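The abstract's final step, gradient-based structure optimization, follows the continuous-relaxation idea used by differentiable cell-based NAS methods (as popularized by DARTS): every edge of a cell holds a learnable weight per candidate operation, the forward pass mixes all candidates via a softmax over those weights, and after search only the highest-weighted operation is kept. The paper's actual operator set, supernet, and constraint handling are not reproduced here; the sketch below is a toy, pure-Python illustration in which hypothetical scalar "operations" stand in for convolutions and skip connections.

```python
import math

# Toy candidate operations on one edge of a cell. In a real search space
# these would be conv3x3, large-kernel conv, skip-connect, etc.
OPS = {
    "skip": lambda x: x,
    "double": lambda x: 2.0 * x,
    "neg": lambda x: -x,
}

def softmax(alphas):
    """Softmax over the architecture parameters (one weight per candidate op)."""
    m = max(alphas)
    exps = [math.exp(a - m) for a in alphas]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, alphas):
    """Continuous relaxation: output is the softmax-weighted sum of all ops,
    so the choice of operation becomes differentiable in `alphas`."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, OPS.values()))

def discretize(alphas):
    """After search, keep only the operation with the largest weight."""
    names = list(OPS)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]
```

With equal weights the mixed output is simply the average of the candidate outputs; as gradient descent drives one `alpha` up, the edge converges toward a single operation, which `discretize` then extracts (e.g. `discretize([0.1, 2.0, -1.0])` selects `"double"`).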
Source journal: Neurocomputing (Engineering & Technology — Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.