CSC-DARTS: Efficient differentiable neural architecture search using channel splitting connections
Hui Wei, Feifei Lee, Lin Xie, Li Liu, Hongliu Yu, Qiu Chen
Information Sciences, Volume 720, Article 122538 (published 2025-07-23). DOI: 10.1016/j.ins.2025.122538
Citations: 0
Abstract
Recently, differentiable architecture search (DARTS) has made great progress in decreasing the computational cost of Neural Architecture Search (NAS). However, training the supernet still incurs excessive memory access costs. In this paper, we propose an efficient search framework for differentiable architecture search using channel splitting connections, namely CSC-DARTS, based on bi-level optimization and second-order gradient approximation. Specifically, a "Channel Splitting" technique is developed that splits the feature maps of the supernet at the channel level into two branches: one branch is sent to the operation selection, reducing redundancy when exploring the search space, while the remaining channels bypass the operations and are routed directly to the output as feature reuse. In addition, an "identity" connection is adopted in both the search and evaluation phases, acting as a regularizer that reduces variability and bridges the large gap caused by the inconsistent architecture depth between the two stages. Experimental results on the benchmark datasets CIFAR-10, CIFAR-100, and ImageNet demonstrate that CSC-DARTS achieves state-of-the-art performance with fewer GPU resources, including a test error of 2.52% on CIFAR-10, an average test error of 17.25% on CIFAR-100, and top-1/top-5 accuracies of 74.5%/91.7% on ImageNet.
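The channel-splitting connection described above can be illustrated with a minimal sketch. This is an assumption-laden toy in NumPy, not the authors' implementation: the split ratio, the candidate operation, and the function name are all hypothetical, and the actual CSC-DARTS supernet applies this per edge with learned architecture weights.

```python
import numpy as np

def channel_splitting_connection(x, op, split_ratio=0.25):
    """Toy sketch of a channel-splitting connection (hypothetical helper).

    A fraction of the input channels is sent through the candidate
    operation (stand-in for DARTS operation selection); the remaining
    channels bypass it and are concatenated to the output as feature
    reuse, which is what cuts memory cost during supernet training.
    """
    c = x.shape[1]                       # x: (batch, channels, H, W)
    k = max(1, int(c * split_ratio))     # channels routed to the operation
    selected, bypassed = x[:, :k], x[:, k:]
    out = op(selected)                   # only k channels pay the op cost
    return np.concatenate([out, bypassed], axis=1)
```

With a split ratio of 1/4, only a quarter of the channels flow through the (typically memory-hungry) mixed operation; the bypassed three quarters are reused unchanged, so activation memory for the operation branch shrinks proportionally.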
About the Journal
Information Sciences (Informatics and Computer Science, Intelligent Systems Applications) is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.