Junhao Huang , Bing Xue , Yanan Sun , Mengjie Zhang , Gary G. Yen
{"title":"Automated design of neural networks with multi-scale convolutions via multi-path weight sampling","authors":"Junhao Huang , Bing Xue , Yanan Sun , Mengjie Zhang , Gary G. Yen","doi":"10.1016/j.patcog.2025.111605","DOIUrl":null,"url":null,"abstract":"<div><div>The performance of convolutional neural networks (CNNs) relies heavily on the architecture design. Recently, an increasingly prevalent trend in CNN architecture design is the utilization of ingeniously crafted building blocks, e.g., the MixConv module, for improving the model expressivity and efficiency. To leverage the feature learning capability of multi-scale convolution while further reducing its computational complexity, this paper presents a computationally efficient yet powerful module, dubbed EMixConv, by combining parameter-free concatenation-based feature reuse with multi-scale convolution. In addition, we propose a one-shot neural architecture search (NAS) method integrating the EMixConv module to automatically search for the optimal combination of the related architectural parameters. Furthermore, an efficient multi-path weight sampling mechanism is developed to enhance the robustness of weight inheritance in the supernet. We demonstrate the effectiveness of the proposed module and the NAS algorithm on three popular image classification tasks. The developed models, dubbed EMixNets, outperform most state-of-the-art architectures with fewer parameters and computations on the CIFAR datasets. 
On ImageNet, EMixNet is superior to a majority of compared methods and is also more compact and computationally efficient.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"165 ","pages":"Article 111605"},"PeriodicalIF":7.5000,"publicationDate":"2025-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320325002651","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
The performance of convolutional neural networks (CNNs) relies heavily on architecture design. An increasingly prevalent trend in CNN architecture design is the use of carefully crafted building blocks, e.g., the MixConv module, to improve model expressivity and efficiency. To leverage the feature-learning capability of multi-scale convolution while further reducing its computational complexity, this paper presents a computationally efficient yet powerful module, dubbed EMixConv, which combines parameter-free concatenation-based feature reuse with multi-scale convolution. In addition, we propose a one-shot neural architecture search (NAS) method integrating the EMixConv module to automatically search for the optimal combination of the related architectural parameters. Furthermore, an efficient multi-path weight sampling mechanism is developed to enhance the robustness of weight inheritance in the supernet. We demonstrate the effectiveness of the proposed module and the NAS algorithm on three popular image classification tasks. The resulting models, dubbed EMixNets, outperform most state-of-the-art architectures with fewer parameters and computations on the CIFAR datasets. On ImageNet, EMixNet is superior to a majority of compared methods while also being more compact and computationally efficient.
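To make the core idea concrete, the following is a minimal NumPy sketch of a MixConv-style multi-scale block with parameter-free feature reuse, in the spirit described by the abstract: channel groups are convolved depthwise with different kernel sizes, while a few input channels are concatenated to the output unchanged, adding no parameters. This is an illustrative reconstruction, not the authors' EMixConv implementation; the group split, kernel sizes, and the `reuse` parameter are assumptions for the example.

```python
import numpy as np

def depthwise_conv2d(x, k):
    """Naive depthwise 2-D convolution, 'same' zero padding, stride 1.
    x: (C, H, W) feature map, k: (C, kh, kw) one filter per channel."""
    C, H, W = x.shape
    kh, kw = k.shape[1:]
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((0, 0), (ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + kh, j:j + kw] * k[c])
    return out

def multi_scale_block(x, kernel_sizes=(3, 5, 7), reuse=2, seed=0):
    """MixConv-style block: split the first C - reuse channels into one
    group per kernel size and convolve each group depthwise at that scale;
    the last `reuse` channels are passed through unchanged (parameter-free
    concatenation-based feature reuse). All outputs are concatenated."""
    C = x.shape[0]
    conv_ch = C - reuse
    groups = np.array_split(np.arange(conv_ch), len(kernel_sizes))
    rng = np.random.default_rng(seed)  # random weights, illustration only
    outs = []
    for g, ks in zip(groups, kernel_sizes):
        k = rng.standard_normal((len(g), ks, ks)) * 0.1
        outs.append(depthwise_conv2d(x[g], k))
    outs.append(x[conv_ch:])  # reused channels: zero extra parameters
    return np.concatenate(outs, axis=0)
```

Note that the reused channels contribute no weights or multiply-adds, which is how concatenation-based reuse trims the cost of a plain multi-scale convolution; a real implementation would use a framework's grouped convolution rather than these explicit loops.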
About the journal:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.