{"title":"SequenceOut: Boosting CNNs by Freezing Layers","authors":"Shitala Prasad;Rakesh Paul;Mayur Kamat","doi":"10.1109/LSP.2025.3553430","DOIUrl":null,"url":null,"abstract":"Convolutional neural networks (CNNs) are a powerful tool for various computer vision tasks, demonstrating exceptional performance in image classification, object detection, and segmentation. However, traditional training methods often require meticulous hyperparameter tuning, architectural adjustments, or the introduction of additional data through techniques such as data augmentation to achieve optimal accuracy. This letter introduces an innovative training strategy that leverages layer freezing to enhance the training process while keeping the model's architecture and hyperparameters unchanged. By selectively and progressively freezing certain hidden layers in the CNN, we prevent the model from reaching a saturation point. This approach effectively reduces the backpropagation parameter space, facilitating more focused and efficient learning in the remaining layers.","PeriodicalId":13154,"journal":{"name":"IEEE Signal Processing Letters","volume":"32 ","pages":"1401-1405"},"PeriodicalIF":3.2000,"publicationDate":"2025-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Signal Processing Letters","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10935680/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citation count: 0
Abstract
Convolutional neural networks (CNNs) are a powerful tool for various computer vision tasks, demonstrating exceptional performance in image classification, object detection, and segmentation. However, traditional training methods often require meticulous hyperparameter tuning, architectural adjustments, or the introduction of additional data through techniques such as data augmentation to achieve optimal accuracy. This letter introduces an innovative training strategy that leverages layer freezing to enhance the training process while keeping the model's architecture and hyperparameters unchanged. By selectively and progressively freezing certain hidden layers in the CNN, we prevent the model from reaching a saturation point. This approach effectively reduces the backpropagation parameter space, facilitating more focused and efficient learning in the remaining layers.
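The abstract describes progressively freezing hidden layers during training but does not specify the schedule. As an illustrative sketch only, the helper below models one plausible progressive-freezing schedule: every `freeze_every` epochs, one more layer (front to back) is frozen, while the final layer is always left trainable. The function and layer names are hypothetical, not taken from the paper.

```python
def frozen_layers(epoch, layer_names, freeze_every=10):
    """Return the names of layers to freeze at the given epoch.

    Hypothetical schedule (not from the paper): one additional layer is
    frozen every `freeze_every` epochs, starting from the earliest layer.
    The last layer is never frozen so the classifier keeps learning.
    """
    n_frozen = min(epoch // freeze_every, len(layer_names) - 1)
    return layer_names[:n_frozen]
```

In a typical deep-learning framework, freezing would then amount to excluding the returned layers' parameters from gradient updates (e.g., in PyTorch, setting `requires_grad = False` on their parameters), which shrinks the backpropagation parameter space as the abstract describes.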
Journal introduction:
The IEEE Signal Processing Letters is a monthly, archival publication designed to provide rapid dissemination of original, cutting-edge ideas and timely, significant contributions in signal, image, speech, language and audio processing. Papers published in the Letters can be presented within one year of their appearance at signal processing conferences such as ICASSP, GlobalSIP and ICIP, as well as at several workshops organized by the Signal Processing Society.