COLT: Cyclic Overlapping Lottery Tickets for Faster Pruning of Convolutional Neural Networks

Md. Ismail Hossain;Mohammed Rakib;M. M. Lutfe Elahi;Nabeel Mohammed;Shafin Rahman
IEEE Transactions on Artificial Intelligence, vol. 6, no. 6, pp. 1664-1678. Published 2025-01-28. DOI: 10.1109/TAI.2025.3534745. Available at: https://ieeexplore.ieee.org/document/10855806/. Citation count: 0.

Abstract

COLT: Cyclic Overlapping Lottery Tickets for Faster Pruning of Convolutional Neural Networks
Pruning refers to the elimination of trivial weights from neural networks. The sub-networks within an overparameterized model produced after pruning are often called lottery tickets. This research aims to generate, from a set of lottery tickets, winning tickets that can achieve accuracy similar to that of the original unpruned network. We introduce a novel winning ticket called the cyclic overlapping lottery ticket (COLT), produced by data splitting and cyclic retraining of the pruned network from scratch. We apply a cyclic pruning algorithm that keeps only the overlapping weights of different pruned models trained on different data segments. Our results demonstrate that COLT can achieve accuracies similar to those obtained by the unpruned model while maintaining high sparsity. On object recognition and detection tasks, we show that the accuracy of COLT is on par with the winning tickets of the lottery ticket hypothesis and, at times, better. Moreover, COLTs can be generated in fewer iterations than tickets produced by the popular iterative magnitude pruning method. In addition, we observe that COLTs generated on large datasets can be transferred to small ones without compromising performance, demonstrating their generalizing capability. We conduct all our experiments on the Cifar-10, Cifar-100, TinyImageNet, and ImageNet datasets and report performance superior to state-of-the-art methods. The code is available at: https://github.com/ismail31416/COLT.
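The core step described in the abstract — keeping only the weights that survive magnitude pruning in models trained on different data segments — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the two-split setup, and the use of NumPy arrays in place of real network layers are all assumptions made for clarity.

```python
import numpy as np

def magnitude_mask(weights, sparsity):
    """Boolean mask keeping the largest-magnitude (1 - sparsity) fraction of weights."""
    k = int(weights.size * (1.0 - sparsity))  # number of weights to keep
    if k == 0:
        return np.zeros(weights.shape, dtype=bool)
    threshold = np.sort(np.abs(weights), axis=None)[-k]  # k-th largest magnitude
    return np.abs(weights) >= threshold

def overlap_mask(weights_a, weights_b, sparsity):
    """Keep only weights that survive magnitude pruning in BOTH models
    (the 'overlapping weights' idea behind COLT, sketched for one layer)."""
    return magnitude_mask(weights_a, sparsity) & magnitude_mask(weights_b, sparsity)

# Toy example: two hypothetical weight tensors from models trained on
# different data segments.
rng = np.random.default_rng(0)
w_a = rng.normal(size=(4, 4))
w_b = rng.normal(size=(4, 4))

mask = overlap_mask(w_a, w_b, sparsity=0.5)
pruned_a = w_a * mask  # weights outside the overlap are zeroed
```

The intersection can only be as large as either individual mask, so each cycle of pruning and retraining on the masked network drives sparsity higher, which matches the abstract's claim of reaching high sparsity in fewer iterations than plain iterative magnitude pruning.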
Source journal metrics: CiteScore 7.70; self-citation rate 0.00%.