Compressing CNN by alternating constraint optimization framework

Peidong Liu, Weirong Liu, Changhong Shi, Zhiqiang Zhang, Zhijun Li, Jie Liu
{"title":"Compressing CNN by alternating constraint optimization framework","authors":"Peidong Liu, Weirong Liu, Changhong Shi, Zhiqiang Zhang, Zhijun Li, Jie Liu","doi":"10.1117/12.2643734","DOIUrl":null,"url":null,"abstract":"Tensor decomposition has been extensively studied for convolutional neural networks (CNN) model compression. However, the direct decomposition of an uncompressed model into low-rank form causes unavoidable approximation error due to the lack of low-rank property of a pre-trained model. In this manuscript, a CNN model compression method using alternating constraint optimization framework (ACOF) is proposed. Firstly, ACOF formulates tensor decomposition-based model compression as a constraint optimization problem with low tensor rank constraints. This optimization problem is then solved systematically in an iterative manner using alternating direction method of multipliers (ADMM). During the alternating process, the uncompressed model gradually exhibits low-rank tensor property, and then the approximation error in low-rank tensor decomposition can be negligible. Finally, a high-performance CNN compression network can be effectively obtained by SGD-based fine-tuning. Extensive experimental results on image classification show that ACOF produces the optimal compressed model with high performance and low computational complexity. Notably, ACOF compresses Resnet56 to 28% without accuracy drop, and the compressed model have 1.14% higher accuracy than learning-compression (LC) method.","PeriodicalId":314555,"journal":{"name":"International Conference on Digital Image Processing","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Digital Image Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2643734","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Tensor decomposition has been extensively studied for convolutional neural network (CNN) model compression. However, directly decomposing an uncompressed model into low-rank form causes unavoidable approximation error, because a pre-trained model generally lacks the low-rank property. In this manuscript, a CNN model compression method using an alternating constraint optimization framework (ACOF) is proposed. First, ACOF formulates tensor decomposition-based model compression as a constrained optimization problem with low tensor-rank constraints. This optimization problem is then solved iteratively and systematically using the alternating direction method of multipliers (ADMM). During the alternating process, the uncompressed model gradually acquires the low-rank tensor property, so the approximation error introduced by the low-rank tensor decomposition becomes negligible. Finally, a high-performance compressed CNN is obtained by SGD-based fine-tuning. Extensive experiments on image classification show that ACOF produces compressed models with high accuracy and low computational complexity. Notably, ACOF compresses ResNet-56 to 28% of its original size without an accuracy drop, and the compressed model has 1.14% higher accuracy than that produced by the learning-compression (LC) method.
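
The pipeline the abstract describes follows the classic ADMM pattern for rank-constrained training: minimize the task loss L(W) subject to rank(W_l) <= r_l for each layer, alternating between a penalized SGD step on the weights, an SVD-based projection onto the rank constraint, and a dual-variable update, before final fine-tuning. The sketch below illustrates this general pattern in PyTorch for a single matrix-shaped weight; it is an assumption-laden illustration, not ACOF itself, and names such as rank_project, admm_compress, rho, and the loop counts are invented for the example.

# A minimal sketch (not the paper's implementation) of the ADMM alternation
# described above, for one matrix-shaped weight. A 4-D convolution kernel
# would first be unfolded into a 2-D matrix. `rank`, `rho`, and the loop
# counts are illustrative assumptions.
import torch

def rank_project(w, rank):
    # Project a 2-D matrix onto the set of matrices with rank <= `rank`
    # via truncated SVD (the Eckart-Young optimal projection).
    u, s, vh = torch.linalg.svd(w, full_matrices=False)
    return u[:, :rank] @ torch.diag(s[:rank]) @ vh[:rank, :]

def admm_compress(weight, loss_fn, rank=8, rho=1e-2, outer=20, inner=100, lr=1e-3):
    w = weight.clone().requires_grad_(True)         # trainable weights
    theta = rank_project(w.detach(), rank)          # low-rank auxiliary copy
    u = torch.zeros_like(theta)                     # scaled dual variable
    opt = torch.optim.SGD([w], lr=lr)
    for _ in range(outer):
        for _ in range(inner):                      # W-step: SGD on loss + quadratic penalty
            opt.zero_grad()
            penalty = 0.5 * rho * torch.sum((w - theta + u) ** 2)
            (loss_fn(w) + penalty).backward()
            opt.step()
        theta = rank_project(w.detach() + u, rank)  # Theta-step: SVD truncation
        u = u + w.detach() - theta                  # dual update
    return theta                                    # low-rank weights, ready for fine-tuning

# Toy usage: pull a random 64x64 "layer" toward rank 8 under a quadratic loss.
target = torch.randn(64, 64)
w_lowrank = admm_compress(torch.randn(64, 64),
                          lambda w: torch.sum((w - target) ** 2), rank=8)

As the alternation converges, w approaches its own rank-truncated copy theta, which is why the final SVD-based decomposition incurs negligible approximation error before fine-tuning.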