Compression of fully-connected layer in neural network by Kronecker product

Jia-Nan Wu
{"title":"Compression of fully-connected layer in neural network by Kronecker product","authors":"Jia-Nan Wu","doi":"10.1109/ICACI.2016.7449822","DOIUrl":null,"url":null,"abstract":"In this paper we propose and study a technique to reduce the number of parameters in fully-connected layers of neural networks using Kronecker product, at a mild cost of the prediction quality. The technique proceeds by replacing fully-connected layers with so-called Kronecker fully-connected layers, where the weight matrices of the fully-connected layers are approximated by linear combinations of multiple Kronecker products of smaller matrices. Just as the Kronecker product is a generalization of the outer product from vectors to matrices, our method is a generalization of the low rank approximation method for fully-connected layers. We also use combinations of different shapes of Kronecker product to increase modelling capacity. Experiments on SVHN, scene text recognition and ImageNet dataset demonstrate that we can achieve 10x reduction of number of parameters with less than 1% drop in accuracy, showing the effectiveness and efficiency of our method.","PeriodicalId":211040,"journal":{"name":"2016 Eighth International Conference on Advanced Computational Intelligence (ICACI)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"20","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 Eighth International Conference on Advanced Computational Intelligence (ICACI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICACI.2016.7449822","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 20

Abstract

In this paper we propose and study a technique to reduce the number of parameters in the fully-connected layers of neural networks using the Kronecker product, at a mild cost in prediction quality. The technique proceeds by replacing fully-connected layers with so-called Kronecker fully-connected layers, in which the weight matrices of the fully-connected layers are approximated by linear combinations of multiple Kronecker products of smaller matrices. Just as the Kronecker product generalizes the outer product from vectors to matrices, our method generalizes the low-rank approximation method for fully-connected layers. We also combine Kronecker products of different shapes to increase modelling capacity. Experiments on SVHN, scene text recognition, and the ImageNet dataset demonstrate that we can achieve a 10x reduction in the number of parameters with less than a 1% drop in accuracy, showing the effectiveness and efficiency of our method.
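To make the idea concrete, below is a minimal sketch of a Kronecker fully-connected layer in PyTorch. The framework, the class name KroneckerFC, the shape arguments, and the naive forward pass are all hypothetical choices for illustration under the abstract's description, not the authors' reference implementation:

import torch
import torch.nn as nn

class KroneckerFC(nn.Module):
    # Approximates an (m1*m2) x (n1*n2) dense weight matrix W as
    # W ~ sum_k kron(A_k, B_k), with A_k of shape (m1, n1) and
    # B_k of shape (m2, n2). The parameter count drops from
    # m1*m2*n1*n2 to rank*(m1*n1 + m2*n2).
    def __init__(self, m1, n1, m2, n2, rank=1):
        super().__init__()
        self.A = nn.Parameter(torch.randn(rank, m1, n1) * 0.01)
        self.B = nn.Parameter(torch.randn(rank, m2, n2) * 0.01)
        self.bias = nn.Parameter(torch.zeros(m1 * m2))

    def forward(self, x):
        # Naive version: materialize W explicitly for clarity. An
        # efficient implementation would instead use the identity
        # (A kron B) vec(X) = vec(B X A^T) to avoid building W.
        W = sum(torch.kron(a, b) for a, b in zip(self.A, self.B))
        return x @ W.T + self.bias

# Example: replace a 1024 -> 1024 dense layer (1,048,576 weights)
# with a rank-1 Kronecker layer holding 32*32 + 32*32 = 2,048 weights.
layer = KroneckerFC(m1=32, n1=32, m2=32, n2=32, rank=1)
y = layer(torch.randn(8, 1024))  # output shape: (8, 1024)

With rank=1 this is a single Kronecker factorization; increasing rank gives the linear combinations described in the abstract, and mixing layers with different (m1, n1, m2, n2) splits corresponds to the different-shape combinations the authors use to increase modelling capacity.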