{"title":"A Novel Filter Pruning Algorithm for Vision Tasks based on Kernel Grouping","authors":"Jongmin Lee, A. Elibol, N. Chong","doi":"10.1109/ur55393.2022.9826290","DOIUrl":null,"url":null,"abstract":"Although the size and the computation cost of the state of the art deep learning models are tremendously large, they run without any problem when implemented on computers thanks to the remarkable enhancements and advancements of computers. However, the problem is likely to be faced when the need for deploying them on mobile platforms arises. Model compression techniques such as filter pruning or knowledge distillation help to reduce the size of deep learning models. However the conventional methods contain sorting algorithms therefore they cannot be applied to models that have reshaping layers like involution. In this research, we revisit a model compression algorithm named Model Diet that can be both applied to involution and convolution models. Furthermore, we present its application on two different tasks, image segmentation and depth estimation.","PeriodicalId":398742,"journal":{"name":"2022 19th International Conference on Ubiquitous Robots (UR)","volume":"94 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 19th International Conference on Ubiquitous Robots (UR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ur55393.2022.9826290","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Although state-of-the-art deep learning models are very large in both size and computation cost, they run without difficulty on desktop and server computers thanks to continual advances in hardware. Problems arise, however, when these models must be deployed on mobile platforms. Model compression techniques such as filter pruning and knowledge distillation help reduce the size of deep learning models, but conventional methods rely on sorting algorithms and therefore cannot be applied to models with reshaping layers such as involution. In this research, we revisit a model compression algorithm named Model Diet that can be applied to both involution and convolution models, and we present its application to two different tasks: image segmentation and depth estimation.
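The "sorting algorithms" contrasted against here are the filter-ranking step in conventional magnitude-based pruning: each filter is scored by a saliency measure (commonly its L1 norm), the filters are sorted by score, and the lowest-ranked ones are removed. A minimal sketch in plain Python, assuming the L1-norm criterion as a representative "conventional method" (this is illustrative only and is not the Model Diet algorithm from the paper):

```python
def prune_filters_by_l1(filters, keep_ratio):
    """Return the sorted indices of filters to keep.

    filters: list of filters, each a flat list of weight values
             (e.g. a flattened in_channels x kH x kW kernel).
    keep_ratio: fraction of filters to retain, e.g. 0.5.
    """
    # Saliency score: L1 norm (sum of absolute weights) per filter.
    norms = [sum(abs(w) for w in f) for f in filters]
    n_keep = max(1, round(len(filters) * keep_ratio))
    # The sorting step that reshaping layers like involution break:
    # rank filter indices by descending saliency.
    ranked = sorted(range(len(filters)), key=lambda i: norms[i], reverse=True)
    # Keep the top-ranked filters, restoring their original layer order.
    return sorted(ranked[:n_keep])

# Toy layer with four filters; L1 norms are 0.3, 2.0, 0.05, 0.9.
layer = [[0.1, -0.2], [1.0, 1.0], [0.0, 0.05], [-0.5, 0.4]]
print(prune_filters_by_l1(layer, keep_ratio=0.5))  # [1, 3]
```

Because the surviving filter indices must then be used to slice the next layer's input channels, this scheme assumes a fixed channel ordering between layers; a reshaping layer such as involution breaks that assumption, which is the incompatibility the abstract points out.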