Evolutionary Pruning of Deep Convolutional Networks by a Memetic GA with Sped-Up Local Optimization and GLCM Energy Z-Score

Hana Cho, Han Joon Byun, Min Kee Kim, Joon Huh, Byung-Ro Moon
{"title":"基于加速局部优化和GLCM能量Z-Score的Memetic GA深度卷积网络进化剪枝","authors":"Hana Cho, Han Joon Byun, Min Kee Kim, Joon Huh, Byung-Ro Moon","doi":"10.1145/3583133.3590604","DOIUrl":null,"url":null,"abstract":"This paper introduces a novel method of selecting the most significant filters in deep neural networks. We performed model simplification via pruning with Genetic Algorithm (GA) for trained deep networks. Pure GA has a weakness of local tuning and slow convergence, so it is not easy to produce good results for problems with large problem space such as ours. We present new ideas that overcome some of GA's weaknesses. These include efficient local optimization, as well as reducing the time of evaluation which occupies most of the running time. Additional time was saved by restricting the filters to preserve using the GLCM (Gray-Level Co-occurrence Matrix) to determine the usefulness of the filters. Ultimately, the saved time was used to perform more iterations, providing the opportunity to further optimize the network. The experimental result showed more than 95% of reduction in forward convolution computation with negligible performance degradation.","PeriodicalId":422029,"journal":{"name":"Proceedings of the Companion Conference on Genetic and Evolutionary Computation","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Evolutionary Pruning of Deep Convolutional Networks by a Memetic GA with Sped-Up Local Optimization and GLCM Energy Z-Score\",\"authors\":\"Hana Cho, Han Joon Byun, Min Kee Kim, Joon Huh, Byung-Ro Moon\",\"doi\":\"10.1145/3583133.3590604\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper introduces a novel method of selecting the most significant filters in deep neural networks. We performed model simplification via pruning with Genetic Algorithm (GA) for trained deep networks. Pure GA has a weakness of local tuning and slow convergence, so it is not easy to produce good results for problems with large problem space such as ours. We present new ideas that overcome some of GA's weaknesses. These include efficient local optimization, as well as reducing the time of evaluation which occupies most of the running time. Additional time was saved by restricting the filters to preserve using the GLCM (Gray-Level Co-occurrence Matrix) to determine the usefulness of the filters. Ultimately, the saved time was used to perform more iterations, providing the opportunity to further optimize the network. 
The experimental result showed more than 95% of reduction in forward convolution computation with negligible performance degradation.\",\"PeriodicalId\":422029,\"journal\":{\"name\":\"Proceedings of the Companion Conference on Genetic and Evolutionary Computation\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-07-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Companion Conference on Genetic and Evolutionary Computation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3583133.3590604\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Companion Conference on Genetic and Evolutionary Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3583133.3590604","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This paper introduces a novel method of selecting the most significant filters in deep neural networks. We performed model simplification via pruning with a Genetic Algorithm (GA) for trained deep networks. A pure GA is weak at local tuning and converges slowly, so it does not easily produce good results for problems with a search space as large as ours. We present new ideas that overcome some of the GA's weaknesses. These include efficient local optimization, as well as reducing the evaluation time, which occupies most of the running time. Additional time was saved by restricting the set of filters to preserve, using the GLCM (Gray-Level Co-occurrence Matrix) to determine the usefulness of each filter. Ultimately, the saved time was used to perform more iterations, providing the opportunity to further optimize the network. The experimental results showed more than a 95% reduction in forward convolution computation with negligible performance degradation.
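The abstract states that a GLCM energy z-score is used to decide which filters must be preserved, but gives no implementation details. The following is a minimal sketch, not the authors' code: the quantization to 8 gray levels, the horizontal co-occurrence offset, scoring each filter by one activation map, and the z-score threshold are all assumptions made purely for illustration.

```python
import numpy as np

def glcm_energy(feature_map: np.ndarray, levels: int = 8) -> float:
    """Energy (sum of squared probabilities) of a horizontal-offset GLCM."""
    fmap = feature_map.astype(np.float64)
    lo, hi = fmap.min(), fmap.max()
    if hi - lo < 1e-12:                 # constant map -> all mass in one cell
        return 1.0
    q = np.floor((fmap - lo) / (hi - lo) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels), dtype=np.float64)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()   # horizontal neighbor pairs
    np.add.at(glcm, (left, right), 1.0)                 # accumulate co-occurrences
    glcm /= glcm.sum()
    return float((glcm ** 2).sum())

def filters_to_preserve(feature_maps: np.ndarray, z_threshold: float = 1.0) -> np.ndarray:
    """Indices of filters whose GLCM-energy z-score exceeds a threshold.

    feature_maps: shape (num_filters, H, W), e.g. one activation map per
    filter from a small calibration batch (an assumption, not from the paper).
    """
    energies = np.array([glcm_energy(fm) for fm in feature_maps])
    z = (energies - energies.mean()) / (energies.std() + 1e-12)
    return np.where(z > z_threshold)[0]

# Toy usage: 16 random 14x14 activation maps.
maps = np.random.rand(16, 14, 14)
print(filters_to_preserve(maps))
```

Whether filters with high or low energy z-scores are the ones worth preserving is a design choice the abstract does not specify; the sketch keeps high-z filters only as an example of applying the threshold.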
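To make the overall approach concrete, here is a minimal, hypothetical sketch of a memetic GA over a binary "keep this filter" mask. The fitness function is a placeholder, the greedy bit-flip local search stands in for the paper's sped-up local optimization, and the evaluation-time reductions described in the abstract are not reproduced; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_FILTERS = 64

def fitness(mask: np.ndarray) -> float:
    """Placeholder: reward pruning while penalizing a proxy for accuracy loss."""
    kept = int(mask.sum())
    accuracy_proxy = 1.0 - 0.5 * max(0, 8 - kept) / 8   # hypothetical stand-in
    return accuracy_proxy - 0.01 * kept                  # fewer kept filters preferred

def local_search(mask: np.ndarray, budget: int = 16) -> np.ndarray:
    """Greedy first-improvement bit flips (the 'memetic' local step)."""
    best = fitness(mask)
    for i in rng.permutation(len(mask))[:budget]:
        mask[i] ^= 1
        f = fitness(mask)
        if f > best:
            best = f
        else:
            mask[i] ^= 1        # revert a non-improving flip
    return mask

def evolve(pop_size: int = 20, generations: int = 50) -> np.ndarray:
    pop = rng.integers(0, 2, size=(pop_size, NUM_FILTERS))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]          # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(NUM_FILTERS) < 0.5, a, b)   # uniform crossover
            flip = rng.random(NUM_FILTERS) < 1.0 / NUM_FILTERS      # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            children.append(local_search(child))
        pop = np.array(children)
    return pop[np.argmax([fitness(ind) for ind in pop])]

best_mask = evolve()
print("kept filters:", int(best_mask.sum()), "of", NUM_FILTERS)
```

In a real pruning setting the fitness call would retrain or fine-tune the network under the candidate mask and measure validation accuracy, which is exactly the expensive evaluation the paper works to speed up.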