Toan Pham Van, T. Tung, Linh Bao Doan, Thanh Ta Minh
{"title":"一种无原始训练数据集的神经网络预训练剪枝进化方法","authors":"Toan Pham Van, T. Tung, Linh Bao Doan, Thanh Ta Minh","doi":"10.18178/ijke.2022.8.1.136","DOIUrl":null,"url":null,"abstract":"—Model pruning is an important technique in real-world machine learning problems, especially in deep learning. This technique has provided some methods for compressing a large model to a smaller model while retaining the most accuracy. However, a majority of these approaches require a full original training set. This might not always be possible in practice if the model is trained in a large-scale dataset or on a dataset whose release poses privacy. Although we cannot access the original training set in some cases, pre-trained models are available more often. This paper aims to solve the model pruning problem without the initial training set by finding the sub-networks in the initial pre-trained model. We propose an approach of using genetic algorithms (GA) to find the sub-networks systematically and automatically. Experimental results show that our algorithm can find good sub-networks efficiently. Theoretically, if we had unlimited time and hardware power, we could find the optimized sub-networks of any pre-trained model and achieve the best results in the future. Our code and pre-trained models are available at: https://github.com/sun-asterisk-research/ga_pruning_research.","PeriodicalId":88527,"journal":{"name":"International journal of knowledge engineering and soft data paradigms","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Evolution Approach for Pre-trained Neural Network Pruning without Original Training Dataset\",\"authors\":\"Toan Pham Van, T. Tung, Linh Bao Doan, Thanh Ta Minh\",\"doi\":\"10.18178/ijke.2022.8.1.136\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"—Model pruning is an important technique in real-world machine learning problems, especially in deep learning. This technique has provided some methods for compressing a large model to a smaller model while retaining the most accuracy. However, a majority of these approaches require a full original training set. This might not always be possible in practice if the model is trained in a large-scale dataset or on a dataset whose release poses privacy. Although we cannot access the original training set in some cases, pre-trained models are available more often. This paper aims to solve the model pruning problem without the initial training set by finding the sub-networks in the initial pre-trained model. We propose an approach of using genetic algorithms (GA) to find the sub-networks systematically and automatically. Experimental results show that our algorithm can find good sub-networks efficiently. Theoretically, if we had unlimited time and hardware power, we could find the optimized sub-networks of any pre-trained model and achieve the best results in the future. 
Our code and pre-trained models are available at: https://github.com/sun-asterisk-research/ga_pruning_research.\",\"PeriodicalId\":88527,\"journal\":{\"name\":\"International journal of knowledge engineering and soft data paradigms\",\"volume\":\"1 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of knowledge engineering and soft data paradigms\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.18178/ijke.2022.8.1.136\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of knowledge engineering and soft data paradigms","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18178/ijke.2022.8.1.136","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Evolution Approach for Pre-trained Neural Network Pruning without Original Training Dataset
Model pruning is an important technique in real-world machine learning problems, especially in deep learning. It compresses a large model into a smaller one while retaining most of the original accuracy. However, the majority of pruning approaches require the full original training set, which may not be available in practice if the model was trained on a large-scale dataset or on data whose release would raise privacy concerns. Even when the original training set is inaccessible, the pre-trained model itself is often available. This paper aims to solve the model pruning problem without the original training set by finding sub-networks within the pre-trained model. We propose an approach that uses genetic algorithms (GAs) to find such sub-networks systematically and automatically. Experimental results show that our algorithm finds good sub-networks efficiently. Theoretically, given unlimited time and hardware, this search could find an optimized sub-network of any pre-trained model. Our code and pre-trained models are available at: https://github.com/sun-asterisk-research/ga_pruning_research.
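To make the idea concrete, below is a minimal sketch of a data-free GA search over pruning masks. The abstract does not specify the paper's actual mask encoding, fitness function, or GA operators, so everything here is an illustrative assumption: sub-networks are encoded as per-layer binary masks over convolutional output channels, and fitness is measured by output agreement with the unpruned model on random probe inputs, since no original training data is available.

```python
# Illustrative sketch only. Assumptions (not from the paper): binary
# channel-mask encoding, random-probe output agreement as the data-free
# fitness proxy, and simple uniform crossover with bit-flip mutation.
import copy
import random
import torch
import torch.nn as nn

def apply_mask(model, mask):
    """Return a copy of the model with masked Conv2d output channels zeroed."""
    pruned = copy.deepcopy(model)
    convs = [m for m in pruned.modules() if isinstance(m, nn.Conv2d)]
    for conv, channel_bits in zip(convs, mask):
        keep = torch.tensor(channel_bits, dtype=conv.weight.dtype)
        conv.weight.data *= keep.view(-1, 1, 1, 1)  # zero pruned channels
    return pruned

def fitness(reference, pruned, probe):
    """Data-free proxy: negative distance to the unpruned model's outputs."""
    with torch.no_grad():
        return -torch.norm(reference - pruned(probe)).item()

def evolve(model, input_shape, pop_size=20, generations=50, sparsity=0.5):
    model.eval()
    convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
    sizes = [c.out_channels for c in convs]
    # Random initial population of per-layer binary channel masks.
    pop = [[[0 if random.random() < sparsity else 1 for _ in range(n)]
            for n in sizes] for _ in range(pop_size)]
    probe = torch.randn(8, *input_shape)  # random probes, no real data
    with torch.no_grad():
        reference = model(probe)
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(reference, apply_mask(model, m), probe),
                 reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            # Uniform crossover of two parents, then 1% bit-flip mutation.
            child = [[ga if random.random() < 0.5 else gb
                      for ga, gb in zip(la, lb)] for la, lb in zip(a, b)]
            child = [[1 - g if random.random() < 0.01 else g for g in layer]
                     for layer in child]
            children.append(child)
        pop = elite + children
    return pop[0]  # best mask found
```

Under these assumptions, the search needs only the pre-trained model and random inputs: a hypothetical call such as `evolve(torchvision_resnet, (3, 224, 224))` would return a per-layer channel mask approximating the original network's behavior with fewer active channels.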