Jihene Tmamna, Emna Ben Ayed, Rahma Fourati, Mandar Gogate, Tughrul Arslan, Amir Hussain, Mounir Ben Ayed
Journal: Cognitive Computation (JCR Q2, Computer Science, Artificial Intelligence; Impact Factor 4.3). Published: 2024-07-05. DOI: 10.1007/s12559-024-10313-0
Pruning Deep Neural Networks for Green Energy-Efficient Models: A Survey
Over the past few years, larger and deeper neural network models, particularly convolutional neural networks (CNNs), have consistently advanced state-of-the-art performance across various disciplines. Yet, the computational demands of these models have escalated exponentially. Intensive computation not only hinders research inclusiveness and deployment on resource-constrained devices, such as Edge Internet of Things (IoT) devices, but also results in a substantial carbon footprint. Green deep learning has emerged as a research field that emphasizes energy consumption and carbon emissions during model training and inference, aiming to innovate with lightweight and energy-efficient neural networks. Various techniques are available to achieve this goal. Studies show that conventional deep models often contain redundant parameters whose removal does not alter outcomes significantly, underpinning the theoretical basis for model pruning. Consequently, this timely review paper first seeks to systematically summarize recent breakthroughs in CNN pruning methods, offering the necessary background knowledge for researchers in this interdisciplinary domain. Second, it spotlights the challenges of current model pruning methods to inform future avenues of research. Additionally, the survey highlights the pressing need for innovative metrics that effectively balance diverse pruning objectives. Lastly, it investigates pruning techniques oriented towards sophisticated deep learning models, including hybrid feedforward CNNs and long short-term memory (LSTM) recurrent neural networks, an area ripe for exploration within green deep learning research.
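To illustrate the idea that redundant, low-impact parameters can be removed, here is a minimal sketch of magnitude-based pruning, one of the classic criteria covered in the pruning literature. This is an illustrative example only, not the specific method proposed in the survey; the function name and the 50% sparsity target are assumptions for demonstration.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest
    absolute magnitude, on the premise that they contribute least to
    the model's output."""
    n_prune = int(len(weights) * sparsity)
    # Rank weight indices by absolute magnitude, smallest first.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0  # "removing" a weight = setting it to zero
    return pruned


# Toy example: prune half of a small weight vector.
w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.1]
print(magnitude_prune(w, 0.5))  # → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice, structured variants of this idea prune whole channels or filters (so the resulting network is genuinely smaller and faster), and frameworks such as PyTorch ship utilities for applying such masks to real layers.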
About the journal:
Cognitive Computation is an international, peer-reviewed, interdisciplinary journal that publishes cutting-edge articles describing original basic and applied work involving biologically-inspired computational accounts of all aspects of natural and artificial cognitive systems. It provides a new platform for the dissemination of research, current practices and future trends in the emerging discipline of cognitive computation that bridges the gap between life sciences, social sciences, engineering, physical and mathematical sciences, and humanities.