AI Approach for Minimizing The Energy Consumption of Servers Using Deep-Q-Learning
A. Kaulage, Shraddha Shaha, Tanaya Naik, Khushi Nikumbh, Vedant Jagtap
2023 IEEE 8th International Conference for Convergence in Technology (I2CT), published 2023-04-07. DOI: 10.1109/I2CT57861.2023.10126481
This paper focuses on minimizing the energy consumption of servers in data centers. A server's energy consumption is affected by numerous factors, such as the number of connected devices, the workload being processed, and the energy efficiency of its components. High energy consumption is a serious concern for several reasons: among them, it can reduce server reliability, because the high temperatures it generates can lead to hardware failure and other technical issues. Reducing the energy consumption of servers is therefore important for improving the cost-effectiveness, sustainability, scalability, and reliability of data center operations. Deep Q-Learning (DQL), a form of reinforcement learning, can be applied to this problem. The basic idea behind DQL is to train an artificial agent, such as a neural network, to make decisions about energy consumption in real time. The agent is trained by repeatedly performing actions in an environment and earning rewards that depend on the energy consumed by each action. Over time, the agent learns which actions are most likely to yield energy savings, and it can then be deployed to make real-time decisions about energy consumption in a server. Experimental results of the proposed research show an average power saving of 66% in the server's energy consumption using Deep Q-Learning (DQL).
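The abstract describes the DQL training loop only in prose. The sketch below illustrates the general idea under stated assumptions: it is not the authors' implementation, and the environment (`ToyServerEnv`), state/action definitions, and hyperparameters are hypothetical stand-ins. It assumes the server state is a vector of utilization readings, the actions are discrete power-management levels, and the reward is the negative energy consumed in a step.

```python
# Minimal Deep Q-Learning sketch for server energy control (illustrative only).
# Assumptions (not from the paper): ToyServerEnv, its state/action space, and the
# reward shaping are hypothetical; refinements such as a target network are omitted.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class ToyServerEnv:
    """Hypothetical stand-in for a server whose energy use depends on the action."""
    def __init__(self, state_dim=4, n_actions=3):
        self.state_dim, self.n_actions = state_dim, n_actions

    def reset(self):
        return torch.rand(self.state_dim)

    def step(self, action):
        # Energy drawn falls with more aggressive power management (higher action index).
        energy = 1.0 - 0.2 * action + 0.1 * random.random()
        next_state = torch.rand(self.state_dim)
        return next_state, -energy  # reward = negative energy consumed


class QNetwork(nn.Module):
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))

    def forward(self, x):
        return self.net(x)


def train(episodes=50, steps=20, gamma=0.99, eps=0.1, batch=32):
    env = ToyServerEnv()
    q_net = QNetwork(env.state_dim, env.n_actions)
    opt = optim.Adam(q_net.parameters(), lr=1e-3)
    replay = deque(maxlen=5000)  # experience replay buffer

    for _ in range(episodes):
        state = env.reset()
        for _ in range(steps):
            # Epsilon-greedy action selection over the learned Q-values.
            if random.random() < eps:
                action = random.randrange(env.n_actions)
            else:
                action = q_net(state).argmax().item()
            next_state, reward = env.step(action)
            replay.append((state, action, reward, next_state))
            state = next_state

            if len(replay) >= batch:
                s, a, r, s2 = zip(*random.sample(replay, batch))
                s, s2 = torch.stack(s), torch.stack(s2)
                a, r = torch.tensor(a), torch.tensor(r)
                q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
                # Bellman target: reward plus discounted best Q-value of next state.
                target = r + gamma * q_net(s2).max(dim=1).values.detach()
                loss = nn.functional.mse_loss(q, target)
                opt.zero_grad()
                loss.backward()
                opt.step()
    return q_net


if __name__ == "__main__":
    train()
```

In a real deployment, the reward would be derived from measured server power draw rather than the toy model above, and the trained network would be queried online to pick the power-management action for the current server state.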