Green optimization for micro data centers: Task scheduling for a combined energy consumption strategy
Yuanyuan Hu, Jing Yang, Xiaoli Ruan, Yuling Chen, Chengjiang Li, Zhaohu Zhang, Wei Zhang
Applied Energy, Volume 393, Article 126031 (published 2025-05-16). DOI: 10.1016/j.apenergy.2025.126031
Citations: 0
Abstract
As micro data centers (MDCs) continue to increase in size, their high energy consumption leads to growing environmental concerns, making it crucial to explore optimization methods that reduce energy consumption. Deep reinforcement learning (DRL) utilizing server energy consumption models can yield a task scheduling scheme for optimizing energy consumption. However, server energy consumption models fail to capture the overall energy consumption fluctuations of MDCs. Moreover, existing scheduling methods lack the adaptability to dynamically adjust policies in response to real-time load and environmental changes. To address these challenges, we propose a novel task scheduling approach using SAC-Discrete and a combined energy consumption model (SAC-EC). This approach employs distributed learning and parallel task assignment across multiple servers using SAC-Discrete, and integrates a combined energy consumption model that includes a server energy consumption model, a cooling energy consumption model, and an adaptive thermal control model to optimize the overall energy consumption of MDCs. For efficient energy cost optimization, SAC-EC employs a dynamic pricing policy that assigns reward values to energy consumption and models the policy update, server resource scheduling, and policy learning processes. The experimental results on real datasets demonstrate that, compared with six mainstream reinforcement learning methods, SAC-EC reduces server energy consumption by 18.44% and cooling energy consumption by 30.68% on average. In addition, SAC-EC is optimized with respect to energy cost, adaptive thermal energy consumption, server room temperature control, and reward values. The code is available at: https://github.com/ybyangjing/SAC-EC.
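To make the abstract's "combined energy consumption" idea concrete, the sketch below shows one plausible shape of such a model: a linear utilization-based server power term, a COP-based cooling term, and a dynamic-pricing reward that penalizes energy cost. All class names, constants, and formulas here are illustrative assumptions, not the paper's actual models (those are in the SAC-EC repository linked above).

```python
from dataclasses import dataclass


@dataclass
class CombinedEnergyModel:
    """Illustrative combined energy model: server power + cooling power.

    The constants below are placeholder assumptions for a single server.
    """
    p_idle: float = 100.0   # idle power draw in watts (assumed)
    p_peak: float = 250.0   # peak power draw in watts (assumed)
    cop: float = 3.0        # cooling coefficient of performance (assumed)

    def server_power(self, utilization: float) -> float:
        # Common linear CPU-utilization power model: idle floor plus
        # a utilization-proportional dynamic component.
        return self.p_idle + (self.p_peak - self.p_idle) * utilization

    def cooling_power(self, server_power: float) -> float:
        # Cooling power approximated as the heat load divided by COP.
        return server_power / self.cop

    def total_energy_wh(self, utilization: float, dt_hours: float) -> float:
        # Combined (server + cooling) energy over a scheduling interval.
        p = self.server_power(utilization)
        return (p + self.cooling_power(p)) * dt_hours


def reward(energy_wh: float, price_per_kwh: float) -> float:
    # Dynamic-pricing reward: the negative energy cost of the interval,
    # so the agent is pushed toward cheaper, lower-consumption schedules.
    return -(energy_wh / 1000.0) * price_per_kwh


model = CombinedEnergyModel()
e = model.total_energy_wh(utilization=0.5, dt_hours=1.0)
print(f"combined energy: {e:.1f} Wh, reward: {reward(e, 0.12):.4f}")
```

In a full SAC-Discrete setup, this reward would be computed per scheduling step for each server the agent assigns a task to; the adaptive thermal control term mentioned in the abstract would add a room-temperature-dependent component that this sketch omits.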
Journal overview:
Applied Energy serves as a platform for sharing innovations, research, development, and demonstrations in energy conversion, conservation, and sustainable energy systems. The journal covers topics such as optimal energy resource use, environmental pollutant mitigation, and energy process analysis. It welcomes original papers, review articles, technical notes, and letters to the editor. Authors are encouraged to submit manuscripts that bridge the gap between research, development, and implementation. The journal addresses a wide spectrum of topics, including fossil and renewable energy technologies, energy economics, and environmental impacts. Applied Energy also explores modeling and forecasting, conservation strategies, and the social and economic implications of energy policies, including climate change mitigation. It is complemented by the open-access journal Advances in Applied Energy.