Multi-agent deep reinforcement learning based demand response and energy management for heavy industries with discrete manufacturing systems
Atit Bashyal, Tina Boroukhian, Pakin Veerachanchai, Myanganbayar Naransukh, Hendro Wicaksono
{"title":"基于多智能体深度强化学习的离散制造重工业需求响应与能量管理","authors":"Atit Bashyal, Tina Boroukhian, Pakin Veerachanchai, Myanganbayar Naransukh, Hendro Wicaksono","doi":"10.1016/j.apenergy.2025.125990","DOIUrl":null,"url":null,"abstract":"<div><div>Energy-centric decarbonization of heavy industries, such as steel and cement, necessitates their participation in integrating Renewable Energy Sources (RES) and effective Demand Response (DR) programs. This situation has created the opportunities to research control algorithms in diverse DR scenarios. Further, the industrial sector’s unique challenges, including the diversity of operations and the need for uninterrupted production, bring unique challenges in designing and implementing control algorithms. Reinforcement learning (RL) methods are practical solutions to the unique challenges faced by the industrial sector. Nevertheless, research in RL for industrial demand response has not yet achieved the level of standardization seen in other areas of RL research, hindering broader progress. To propel the research progress, we propose a multi-agent reinforcement learning (MARL)-based energy management system designed to optimize energy consumption in energy-intensive industrial settings by leveraging dynamic pricing DR schemes. The study highlights the creation of a MARL environment and addresses these challenges by designing a general framework that allows researchers to replicate and implement MARL environments for industrial sectors. The proposed framework incorporates a Partially Observable Markov Decision Process (POMDP) to model energy consumption and production processes while introducing buffer storage constraints and a flexible reward function that balances production efficiency and cost reduction. The paper evaluates the framework through experimental validation within a steel powder manufacturing facility. The experimental results validate our framework and also demonstrate the effectiveness of the MARL-based energy management system.</div></div>","PeriodicalId":246,"journal":{"name":"Applied Energy","volume":"392 ","pages":"Article 125990"},"PeriodicalIF":10.1000,"publicationDate":"2025-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-agent deep reinforcement learning based demand response and energy management for heavy industries with discrete manufacturing systems\",\"authors\":\"Atit Bashyal, Tina Boroukhian, Pakin Veerachanchai, Myanganbayar Naransukh, Hendro Wicaksono\",\"doi\":\"10.1016/j.apenergy.2025.125990\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Energy-centric decarbonization of heavy industries, such as steel and cement, necessitates their participation in integrating Renewable Energy Sources (RES) and effective Demand Response (DR) programs. This situation has created the opportunities to research control algorithms in diverse DR scenarios. Further, the industrial sector’s unique challenges, including the diversity of operations and the need for uninterrupted production, bring unique challenges in designing and implementing control algorithms. Reinforcement learning (RL) methods are practical solutions to the unique challenges faced by the industrial sector. Nevertheless, research in RL for industrial demand response has not yet achieved the level of standardization seen in other areas of RL research, hindering broader progress. 
To propel the research progress, we propose a multi-agent reinforcement learning (MARL)-based energy management system designed to optimize energy consumption in energy-intensive industrial settings by leveraging dynamic pricing DR schemes. The study highlights the creation of a MARL environment and addresses these challenges by designing a general framework that allows researchers to replicate and implement MARL environments for industrial sectors. The proposed framework incorporates a Partially Observable Markov Decision Process (POMDP) to model energy consumption and production processes while introducing buffer storage constraints and a flexible reward function that balances production efficiency and cost reduction. The paper evaluates the framework through experimental validation within a steel powder manufacturing facility. The experimental results validate our framework and also demonstrate the effectiveness of the MARL-based energy management system.</div></div>\",\"PeriodicalId\":246,\"journal\":{\"name\":\"Applied Energy\",\"volume\":\"392 \",\"pages\":\"Article 125990\"},\"PeriodicalIF\":10.1000,\"publicationDate\":\"2025-04-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Energy\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0306261925007202\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENERGY & FUELS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Energy","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0306261925007202","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Multi-agent deep reinforcement learning based demand response and energy management for heavy industries with discrete manufacturing systems
Energy-centric decarbonization of heavy industries, such as steel and cement, necessitates their participation in integrating Renewable Energy Sources (RES) and effective Demand Response (DR) programs. This situation has created opportunities to research control algorithms in diverse DR scenarios. However, the industrial sector's characteristics, including the diversity of operations and the need for uninterrupted production, pose unique challenges in designing and implementing such control algorithms. Reinforcement learning (RL) methods are practical solutions to these challenges. Nevertheless, research in RL for industrial demand response has not yet reached the level of standardization seen in other areas of RL research, hindering broader progress. To propel research progress, we propose a multi-agent reinforcement learning (MARL)-based energy management system designed to optimize energy consumption in energy-intensive industrial settings by leveraging dynamic-pricing DR schemes. The study highlights the creation of a MARL environment and addresses these challenges by designing a general framework that allows researchers to replicate and implement MARL environments for industrial sectors. The proposed framework incorporates a Partially Observable Markov Decision Process (POMDP) to model energy consumption and production processes while introducing buffer storage constraints and a flexible reward function that balances production efficiency against cost reduction. The paper evaluates the framework through experimental validation within a steel powder manufacturing facility. The experimental results validate the framework and demonstrate the effectiveness of the MARL-based energy management system.
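To make the abstract's ingredients concrete, the following is a minimal, self-contained Python sketch of what such a MARL environment could look like: each agent receives only a partial observation (its own buffer level plus the current tariff), buffer capacity constrains production, and the reward trades throughput against energy cost under a dynamic price. All names and numbers here (MultiMachineDREnv, PRICE_SCHEDULE, the reward weights 1.0 and 0.1) are illustrative assumptions for a Gym-style reset/step loop, not the authors' actual implementation.

```python
# Hypothetical sketch of a multi-agent industrial DR environment in the
# spirit of the paper's framework. All names and parameters are assumed
# for illustration, not taken from the authors' code.
from dataclasses import dataclass
from typing import List, Tuple

# Assumed dynamic tariff (EUR/kWh), one price per decision step.
PRICE_SCHEDULE = [0.08, 0.07, 0.06, 0.09, 0.14, 0.18,
                  0.22, 0.19, 0.15, 0.12, 0.10, 0.09]

@dataclass
class Machine:
    power_kw: float        # consumption when running (assumed)
    output_per_step: int   # units produced per active step (assumed)
    buffer: int = 0        # downstream buffer fill level

class MultiMachineDREnv:
    """Toy partially observable multi-agent environment: each agent sees
    only its own machine's buffer and the current price, never the full line."""

    def __init__(self, n_machines: int = 3, buffer_cap: int = 20,
                 horizon: int = len(PRICE_SCHEDULE)):
        self.machines = [Machine(power_kw=50.0 + 10 * i, output_per_step=2)
                         for i in range(n_machines)]
        self.buffer_cap = buffer_cap
        self.horizon = horizon
        self.t = 0

    def reset(self) -> List[Tuple[float, int, int]]:
        self.t = 0
        for m in self.machines:
            m.buffer = 0
        return self._observations()

    def _observations(self) -> List[Tuple[float, int, int]]:
        # Partial observation per agent: (current price, own buffer, time step).
        price = PRICE_SCHEDULE[self.t]
        return [(price, m.buffer, self.t) for m in self.machines]

    def step(self, actions: List[int]):
        """actions[i] in {0, 1}: idle or run machine i for this step."""
        price = PRICE_SCHEDULE[self.t]
        rewards = []
        for m, a in zip(self.machines, actions):
            produced = 0
            if a == 1 and m.buffer < self.buffer_cap:   # buffer storage constraint
                produced = min(m.output_per_step, self.buffer_cap - m.buffer)
                m.buffer += produced
            energy_cost = a * m.power_kw * price        # cost of running this step
            # Flexible reward: weighted trade-off between throughput and cost.
            rewards.append(1.0 * produced - 0.1 * energy_cost)
        self.t += 1
        done = self.t >= self.horizon
        obs = self._observations() if not done else []
        return obs, rewards, done

# Minimal rollout with a naive price-threshold policy to exercise the env.
env = MultiMachineDREnv()
obs = env.reset()
returns = [0.0] * len(env.machines)
done = False
while not done:
    actions = [1 if price < 0.15 else 0 for (price, _, _) in obs]
    obs, rewards, done = env.step(actions)
    returns = [r + x for r, x in zip(returns, rewards)]
print("episode returns per agent:", [round(x, 2) for x in returns])
```

In the paper's framework, each machine would instead be controlled by a learning agent trained with a MARL algorithm against this kind of POMDP; the fixed price-threshold policy above merely demonstrates the environment loop.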
Journal introduction:
Applied Energy serves as a platform for sharing innovations, research, development, and demonstrations in energy conversion, conservation, and sustainable energy systems. The journal covers topics such as optimal energy resource use, environmental pollutant mitigation, and energy process analysis. It welcomes original papers, review articles, technical notes, and letters to the editor. Authors are encouraged to submit manuscripts that bridge the gap between research, development, and implementation. The journal addresses a wide spectrum of topics, including fossil and renewable energy technologies, energy economics, and environmental impacts. Applied Energy also explores modeling and forecasting, conservation strategies, and the social and economic implications of energy policies, including climate change mitigation. It is complemented by the open-access journal Advances in Applied Energy.