{"title":"基于深度强化学习的社区点对点能源交易","authors":"Yiqun Wang, Qingyu Yang, Donghe Li","doi":"10.1063/5.0172713","DOIUrl":null,"url":null,"abstract":"With the massive access to distributed energy resources, an increasing number of users have transformed into prosumers with the functions of producing, storing, and consuming electric energy. Peer-to-peer (P2P) energy trading, as a new way to allow direct energy transactions between prosumers, is becoming increasingly widespread. How to determine the trading strategy of prosumers participating in P2P energy trading while the strategy can satisfy multiple optimization objectives simultaneously is a crucial problem to be solved. To this end, this paper introduces the demand response mechanism and applies the dissatisfaction function to represent the electricity consumption of prosumers. The mid-market rate price is adopted to attract more prosumers to participate in P2P energy trading. The P2P energy trading process among multiple prosumers in the community is constructed as a Markov decision process. We design the method of deep reinforcement learning (DRL) to solve the optimal trading policy of prosumers. DRL, by engaging in continual interactions with the environment, autonomously learns the optimal strategies. Additionally, the deep deterministic policy gradient algorithm is well-suited for handling the continuous and intricate decision problems that arise in the P2P energy trading market. Through the judicious construction of a reinforcement learning environment, this paper achieves multi-objective collaborative optimization. Simulation results show that our proposed algorithm and model reduce costs by 16.5%, compared to the transaction between prosumers and grid, and can effectively decrease the dependence of prosumers on the main grid.","PeriodicalId":16953,"journal":{"name":"Journal of Renewable and Sustainable Energy","volume":"24 1","pages":""},"PeriodicalIF":1.9000,"publicationDate":"2023-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Peer-to-peer energy trading in a community based on deep reinforcement learning\",\"authors\":\"Yiqun Wang, Qingyu Yang, Donghe Li\",\"doi\":\"10.1063/5.0172713\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the massive access to distributed energy resources, an increasing number of users have transformed into prosumers with the functions of producing, storing, and consuming electric energy. Peer-to-peer (P2P) energy trading, as a new way to allow direct energy transactions between prosumers, is becoming increasingly widespread. How to determine the trading strategy of prosumers participating in P2P energy trading while the strategy can satisfy multiple optimization objectives simultaneously is a crucial problem to be solved. To this end, this paper introduces the demand response mechanism and applies the dissatisfaction function to represent the electricity consumption of prosumers. The mid-market rate price is adopted to attract more prosumers to participate in P2P energy trading. The P2P energy trading process among multiple prosumers in the community is constructed as a Markov decision process. We design the method of deep reinforcement learning (DRL) to solve the optimal trading policy of prosumers. DRL, by engaging in continual interactions with the environment, autonomously learns the optimal strategies. 
Additionally, the deep deterministic policy gradient algorithm is well-suited for handling the continuous and intricate decision problems that arise in the P2P energy trading market. Through the judicious construction of a reinforcement learning environment, this paper achieves multi-objective collaborative optimization. Simulation results show that our proposed algorithm and model reduce costs by 16.5%, compared to the transaction between prosumers and grid, and can effectively decrease the dependence of prosumers on the main grid.\",\"PeriodicalId\":16953,\"journal\":{\"name\":\"Journal of Renewable and Sustainable Energy\",\"volume\":\"24 1\",\"pages\":\"\"},\"PeriodicalIF\":1.9000,\"publicationDate\":\"2023-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Renewable and Sustainable Energy\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1063/5.0172713\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENERGY & FUELS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Renewable and Sustainable Energy","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1063/5.0172713","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Peer-to-peer energy trading in a community based on deep reinforcement learning
With the large-scale integration of distributed energy resources, an increasing number of users have become prosumers that can produce, store, and consume electric energy. Peer-to-peer (P2P) energy trading, a new paradigm that allows direct energy transactions between prosumers, is becoming increasingly widespread. Determining a trading strategy for prosumers participating in P2P energy trading that satisfies multiple optimization objectives simultaneously is a crucial open problem. To this end, this paper introduces a demand response mechanism and uses a dissatisfaction function to model the electricity consumption of prosumers. The mid-market rate price is adopted to attract more prosumers to participate in P2P energy trading. The P2P energy trading process among multiple prosumers in a community is formulated as a Markov decision process, and a deep reinforcement learning (DRL) method is designed to solve for the optimal trading policy of prosumers. By interacting continually with the environment, DRL autonomously learns the optimal strategies. Furthermore, the deep deterministic policy gradient algorithm is well suited to the continuous, intricate decision problems that arise in the P2P energy trading market. Through the judicious construction of the reinforcement learning environment, this paper achieves multi-objective collaborative optimization. Simulation results show that the proposed algorithm and model reduce costs by 16.5% compared with trading between prosumers and the grid, and can effectively decrease the dependence of prosumers on the main grid.
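The abstract does not give the exact formulations of the mid-market rate price or the dissatisfaction function, so the following is only a minimal Python sketch of one common way these two ingredients are expressed. The function names, the quadratic dissatisfaction form, the parameters alpha, beta, and lam, and the tariff values are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model) of:
# (1) mid-market rate (MMR) internal prices for community P2P trading, and
# (2) a quadratic dissatisfaction function for the demand response model.
# All parameter names and values below are assumptions for illustration.

def mid_market_rate(total_gen_kwh, total_dem_kwh, grid_buy_price, grid_sell_price):
    """Return (p2p_buy_price, p2p_sell_price) under a common MMR scheme.

    grid_buy_price  - retail price prosumers pay the utility ($/kWh)
    grid_sell_price - feed-in price prosumers receive from the utility ($/kWh)
    """
    p_mid = 0.5 * (grid_buy_price + grid_sell_price)
    if total_gen_kwh == total_dem_kwh:
        return p_mid, p_mid
    if total_gen_kwh < total_dem_kwh:
        # Community deficit: the shortfall is bought from the grid at retail price,
        # which raises the effective P2P buying price; sellers still receive p_mid.
        p_buy = (p_mid * total_gen_kwh
                 + grid_buy_price * (total_dem_kwh - total_gen_kwh)) / total_dem_kwh
        return p_buy, p_mid
    # Community surplus: excess is sold to the grid at the feed-in price,
    # which lowers the effective P2P selling price; buyers still pay p_mid.
    p_sell = (p_mid * total_dem_kwh
              + grid_sell_price * (total_gen_kwh - total_dem_kwh)) / total_gen_kwh
    return p_mid, p_sell


def dissatisfaction(consumed_kwh, desired_kwh, alpha=0.5, beta=0.1):
    """Quadratic dissatisfaction for curtailing load below the desired level."""
    shortfall = max(desired_kwh - consumed_kwh, 0.0)
    return 0.5 * alpha * shortfall ** 2 + beta * shortfall


def prosumer_reward(consumed_kwh, desired_kwh, generated_kwh,
                    p2p_buy, p2p_sell, lam=1.0):
    """Negative trading cost minus weighted dissatisfaction: one possible
    scalarisation of the multi-objective reward hinted at in the abstract."""
    net = consumed_kwh - generated_kwh          # >0: buys energy, <0: sells energy
    cost = net * p2p_buy if net > 0 else net * p2p_sell
    return -cost - lam * dissatisfaction(consumed_kwh, desired_kwh)


if __name__ == "__main__":
    # Toy example: the community generates 40 kWh against 60 kWh of demand.
    p_buy, p_sell = mid_market_rate(40.0, 60.0,
                                    grid_buy_price=0.20, grid_sell_price=0.08)
    print(f"P2P buy price:  {p_buy:.3f} $/kWh")
    print(f"P2P sell price: {p_sell:.3f} $/kWh")
    print(f"Reward of one prosumer: {prosumer_reward(12.0, 15.0, 5.0, p_buy, p_sell):.3f}")
```

Under this scheme the internal P2P prices always sit between the feed-in and retail tariffs, which is what makes participation attractive to both buying and selling prosumers; a reward of roughly this shape is what the DRL agent would maximize at each step of the Markov decision process.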
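The abstract names the deep deterministic policy gradient (DDPG) algorithm as the learner for the continuous trading actions. Below is a minimal, self-contained PyTorch sketch of a generic DDPG update, included only to illustrate the algorithm's structure; the state layout, action meaning, network sizes, and hyper-parameters are assumptions and do not reproduce the paper's implementation.

```python
import torch
import torch.nn as nn

STATE_DIM = 4    # e.g. [demand, generation, storage level, time of day]  (assumed)
ACTION_DIM = 1   # e.g. energy offered to (+) or requested from (-) the P2P market

class Actor(nn.Module):
    """Deterministic policy: state -> bounded continuous action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Tanh(),   # action scaled to [-1, 1]
        )
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    """Action-value function Q(s, a)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

def ddpg_update(actor, critic, target_actor, target_critic,
                actor_opt, critic_opt, batch, gamma=0.99, tau=0.005):
    """One DDPG step on a batch (s, a, r, s2, done); r and done have shape (B, 1)."""
    s, a, r, s2, done = batch

    # Critic: regress Q(s, a) toward the bootstrapped target.
    with torch.no_grad():
        q_target = r + gamma * (1.0 - done) * target_critic(s2, target_actor(s2))
    critic_loss = nn.functional.mse_loss(critic(s, a), q_target)
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()

    # Actor: ascend the critic's estimate of the current policy's value.
    actor_loss = -critic(s, actor(s)).mean()
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()

    # Soft (Polyak) update of the target networks.
    for tgt, src in ((target_actor, actor), (target_critic, critic)):
        for p_t, p in zip(tgt.parameters(), src.parameters()):
            p_t.data.mul_(1.0 - tau).add_(tau * p.data)

if __name__ == "__main__":
    actor, critic = Actor(), Critic()
    t_actor, t_critic = Actor(), Critic()
    t_actor.load_state_dict(actor.state_dict())
    t_critic.load_state_dict(critic.state_dict())
    a_opt = torch.optim.Adam(actor.parameters(), lr=1e-3)
    c_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
    # Random batch as a smoke test; in practice it is sampled from a replay buffer
    # filled by interacting with the P2P trading environment.
    B = 32
    batch = (torch.randn(B, STATE_DIM), torch.rand(B, ACTION_DIM) * 2 - 1,
             torch.randn(B, 1), torch.randn(B, STATE_DIM), torch.zeros(B, 1))
    ddpg_update(actor, critic, t_actor, t_critic, a_opt, c_opt, batch)
    print("one DDPG update completed")
```

Because the actor outputs a deterministic, bounded continuous action, DDPG avoids discretising trade quantities, which is why the abstract argues it fits the continuous decision problems of the P2P energy market.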
Journal Introduction:
The Journal of Renewable and Sustainable Energy (JRSE) is an interdisciplinary, peer-reviewed journal covering all areas of renewable and sustainable energy relevant to the physical science and engineering communities. The interdisciplinary approach of the publication ensures that the editors draw from researchers worldwide in a diverse range of fields.
Topics covered include:
Renewable energy economics and policy
Renewable energy resource assessment
Solar energy: photovoltaics, solar thermal energy, solar energy for fuels
Wind energy: wind farms, rotors and blades, on- and offshore wind conditions, aerodynamics, fluid dynamics
Bioenergy: biofuels, biomass conversion, artificial photosynthesis
Distributed energy generation: rooftop PV, distributed fuel cells, distributed wind, micro-hydro power generation
Power distribution & systems modeling: power electronics and controls, smart grid
Energy efficient buildings: smart windows, PV, wind, power management
Energy conversion: flexoelectric, piezoelectric, thermoelectric, other technologies
Energy storage: batteries, supercapacitors, hydrogen storage, other fuels
Fuel cells: proton exchange membrane cells, solid oxide cells, hybrid fuel cells, other
Marine and hydroelectric energy: dams, tides, waves, other
Transportation: alternative vehicle technologies, plug-in technologies, other
Geothermal energy