{"title":"基于深度强化学习的电动汽车需求响应建筑能源管理","authors":"Daeyoung Kang, Seunghyun Yoon, Hyuk-Soon Lim","doi":"10.1109/ICAIIC57133.2023.10066975","DOIUrl":null,"url":null,"abstract":"In recent years, stability issues of power grids have become critical with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the supply of a power grid to prevent the power grid's instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the amount of time used for discharging EVs is also taken into consideration in DQN's reward function. We perform the simulations to compare the proposed scheme with a random selection scheme and a greedy scheme to recruit the nearest EVs until the DR request is fulfilled. 
The simulation result indicates that the proposed scheme succeeds to balance the building cost and the EV waiting time performance at the EV stations.","PeriodicalId":105769,"journal":{"name":"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep Reinforcement Learning-based Building Energy Management using Electric Vehicles for Demand Response\",\"authors\":\"Daeyoung Kang, Seunghyun Yoon, Hyuk-Soon Lim\",\"doi\":\"10.1109/ICAIIC57133.2023.10066975\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, stability issues of power grids have become critical with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the supply of a power grid to prevent the power grid's instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the amount of time used for discharging EVs is also taken into consideration in DQN's reward function. We perform the simulations to compare the proposed scheme with a random selection scheme and a greedy scheme to recruit the nearest EVs until the DR request is fulfilled. 
The simulation result indicates that the proposed scheme succeeds to balance the building cost and the EV waiting time performance at the EV stations.\",\"PeriodicalId\":105769,\"journal\":{\"name\":\"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-02-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAIIC57133.2023.10066975\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC57133.2023.10066975","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Deep Reinforcement Learning-based Building Energy Management using Electric Vehicles for Demand Response
In recent years, power grid stability has become a critical issue with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the grid's supply, thereby preventing grid instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the time spent discharging EVs is taken into account in the DQN's reward function. We perform simulations comparing the proposed scheme with a random selection scheme and a greedy scheme that recruits the nearest EVs until the DR request is fulfilled. The simulation results indicate that the proposed scheme successfully balances the building's cost against the EV waiting time at the EV stations.
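The abstract states that the reward function combines DR incentives and penalties, the cost of buying energy from EVs, and the EV discharge time, but does not give the exact formulation. The sketch below shows one plausible shape of such a reward; the function name, arguments, and all rate constants are illustrative assumptions, not the paper's actual parameters.

```python
def dqn_reward(dr_met_kwh, dr_shortfall_kwh, ev_energy_cost,
               discharge_time_h,
               incentive_rate=0.10, penalty_rate=0.20, time_weight=0.05):
    """Hypothetical reward shape for the described DQN agent.

    dr_met_kwh       -- energy reduction delivered toward the DR request
    dr_shortfall_kwh -- unmet portion of the DR request (penalized)
    ev_energy_cost   -- total cost of energy purchased from EVs
    discharge_time_h -- total EV discharge time (longer times are penalized,
                        since they increase waiting time at EV stations)

    All rates are placeholder values for illustration only.
    """
    incentive = incentive_rate * dr_met_kwh      # reward for meeting DR
    penalty = penalty_rate * dr_shortfall_kwh    # penalty for shortfall
    time_cost = time_weight * discharge_time_h   # discourage long discharges
    return incentive - penalty - ev_energy_cost - time_cost
```

Under this shape, the agent is pushed to satisfy the DR request while trading off the price paid to EV owners against how long EVs are tied up discharging.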
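The greedy baseline is described only as recruiting the nearest EVs until the DR request is fulfilled. A minimal sketch of that policy, assuming each EV is represented as a (distance, surplus energy) pair (a hypothetical encoding, not the paper's simulation model):

```python
def greedy_recruit(evs, dr_request_kwh):
    """Recruit the nearest EVs until their combined surplus energy
    covers the DR request.

    evs            -- list of (distance_km, surplus_kwh) tuples (assumed format)
    dr_request_kwh -- energy reduction requested by the DR event

    Returns the recruited EVs in recruitment order (nearest first).
    """
    recruited = []
    covered = 0.0
    for ev in sorted(evs, key=lambda e: e[0]):  # sort by distance, nearest first
        if covered >= dr_request_kwh:           # stop once the request is met
            break
        recruited.append(ev)
        covered += ev[1]
    return recruited
```

Because this policy ignores both the cost of buying energy from each EV and the discharge time, it serves as a natural baseline for the learned DQN policy to improve on.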