Strategic bidding with price-quantity pairs based on deep reinforcement learning considering competitors' behaviors
Fei Hu, Yong Zhao, Yaowen Yu, Changshun Zhang, Yicheng Lian, Cheng Huang, Yuanzheng Li
Applied Energy, vol. 391, published 2025-04-14. DOI: 10.1016/j.apenergy.2025.125874
URL: https://www.sciencedirect.com/science/article/pii/S030626192500604X
Citations: 0
Abstract
In a smart electricity market, self-interested market participants may leverage large amounts of market data to bid strategically and maximize their profits. However, existing studies on strategic bidding often ignore competitors' bidding behaviors and only consider strategic actions on prices, not quantities. To bridge this gap, this paper develops a novel deep reinforcement learning-based framework to model and solve the strategic bidding problem of a producer. To capture competitors' historical bidding behaviors in the market environment, their demand-bid mappings are established with a data-driven method that combines K-medoids clustering and a deep neural network. To make full use of the bidding action space and increase the strategic producer's profit, a bilevel optimization model considering bids in price-quantity pairs is formulated. To efficiently solve the problem while accounting for competitors' bidding behaviors, a twin delayed deep deterministic policy gradient-based algorithm is developed. Case studies on the IEEE 57-bus system show that the proposed framework obtains a 27.37% higher expected profit and a 47.60% lower standard deviation of profit than the existing approach, demonstrating its profitability and robustness under market dynamics. Another case study on the IEEE 118-bus test system achieves a 33.34% increase in expected profit, further validating the advantages in profitability. Together, these cases demonstrate the effectiveness and scalability of the approach on systems of different sizes, as well as its potential application to strategic bidding in smart electricity markets.
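The abstract describes a TD3-style agent whose actions are bids in price-quantity pairs. As a rough illustration only, and not the paper's actual formulation, the Python sketch below shows one common way to map a bounded continuous action vector (such as the tanh output of a TD3 actor) to ordered price-quantity bid pairs; the block count, price bounds, and capacity used here are hypothetical parameters.

```python
import numpy as np

# Hypothetical illustration: translate a continuous RL action, each component
# in [-1, 1], into an ordered set of (price, quantity) bid blocks for one
# strategic producer. Parameter names and values (n_blocks, p_min, p_max,
# capacity) are illustrative, not taken from the paper.

def action_to_bid_pairs(action, n_blocks=3, p_min=10.0, p_max=100.0, capacity=200.0):
    """Convert a 2*n_blocks action vector into (price, quantity) pairs.

    The first n_blocks components set price increments, so block prices are
    non-decreasing (as typical market rules require); the remaining components
    split the unit's capacity across the blocks.
    """
    a = np.clip(np.asarray(action, dtype=float), -1.0, 1.0)
    price_part, qty_part = a[:n_blocks], a[n_blocks:2 * n_blocks]

    # Non-decreasing block prices: cumulative sum of non-negative increments,
    # rescaled into the [p_min, p_max] price band.
    incr = (price_part + 1.0) / 2.0                  # each in [0, 1]
    cum = np.cumsum(incr)
    prices = p_min + (p_max - p_min) * cum / max(cum[-1], 1e-8)

    # Block quantities: softmax-like split of the total capacity.
    w = np.exp(qty_part - qty_part.max())
    quantities = capacity * w / w.sum()

    return list(zip(prices, quantities))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    action = rng.uniform(-1, 1, size=6)              # 3 blocks -> 6 components
    for p, q in action_to_bid_pairs(action):
        print(f"price {p:7.2f} $/MWh, quantity {q:7.2f} MW")
```

Enforcing non-decreasing block prices and a fixed capacity split inside the mapping keeps every sampled action feasible, which is a common design choice when a continuous-control agent must produce bids that comply with market rules.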
Journal overview:
Applied Energy serves as a platform for sharing innovations, research, development, and demonstrations in energy conversion, conservation, and sustainable energy systems. The journal covers topics such as optimal energy resource use, environmental pollutant mitigation, and energy process analysis. It welcomes original papers, review articles, technical notes, and letters to the editor. Authors are encouraged to submit manuscripts that bridge the gap between research, development, and implementation. The journal addresses a wide spectrum of topics, including fossil and renewable energy technologies, energy economics, and environmental impacts. Applied Energy also explores modeling and forecasting, conservation strategies, and the social and economic implications of energy policies, including climate change mitigation. It is complemented by the open-access journal Advances in Applied Energy.