Smart Battery Swapping Control for an Electric Motorcycle Fleet With Peak Time Based on Deep Reinforcement Learning

Authors: YoonShik Park; Seungdon Zu; Chi Xie; Hyunwoo Lee; Taesu Cheong; Qing-Chang Lu; Meng Xu
Journal: IEEE Transactions on Intelligent Transportation Systems, vol. 25, no. 12, pp. 20175-20189
DOI: 10.1109/TITS.2024.3469110
Published: 2024-10-07
URL: https://ieeexplore.ieee.org/document/10706996/
Citations: 0
Abstract
This study proposes a deep Q-network (DQN) model for electric motorcycles (EMs) and a multi-agent reinforcement learning (MARL)-based central control system to support battery swapping decision-making in the delivery business. We aim to minimize expected delivery losses, especially in scenarios where delivery requests are randomly and independently generated for each EM, with fluctuating time distributions and limited battery swapping station (BSS) capacity. Our MARL benefits from a reservation mechanism and a profit-aggregated central system, which greatly reduces the complexity of MARL. Furthermore, to address the inherent non-stationarity of MARL, we propose a decentralized agent-based MARL framework, named Decentralized Agents, Centralized Learning Deep Q Network. This framework, leveraging a tailored learning algorithm, achieves peak-averse behavior, reducing delivery losses. Additionally, we introduce a hybrid approach that combines the resulting DQN algorithm, which determines when to visit a BSS, with a greedy algorithm that decides which BSS to visit. Computational experiments using real-world delivery data are conducted to evaluate the performance of our algorithm. The results demonstrate that the hybrid approach maximizes the overall profit of the entire EM fleet in a challenging environment with limited BSS capacity.
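The hybrid decision rule described in the abstract (a learned DQN deciding *when* to swap, a greedy rule deciding *which* BSS to visit) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the Q-function is stubbed with a battery-level heuristic standing in for the trained network, and all names, state fields, and thresholds are assumptions.

```python
import math

def q_values(state):
    """Stand-in for the trained DQN: returns Q(state, a) for
    a in {0: keep delivering, 1: go swap}. A real model would be a
    neural network; here a simple battery-level heuristic is assumed."""
    battery = state["battery"]  # state-of-charge in [0, 1]
    return [battery, 1.0 - battery]  # low battery favors swapping

def choose_when(state):
    """DQN side of the hybrid policy: act greedily on Q-values."""
    q = q_values(state)
    return max(range(len(q)), key=lambda a: q[a])  # 0 or 1

def choose_which(position, stations):
    """Greedy side of the hybrid policy: pick the nearest BSS
    that still has a charged battery in stock."""
    available = [s for s in stations if s["stock"] > 0]
    if not available:
        return None
    return min(available, key=lambda s: math.dist(position, s["pos"]))

# Usage: an EM at 15% battery decides to swap, then heads to the
# closest stocked station (hypothetical data).
state = {"battery": 0.15}
stations = [
    {"id": "A", "pos": (0.0, 1.0), "stock": 2},
    {"id": "B", "pos": (3.0, 4.0), "stock": 0},  # empty: skipped
    {"id": "C", "pos": (2.0, 2.0), "stock": 1},
]
if choose_when(state) == 1:
    target = choose_which((0.0, 0.0), stations)
    print(target["id"])  # → A (distance 1.0, has stock)
```

In the paper, the "when" decision is learned with the proposed Decentralized Agents, Centralized Learning DQN and coordinated through a reservation mechanism; the sketch above only captures the split between the learned timing decision and the greedy station choice.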
About the journal:
The journal covers the theoretical, experimental, and operational aspects of electrical and electronics engineering and information technologies as applied to Intelligent Transportation Systems (ITS). Intelligent Transportation Systems are defined as those systems utilizing synergistic technologies and systems engineering concepts to develop and improve transportation systems of all kinds. The scope of this interdisciplinary activity includes the promotion, consolidation, and coordination of ITS technical activities among IEEE entities, and providing a focus for cooperative activities, both internally and externally.