{"title":"Deep Reinforcement Learning Based Rendering Service Placement for Cloud Gaming in Mobile Edge Computing Systems","authors":"Yongqiang Gao, Zhihan Li","doi":"10.1109/COMPSAC57700.2023.00073","DOIUrl":null,"url":null,"abstract":"In recent years, the advancement of 4G/5G network technologies and smart devices has led to an increasing demand for smooth, massively multiplayer online games on mobile terminals. These games necessitate high performance and heavy workloads, often consuming substantial amounts of computing and storage resources while imposing strict latency requirements. However, due to the limited resources of end devices, such tasks cannot be efficiently and independently executed. The traditional solution typically involves processing gaming tasks at centralized cloud servers. However, this approach introduces issues such as bandwidth pressure, high latency, load imbalance, and elevated costs. Recently, mobile edge computing (MEC) has gained popularity, and its low-latency capabilities can be integrated with cloud gaming to enhance the gaming performance experience. In this paper, we explore the offloading and placement of rendering services in a scenario that combines MEC with cloud gaming. We propose a model-free algorithm based on deep reinforcement learning to learn the optimal task offloading and placement policy, which optimizes a combination of four metrics: latency, cost, bandwidth, and load balancing. Additionally, the algorithm predicts future bandwidth using LSTM, significantly improving the player's gaming experience and fairness. Simulation results demonstrate that our proposed task placement strategy outperforms state-of-the-art methods applied to similar problems.","PeriodicalId":296288,"journal":{"name":"2023 IEEE 47th Annual Computers, Software, and Applications Conference (COMPSAC)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE 47th Annual Computers, Software, and Applications Conference (COMPSAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/COMPSAC57700.2023.00073","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In recent years, the advancement of 4G/5G network technologies and smart devices has led to increasing demand for smooth, massively multiplayer online games on mobile terminals. These games demand high performance and impose heavy workloads, consuming substantial computing and storage resources while requiring strict latency guarantees. Due to the limited resources of end devices, such tasks cannot be executed efficiently on the devices themselves. The traditional solution is to process gaming tasks at centralized cloud servers, but this approach introduces bandwidth pressure, high latency, load imbalance, and elevated costs. Recently, mobile edge computing (MEC) has gained popularity, and its low-latency capabilities can be combined with cloud gaming to improve the gaming experience. In this paper, we explore the offloading and placement of rendering services in a scenario that combines MEC with cloud gaming. We propose a model-free algorithm based on deep reinforcement learning that learns the optimal task offloading and placement policy by optimizing a combination of four metrics: latency, cost, bandwidth, and load balancing. Additionally, the algorithm predicts future bandwidth using an LSTM, significantly improving the player's gaming experience and fairness. Simulation results demonstrate that our proposed task placement strategy outperforms state-of-the-art methods applied to similar problems.
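The abstract names two mechanisms without giving their formulation: a scalar objective combining latency, cost, bandwidth, and load balancing for the DRL agent, and an LSTM that forecasts future bandwidth. The snippet below is a minimal sketch of how such pieces are commonly wired up; the weights, function names, and the predictor architecture are hypothetical illustrations, not the paper's actual design.

```python
import torch
import torch.nn as nn

# Hypothetical weights for the four objectives named in the abstract
# (latency, cost, bandwidth, load balance); the paper's actual
# weighting scheme is not given in the abstract.
W_LATENCY, W_COST, W_BANDWIDTH, W_BALANCE = 0.4, 0.2, 0.2, 0.2

def placement_reward(latency, cost, bandwidth_use, load_imbalance):
    """Scalar reward for a DRL placement agent: lower latency, cost,
    bandwidth use, and load imbalance all yield a higher reward."""
    return -(W_LATENCY * latency
             + W_COST * cost
             + W_BANDWIDTH * bandwidth_use
             + W_BALANCE * load_imbalance)

class BandwidthLSTM(nn.Module):
    """One-step-ahead bandwidth predictor: a single LSTM layer over a
    window of past bandwidth samples, followed by a linear head."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, history):        # history: (batch, window, 1)
        out, _ = self.lstm(history)    # out: (batch, window, hidden)
        return self.head(out[:, -1])   # predict the next sample

# Usage: predict the next bandwidth sample from a 10-step window.
model = BandwidthLSTM()
window = torch.rand(1, 10, 1)          # synthetic bandwidth history
predicted_bw = model(window)
```

In a setup like this, the predicted bandwidth would feed into the agent's state (or into the bandwidth term of the reward) so that placement decisions anticipate link conditions rather than react to them; the abstract suggests this prediction step is what improves experience and fairness across players.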