{"title":"超密集网络中移动边缘计算的卸载策略研究","authors":"Ruobin Wang, Lijun Li, Meiling Li, Wenhua Gao, Zengshou Dong","doi":"10.1117/12.3032060","DOIUrl":null,"url":null,"abstract":"Mobile Edge Computing (MEC) has emerged as a pivotal technology to meet the increasing demands of mobile applications. However, in high-dynamic MEC environments, load balancing and performance optimization among servers remain challenging. Focusing on server load balancing in task offloading in MEC environment. It constructs a framework for ultra-dense network environments and formulates the problem of computation offloading and resource allocation as a Markov Decision Process (MDP). Subsequently, a learning algorithm based on Proximal Policy Optimization (PPO) is proposed to reduce load standard deviation, achieve load balancing, and simultaneously minimize the system's total delay energy consumption, thereby enhancing the efficiency of the MEC system. Simulation results demonstrate that, compared to random offloading strategies, all-offloading strategies, and the Deep Deterministic Policy Gradient algorithm, the algorithm proposed consistently demonstrates superior performance in load balancing across varying numbers of users and task sizes.","PeriodicalId":342847,"journal":{"name":"International Conference on Algorithms, Microchips and Network Applications","volume":" 12","pages":"1317121 - 1317121-9"},"PeriodicalIF":0.0000,"publicationDate":"2024-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Research on offloading strategies for mobile edge computing in ultradense networks\",\"authors\":\"Ruobin Wang, Lijun Li, Meiling Li, Wenhua Gao, Zengshou Dong\",\"doi\":\"10.1117/12.3032060\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Mobile Edge Computing (MEC) has emerged as a pivotal technology to meet the increasing demands of mobile applications. However, in high-dynamic MEC environments, load balancing and performance optimization among servers remain challenging. Focusing on server load balancing in task offloading in MEC environment. It constructs a framework for ultra-dense network environments and formulates the problem of computation offloading and resource allocation as a Markov Decision Process (MDP). Subsequently, a learning algorithm based on Proximal Policy Optimization (PPO) is proposed to reduce load standard deviation, achieve load balancing, and simultaneously minimize the system's total delay energy consumption, thereby enhancing the efficiency of the MEC system. 
Simulation results demonstrate that, compared to random offloading strategies, all-offloading strategies, and the Deep Deterministic Policy Gradient algorithm, the algorithm proposed consistently demonstrates superior performance in load balancing across varying numbers of users and task sizes.\",\"PeriodicalId\":342847,\"journal\":{\"name\":\"International Conference on Algorithms, Microchips and Network Applications\",\"volume\":\" 12\",\"pages\":\"1317121 - 1317121-9\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-06-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Algorithms, Microchips and Network Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1117/12.3032060\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Algorithms, Microchips and Network Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.3032060","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Research on offloading strategies for mobile edge computing in ultradense networks
Abstract: Mobile Edge Computing (MEC) has emerged as a pivotal technology to meet the increasing demands of mobile applications. However, in highly dynamic MEC environments, load balancing and performance optimization among servers remain challenging. This paper focuses on server load balancing during task offloading in MEC environments. It constructs a framework for ultra-dense network environments and formulates the joint problem of computation offloading and resource allocation as a Markov Decision Process (MDP). A learning algorithm based on Proximal Policy Optimization (PPO) is then proposed to reduce the load standard deviation across servers, achieve load balancing, and simultaneously minimize the system's total delay and energy consumption, thereby improving the efficiency of the MEC system. Simulation results show that, compared with random offloading, all-offloading, and the Deep Deterministic Policy Gradient (DDPG) algorithm, the proposed algorithm consistently achieves superior load-balancing performance across varying numbers of users and task sizes.
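The abstract does not give the exact reward formulation used in the MDP; as a rough illustration of the stated objective (reducing the load standard deviation across servers while minimizing total delay and energy consumption), the minimal sketch below shows one plausible per-step reward for such a formulation. The function name, weights, and numeric values are assumptions for illustration only, not taken from the paper.

```python
import numpy as np

def offloading_reward(server_loads, total_delay, total_energy,
                      w_std=1.0, w_delay=1.0, w_energy=1.0):
    """Negative weighted cost: lower load imbalance, delay, and energy yield a higher reward.

    server_loads: per-MEC-server load after the offloading decision (illustrative units)
    total_delay, total_energy: system-wide delay and energy for the current step
    w_*: weighting coefficients (assumed; the paper does not specify them)
    """
    load_std = np.std(server_loads)  # load standard deviation across MEC servers
    cost = w_std * load_std + w_delay * total_delay + w_energy * total_energy
    return -cost

# Example: three MEC servers with uneven loads after an offloading decision.
reward = offloading_reward(server_loads=[0.9, 0.4, 0.2],
                           total_delay=1.8,   # seconds (illustrative)
                           total_energy=2.5)  # joules (illustrative)
print(reward)
```

In a PPO-based setup of the kind the abstract describes, a reward like this would be returned by the environment at each offloading step, so the learned policy is pushed toward decisions that keep server loads balanced while keeping delay and energy low.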