Rugui Yao;Lipei Liu;Xiaoya Zuo;Lin Yu;Juan Xu;Ye Fan;Wenhua Li
{"title":"基于物联网的智慧城市联合任务卸载和功率控制优化:基于深度强化学习的节能协调","authors":"Rugui Yao;Lipei Liu;Xiaoya Zuo;Lin Yu;Juan Xu;Ye Fan;Wenhua Li","doi":"10.1109/TCE.2025.3577809","DOIUrl":null,"url":null,"abstract":"Mobile Edge Computing (MEC) enhances computational efficiency by reducing data transmission distance, yet optimizing resource allocation and reducing operational cost remain critical challenges as the number of users grows. This paper investigates a multi-user partial computation offloading system under the time-varying channel environment and proposes a novel deep reinforcement learning-based framework to jointly optimize offloading strategy and power control, aiming to minimize the weighted sum of latency and energy consumption. Due to the problem’s multi-parameter, highly coupled, and non-convex characteristics, a deep neural network is firstly utilized to generate offloading ratio vectors, which are then discretized using an improved k-Nearest Neighbor (KNN) algorithm. Based on the quantized offloading actions, the Differential Evolution (DE) algorithm is employed to seek the optimal power control. Finally, the optimal action and state vectors are stored in an experience replay pool for subsequent network training until convergence, producing the optimal solution. Numerical results demonstrate that the proposed improved quantization method avoids the additional action exploration while accelerating convergence. Furthermore, the proposed algorithm significantly lowers user devices latency and energy consumption, outperforming other schemes and providing more efficient edge computing services.","PeriodicalId":13208,"journal":{"name":"IEEE Transactions on Consumer Electronics","volume":"71 2","pages":"2517-2529"},"PeriodicalIF":10.9000,"publicationDate":"2025-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Joint Task Offloading and Power Control Optimization for IoT-Enabled Smart Cities: An Energy-Efficient Coordination via Deep Reinforcement Learning\",\"authors\":\"Rugui Yao;Lipei Liu;Xiaoya Zuo;Lin Yu;Juan Xu;Ye Fan;Wenhua Li\",\"doi\":\"10.1109/TCE.2025.3577809\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Mobile Edge Computing (MEC) enhances computational efficiency by reducing data transmission distance, yet optimizing resource allocation and reducing operational cost remain critical challenges as the number of users grows. This paper investigates a multi-user partial computation offloading system under the time-varying channel environment and proposes a novel deep reinforcement learning-based framework to jointly optimize offloading strategy and power control, aiming to minimize the weighted sum of latency and energy consumption. Due to the problem’s multi-parameter, highly coupled, and non-convex characteristics, a deep neural network is firstly utilized to generate offloading ratio vectors, which are then discretized using an improved k-Nearest Neighbor (KNN) algorithm. Based on the quantized offloading actions, the Differential Evolution (DE) algorithm is employed to seek the optimal power control. Finally, the optimal action and state vectors are stored in an experience replay pool for subsequent network training until convergence, producing the optimal solution. Numerical results demonstrate that the proposed improved quantization method avoids the additional action exploration while accelerating convergence. 
Furthermore, the proposed algorithm significantly lowers user devices latency and energy consumption, outperforming other schemes and providing more efficient edge computing services.\",\"PeriodicalId\":13208,\"journal\":{\"name\":\"IEEE Transactions on Consumer Electronics\",\"volume\":\"71 2\",\"pages\":\"2517-2529\"},\"PeriodicalIF\":10.9000,\"publicationDate\":\"2025-06-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Consumer Electronics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11028897/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Consumer Electronics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/11028897/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Joint Task Offloading and Power Control Optimization for IoT-Enabled Smart Cities: An Energy-Efficient Coordination via Deep Reinforcement Learning
Mobile Edge Computing (MEC) enhances computational efficiency by reducing data transmission distance, yet optimizing resource allocation and reducing operational cost remain critical challenges as the number of users grows. This paper investigates a multi-user partial computation offloading system in a time-varying channel environment and proposes a novel deep reinforcement learning-based framework that jointly optimizes the offloading strategy and power control, aiming to minimize the weighted sum of latency and energy consumption. Because the problem is multi-parameter, highly coupled, and non-convex, a deep neural network is first used to generate offloading ratio vectors, which are then discretized with an improved k-Nearest Neighbor (KNN) algorithm. Based on the quantized offloading actions, the Differential Evolution (DE) algorithm is employed to find the optimal power control. Finally, the optimal action and state vectors are stored in an experience replay pool for subsequent network training until convergence, yielding the optimal solution. Numerical results demonstrate that the proposed improved quantization method avoids additional action exploration while accelerating convergence. Furthermore, the proposed algorithm significantly lowers the latency and energy consumption of user devices, outperforming other schemes and providing more efficient edge computing services.
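To make the pipeline outlined in the abstract concrete, the following is a minimal Python sketch of one iteration of the loop: a DNN generates continuous offloading ratios, a KNN-style step quantizes them into a few candidate actions, DE optimizes the transmit power for each candidate, and the best (state, action) pair is stored in a replay pool used to retrain the DNN. The dimensions, the binary candidate quantization, the channel model, and the placeholder latency-plus-energy cost are illustrative assumptions; the names policy, quantize, de_power_control, and replay are hypothetical and do not reflect the authors' implementation.

```python
# Hypothetical sketch: DNN -> offloading ratios -> KNN-style quantization
# -> DE power control -> experience replay -> DNN training.
import numpy as np
import torch
import torch.nn as nn

N_USERS = 5        # number of user devices (assumed)
K_CANDIDATES = 4   # candidate offloading actions per step (assumed)
rng = np.random.default_rng(0)

# DNN mapping channel gains to an offloading-ratio vector in [0, 1].
policy = nn.Sequential(
    nn.Linear(N_USERS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, N_USERS), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
replay = []  # experience replay pool of (channel state, best quantized action)

def quantize(ratios, k=K_CANDIDATES):
    """KNN-style quantization: the nearest binary action plus k-1 neighbors
    obtained by flipping the entries closest to 0.5 (illustrative only)."""
    base = np.round(ratios)
    candidates = [base.copy()]
    for i in np.argsort(np.abs(ratios - 0.5))[: k - 1]:
        flipped = base.copy()
        flipped[i] = 1.0 - flipped[i]
        candidates.append(flipped)
    return candidates

def cost(action, power, h):
    """Placeholder weighted latency + energy objective (not the paper's model)."""
    latency = np.sum(action / (np.log2(1.0 + h * power) + 1e-9))
    energy = np.sum(power * action)
    return 0.5 * latency + 0.5 * energy

def de_power_control(action, h, pop=20, iters=40, p_max=1.0):
    """Basic Differential Evolution (DE/rand/1/bin) over per-user transmit power."""
    X = rng.uniform(0.01, p_max, size=(pop, N_USERS))
    for _ in range(iters):
        for i in range(pop):
            a, b, c = X[rng.choice(pop, 3, replace=False)]
            trial = np.clip(a + 0.5 * (b - c), 0.01, p_max)
            trial = np.where(rng.random(N_USERS) < 0.9, trial, X[i])
            if cost(action, trial, h) < cost(action, X[i], h):
                X[i] = trial
    best = min(X, key=lambda x: cost(action, x, h))
    return best, cost(action, best, h)

for step in range(200):
    h = rng.exponential(1.0, N_USERS)  # time-varying channel gains (assumed model)
    ratios = policy(torch.tensor(h, dtype=torch.float32)).detach().numpy()
    # Evaluate each quantized candidate with DE-optimized power; keep the cheapest.
    best_action, best_cost = None, np.inf
    for cand in quantize(ratios):
        _, c = de_power_control(cand, h)
        if c < best_cost:
            best_action, best_cost = cand, c
    replay.append((h, best_action))
    # Periodically train the DNN on (state, best action) pairs from the replay pool.
    if len(replay) >= 32:
        idx = rng.choice(len(replay), 32, replace=False)
        states = torch.tensor(np.array([replay[i][0] for i in idx]), dtype=torch.float32)
        targets = torch.tensor(np.array([replay[i][1] for i in idx]), dtype=torch.float32)
        loss = nn.functional.binary_cross_entropy(policy(states), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In the paper, the quantized actions are partial offloading ratios governed by the system's actual latency and energy models; the sketch only fixes the control flow (generate, quantize, optimize power per candidate, store in replay, retrain) with placeholder models in their place.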
Journal Introduction:
The main focus of the IEEE Transactions on Consumer Electronics is the engineering and research aspects of the theory, design, construction, manufacture, or end use of mass-market electronics, systems, software, and services for consumers.