Energy Efficient and Resilient Task Offloading in UAV-Assisted MEC Systems
Mohamed El-Emary; Diala Naboulsi; Razvan Stanica
IEEE Open Journal of Vehicular Technology, vol. 6, pp. 2236-2254, published 2025-08-11
DOI: 10.1109/OJVT.2025.3598154
Full text: https://ieeexplore.ieee.org/document/11122595/
Unmanned aerial vehicle (UAV)-assisted Mobile Edge Computing (MEC) presents a critical trade-off between minimizing user equipment (UE) energy consumption and ensuring high task execution reliability, especially for mission-critical applications. While many frameworks focus on either energy efficiency or resiliency, few address both objectives simultaneously through a structured redundancy model. To bridge this gap, this paper proposes a novel reinforcement learning (RL)-based framework that intelligently distributes computational tasks among UAVs and base stations (BSs). We introduce an $(h+1)$-server permutation strategy that redundantly assigns each task to multiple edge servers, guaranteeing execution continuity even under partial system failures. An RL agent optimizes the offloading process by leveraging network state information to balance energy consumption against system robustness. Extensive simulations demonstrate the superiority of our approach over state-of-the-art benchmarks. Notably, the proposed framework sustains average UE energy levels above 75% under high user densities, exceeds 95% efficiency as the number of base stations grows, and maintains over 90% energy retention when 20 or more UAVs are deployed. Even under high computational loads, it preserves more than 50% of UE energy, outperforming all benchmarks by a significant margin, especially for mid-range task sizes, where it leads by 15-20% in energy efficiency. These findings highlight the potential of our framework to support energy-efficient and failure-resilient services for next-generation wireless networks.
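The core resiliency idea in the abstract, assigning each task to $h+1$ distinct edge servers so that execution survives up to $h$ server failures, can be illustrated with a minimal sketch. This is not the paper's actual algorithm (the paper's RL agent and permutation strategy are not reproduced here); the function names, the random sampling of server subsets, and the failure model are all illustrative assumptions.

```python
import random

def assign_redundant(tasks, servers, h, seed=0):
    """Illustrative sketch: give each task h+1 distinct servers
    (one primary plus h backups), chosen at random here for simplicity."""
    if len(servers) < h + 1:
        raise ValueError("need at least h+1 servers for h-failure tolerance")
    rng = random.Random(seed)
    return {t: rng.sample(servers, h + 1) for t in tasks}

def survives(assignment, failed):
    """A task still executes if at least one of its h+1 assigned
    servers is outside the failed set."""
    return {t: any(s not in failed for s in srvs)
            for t, srvs in assignment.items()}

# Hypothetical deployment: two UAV-mounted servers and two base stations.
tasks = ["t1", "t2", "t3"]
servers = ["uav0", "uav1", "bs0", "bs1"]
plan = assign_redundant(tasks, servers, h=1)      # each task gets 2 servers
status = survives(plan, failed={"uav0"})          # one server goes down
```

With $h=1$, every task holds two server assignments, so any single server failure leaves at least one live copy of each task; raising $h$ trades extra redundant load for tolerance of more simultaneous failures, which is the energy-versus-resilience trade-off the RL agent in the paper is said to balance.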