Energy Efficient and Resilient Task Offloading in UAV-Assisted MEC Systems

IF 4.8 Q1 ENGINEERING, ELECTRICAL & ELECTRONIC
Mohamed El-Emary;Diala Naboulsi;Razvan Stanica
{"title":"Energy Efficient and Resilient Task Offloading in UAV-Assisted MEC Systems","authors":"Mohamed El-Emary;Diala Naboulsi;Razvan Stanica","doi":"10.1109/OJVT.2025.3598154","DOIUrl":null,"url":null,"abstract":"Unmanned aerial vehicle (UAV)-assisted Mobile Edge Computing (MEC) presents a critical trade-off between minimizing user equipment (UE) energy consumption and ensuring high task execution reliability, especially for mission-critical applications.While many frameworks focus on either energy efficiency or resiliency, few address both objectives simultaneously with a structured redundancy model. To bridge this gap, this paper proposes a novel reinforcement learning (RL)-based framework that intelligently distributes computational tasks among UAVs and base stations (BSs). We introduce an <inline-formula><tex-math>$(h+1)$</tex-math></inline-formula>-server permutation strategy that redundantly assigns tasks to multiple edge servers, guaranteeing execution continuity even under partial system failures. An RL agent optimizes the offloading process by leveraging network state information to balance energy consumption with system robustness. Extensive simulations demonstrate the superiority of our approach over state-of-the-art benchmarks. Notably, our proposed framework sustains average UE energy levels above 75% under high user densities, exceeds 95% efficiency with more base stations, and maintains over 90% energy retention when 20 or more UAVs are deployed. Even under high computational loads, it preserves more than 50% of UE energy, outperforming all benchmarks by a significant margin—especially for mid-range task sizes where it leads by over 15–20% in energy efficiency. These findings highlight the potential of our framework to support energy-efficient and failure-resilient services for next-generation wireless networks.","PeriodicalId":34270,"journal":{"name":"IEEE Open Journal of Vehicular Technology","volume":"6 ","pages":"2236-2254"},"PeriodicalIF":4.8000,"publicationDate":"2025-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11122595","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of Vehicular Technology","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11122595/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
引用次数: 0

Abstract

Unmanned aerial vehicle (UAV)-assisted Mobile Edge Computing (MEC) presents a critical trade-off between minimizing user equipment (UE) energy consumption and ensuring high task execution reliability, especially for mission-critical applications. While many frameworks focus on either energy efficiency or resiliency, few address both objectives simultaneously with a structured redundancy model. To bridge this gap, this paper proposes a novel reinforcement learning (RL)-based framework that intelligently distributes computational tasks among UAVs and base stations (BSs). We introduce an $(h+1)$-server permutation strategy that redundantly assigns tasks to multiple edge servers, guaranteeing execution continuity even under partial system failures. An RL agent optimizes the offloading process by leveraging network state information to balance energy consumption with system robustness. Extensive simulations demonstrate the superiority of our approach over state-of-the-art benchmarks. Notably, our proposed framework sustains average UE energy levels above 75% under high user densities, exceeds 95% efficiency with more base stations, and maintains over 90% energy retention when 20 or more UAVs are deployed. Even under high computational loads, it preserves more than 50% of UE energy, outperforming all benchmarks by a significant margin, especially for mid-range task sizes, where it leads by over 15-20% in energy efficiency. These findings highlight the potential of our framework to support energy-efficient and failure-resilient services for next-generation wireless networks.
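The abstract names two mechanisms: an $(h+1)$-server permutation strategy that replicates each task on $h+1$ edge servers so execution survives up to $h$ simultaneous failures, and an RL agent that weighs UE energy consumption against robustness when choosing where to offload. The Python sketch below is only an illustration of these two ideas under stated assumptions; the server names, energy values, and reward weights are hypothetical and do not reproduce the paper's actual system or reward models.

```python
# Illustrative sketch (not the paper's implementation): enumerate (h+1)-server
# redundant assignments and score one offloading decision with a toy reward
# that trades off UE energy against task survival.
import itertools
import random

def redundant_assignments(servers, h):
    """Ordered (h+1)-server permutations: a task mapped to one of these tuples
    keeps running as long as at least one replica survives, i.e. it tolerates
    up to h simultaneous server failures."""
    return list(itertools.permutations(servers, h + 1))

def offloading_reward(ue_energy_spent, replicas_alive, alpha=0.7, beta=0.3):
    """Toy reward combining energy efficiency and resiliency.
    alpha and beta are assumed trade-off weights, not values from the paper."""
    energy_term = -ue_energy_spent                         # spend less UE energy -> higher reward
    resiliency_term = 1.0 if replicas_alive > 0 else -1.0  # did the task survive the failures?
    return alpha * energy_term + beta * resiliency_term

if __name__ == "__main__":
    servers = ["UAV-1", "UAV-2", "UAV-3", "BS-1"]   # hypothetical edge servers
    h = 1                                           # tolerate one server failure
    actions = redundant_assignments(servers, h)     # candidate action space for the agent

    choice = random.choice(actions)                 # stand-in for the RL agent's decision
    failed = {"UAV-2"}                              # assumed failure pattern for this episode
    alive = sum(s not in failed for s in choice)
    reward = offloading_reward(ue_energy_spent=0.2, replicas_alive=alive)
    print("assignment:", choice, "| replicas alive:", alive, "| reward:", round(reward, 3))
```

In a full RL formulation, such a reward would be computed per offloading decision from the observed network state, and the agent would learn which $(h+1)$-server assignment best balances UE energy retention against failure resilience.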
Source journal: IEEE Open Journal of Vehicular Technology
CiteScore: 9.60 · Self-citation rate: 0.00% · Publications: 25 · Review time: 10 weeks