Authors: Aditya Singh; Rajesh M. Hegde
DOI: 10.1109/TAI.2025.3528377
Journal: IEEE Transactions on Artificial Intelligence, vol. 6, no. 7, pp. 1797–1807
Publication date: 2025-01-13 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10838612/
Age-Aware UAV-Aided Energy Harvesting for the Design of Wireless Rechargeable Mobile Networks
The proliferation of Internet of Things (IoT) technology has enhanced connectivity and automation in industries and daily life. The introduction of mobile IoT devices (IoTDs) has further expanded the productivity of these networks beyond conventional cyber–physical systems, resulting in wireless rechargeable mobile networks (WRMNs). However, the inherent limitations of low-powered IoTDs mandate their repetitive charging in dynamic environments. To address this, we propose radio frequency (RF) energy harvesting from unmanned aerial vehicles (UAVs) to supplement the energy needs of IoTDs. Moreover, the IoTDs’ mobility and nonuniform energy utilization are challenging for UAV scheduling in WRMNs. Additionally, maintaining a balance between efficient utilization of UAV energy and IoTD energy harvesting adds complexity to the problem. In this work, we introduce the age of charging (AoC) metric to quantify IoTDs’ repetitive charging and propose an energy-efficient UAV scheduling scheme to maximize UAV energy usage efficiency (EUE) in WRMNs. Moreover, a Markov decision process (MDP) is formulated to address UAV-EUE maximization. Subsequently, a deep reinforcement learning (DRL) scheme is proposed within the deep deterministic policy gradient (DDPG) framework to optimize UAV charging sequences. The DRL agent (UAV) autonomously learns optimal charging strategies considering IoTD mobility patterns, energy demand fluctuations, and IoTD energy-harvesting capabilities. Simulation results demonstrate the superiority of the proposed DRL algorithm over existing DRL-based UAV scheduling schemes, significantly enhancing the operational lifespan of WRMNs and ensuring network stability and continuous functionality. This motivates the adoption of the proposed DRL scheme for developing autonomous, energy-aware, next-generation IoT applications.
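The abstract's age-of-charging (AoC) metric quantifies how long each IoT device has gone without being recharged, which the UAV scheduler then uses to prioritize devices. The paper's exact AoC definition is not reproduced here; the sketch below is a minimal, hypothetical tracker assuming AoC behaves like an age-of-information counter — it grows by one each time step and resets to zero when the UAV charges that device. The class name `AoCTracker` and its methods are illustrative, not from the paper.

```python
# Hypothetical sketch of an age-of-charging (AoC) tracker for IoT devices.
# Assumption (not from the paper): AoC increments by one per time step since
# a device was last charged and resets to zero when a UAV charges it,
# analogous to age-of-information metrics.

class AoCTracker:
    def __init__(self, num_devices: int):
        # One AoC counter per IoT device, all starting freshly charged.
        self.aoc = [0] * num_devices

    def step(self, charged_ids: set) -> None:
        """Advance one time step; reset AoC for devices the UAV charged."""
        for i in range(len(self.aoc)):
            self.aoc[i] = 0 if i in charged_ids else self.aoc[i] + 1

    def most_stale(self) -> int:
        """Index of the device that has waited longest for charging."""
        return max(range(len(self.aoc)), key=self.aoc.__getitem__)


tracker = AoCTracker(num_devices=4)
tracker.step(charged_ids={0})    # UAV charges device 0 this step
tracker.step(charged_ids=set())  # no device charged this step
print(tracker.aoc)               # [1, 2, 2, 2]
print(tracker.most_stale())      # 1
```

A scheduler built on such a counter could feed the AoC vector into the DRL agent's state, so that the DDPG policy learns to trade off flying toward stale devices against conserving UAV energy.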