Deep Reinforcement Learning for AoI-Aware UAV-Assisted Networks With RF Energy Harvesting

Gaurav Kumar Pandey; Devendra Singh Gurjar; Suneel Yadav; Xingwang Li

IEEE Networking Letters, vol. 7, no. 2, pp. 88-92, published 2025-03-13. DOI: 10.1109/LNET.2025.3550931. https://ieeexplore.ieee.org/document/10925373/
Abstract: This letter considers UAV-assisted data collection from energy-constrained Internet of Things (IoT) devices. In each time slot, the UAV either uses radio-frequency (RF) wireless power transfer to charge multiple IoT devices or schedules one IoT device to transmit its sensed data. Using the harvested energy, the IoT devices share their collected data with the UAV according to this schedule. For this setup, we aim to minimize the devices' average Age of Information (AoI) by jointly optimizing the UAV's trajectory and the device schedule, subject to the energy constraints of both the UAV and the IoT devices. Given the network's dynamic nature, the optimization problem is modeled as a Markov Decision Process and solved with the dueling double deep Q-network (D3QN) algorithm. Simulation results show that the proposed framework outperforms the baseline methods in reducing the average AoI of the IoT devices.
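The letter itself does not reproduce code, but the learning machinery it names, a dueling Q-network combined with a double-DQN target (D3QN), can be sketched as below. This is a minimal, hypothetical PyTorch sketch: the state encoding (UAV position, per-device AoI, battery levels), the action set (charge vs. schedule a device, plus a movement choice), the network sizes, and the reward (negative average AoI) are illustrative assumptions, not the paper's exact formulation or hyperparameters.

```python
# Minimal D3QN sketch (assumed setup, not the authors' implementation).
# State: e.g., UAV position + per-device AoI + battery levels (flattened vector).
# Actions: e.g., "move direction + charge" or "move direction + schedule device k".
import torch
import torch.nn as nn

class DuelingQNet(nn.Module):
    """Dueling architecture: shared features, then separate V(s) and A(s,a) streams."""
    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # state-value stream V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # advantage stream A(s, a)

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        h = self.feature(s)
        v, a = self.value(h), self.advantage(h)
        # Dueling aggregation: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)
        return v + a - a.mean(dim=1, keepdim=True)

def d3qn_loss(online: DuelingQNet, target: DuelingQNet,
              s, a, r, s_next, done, gamma: float = 0.99) -> torch.Tensor:
    """Double-DQN target: the online net selects the next action, the target
    net evaluates it. The reward r would be, e.g., the negative (weighted)
    sum of device AoI after the transition, optionally minus an energy penalty."""
    q = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        next_a = online(s_next).argmax(dim=1, keepdim=True)   # action selection
        next_q = target(s_next).gather(1, next_a).squeeze(1)  # action evaluation
        y = r + gamma * (1.0 - done) * next_q
    return nn.functional.smooth_l1_loss(q, y)
```

In a full training loop this loss would be minimized over minibatches drawn from an experience replay buffer, with epsilon-greedy exploration and the target network periodically synchronized to the online network; those standard D3QN ingredients are omitted here for brevity.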