{"title":"RF Aerially Charging Scheduling for UAV Fleet : A Q-Learning Approach","authors":"Jinwei Xu, K. Zhu, Ran Wang","doi":"10.1109/MSN48538.2019.00046","DOIUrl":null,"url":null,"abstract":"In recent years, unmanned aerial vehicles (UAVs) have attracted extensive interests from both academia and industry due to the potential wide applications with universal applicable nature of the deployment. However, currently the bottleneck for UAVs is the limited carried energy resources (e.g. oil box, battery), especially for electric-driven UAVs. For a system consisting of multiple UAVs using batteries, its stability depends on each UAV. Therefore, the lifetime of each UAV is expected to be extended. In this paper, we propose the concept of RF charging aerially for the UAV fleet. Specifically, in order to ensure the stability of the system, wireless charging is considered for enhancing the lifetime of each UAV. However, it may be unbalanced. Accordingly, the issue of charging scheduling arises. The problem is formulated as a Q-Learning problem in this paper. Agent constantly explores and optimizes its scheduling policy. Finally, it can adapt to different UAV distribution situations. We take the energy levels of UAVs as input, which is easy for implementation. We have compared with two other algorithms (RSA and LESA) and compared with the case of no-charging. The results show that comparing with no-charging, the stability of the system can be improved by up to 78%. Compared with RSA and LESA, system stability is increased by up to 30%-40%. 
In addition, our method is more flexible and applicable to fleet than other ways (such as return to base station, landing to power line, ground laser, etc) to supplement energy.","PeriodicalId":368318,"journal":{"name":"2019 15th International Conference on Mobile Ad-Hoc and Sensor Networks (MSN)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 15th International Conference on Mobile Ad-Hoc and Sensor Networks (MSN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MSN48538.2019.00046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
In recent years, unmanned aerial vehicles (UAVs) have attracted extensive interest from both academia and industry due to their wide range of potential applications and the flexibility of their deployment. However, the current bottleneck for UAVs is their limited onboard energy (e.g., fuel tank or battery capacity), especially for electrically driven UAVs. For a system consisting of multiple battery-powered UAVs, stability depends on every individual UAV, so the lifetime of each UAV should be extended. In this paper, we propose the concept of aerial RF charging for a UAV fleet. Specifically, to ensure system stability, wireless charging is used to extend the lifetime of each UAV. However, the charging may be distributed unevenly across the fleet, which raises the problem of charging scheduling. We formulate this problem in a Q-learning framework: the agent continually explores and improves its scheduling policy, and can eventually adapt to different UAV distributions. We take the energy levels of the UAVs as the input state, which makes the scheme easy to implement. We compare our method with two other algorithms (RSA and LESA) and with the no-charging case. The results show that, compared with no charging, system stability can be improved by up to 78%; compared with RSA and LESA, it is increased by 30%-40%. In addition, our method is more flexible and better suited to fleets than other energy-replenishment approaches, such as returning to a base station, landing on a power line, or ground-based laser charging.
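To illustrate the kind of formulation the abstract describes, the following is a minimal tabular Q-learning sketch in which the state is the tuple of discretized UAV energy levels and the action is which UAV receives RF charging at each step. The energy dynamics, reward shaping, and all hyperparameters here are illustrative assumptions, not the paper's actual model; the paper's state transitions, reward, and RSA/LESA baselines are not specified in the abstract.

```python
import random

def simulate(n_uavs=3, levels=5, episodes=500, alpha=0.1, gamma=0.9,
             eps=0.2, seed=0):
    """Toy tabular Q-learning for RF charging scheduling.

    State:  tuple of each UAV's discretized energy level (0 .. levels-1).
    Action: index of the UAV to beam RF energy to this step.
    All dynamics below are hypothetical placeholders for the paper's model.
    """
    rng = random.Random(seed)
    Q = {}  # Q[(state, action)] -> estimated value, default 0.0

    def step(state, action):
        # Hypothetical dynamics: every UAV drains 1 level per step;
        # the charged UAV gains 2 levels (net +1), capped at the maximum.
        nxt = tuple(max(0, min(levels - 1, e - 1 + (2 if i == action else 0)))
                    for i, e in enumerate(state))
        # Reward favors keeping the weakest UAV alive (a proxy for
        # fleet stability); a UAV hitting zero energy ends the episode.
        done = min(nxt) == 0
        reward = min(nxt) - (10 if done else 0)
        return nxt, reward, done

    for _ in range(episodes):
        state = tuple(rng.randrange(1, levels) for _ in range(n_uavs))
        for _ in range(60):
            # Epsilon-greedy exploration over charging targets.
            if rng.random() < eps:
                action = rng.randrange(n_uavs)
            else:
                action = max(range(n_uavs),
                             key=lambda a: Q.get((state, a), 0.0))
            nxt, reward, done = step(state, action)
            best_next = max(Q.get((nxt, a), 0.0) for a in range(n_uavs))
            target = reward if done else reward + gamma * best_next
            q = Q.get((state, action), 0.0)
            Q[(state, action)] = q + alpha * (target - q)
            state = nxt
            if done:
                break
    return Q

if __name__ == "__main__":
    Q = simulate()
    # Greedy policy from a state where UAV 1 is weakest: the learned
    # schedule should direct charging toward the most depleted UAV.
    state = (3, 1, 3)
    print(max(range(3), key=lambda a: Q.get((state, a), 0.0)))
```

Under these toy dynamics, letting any UAV drain to zero incurs a large penalty, so the learned greedy policy tends to charge the most depleted UAV, which mirrors the abstract's stability objective of extending every UAV's lifetime rather than the fleet's average.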