A vision-based system for autonomous vertical landing of unmanned aerial vehicles

J. Wubben, Francisco Fabra, C. Calafate, Tomasz Krzeszowski, J. Márquez-Barja, Juan-Carlos Cano, P. Manzoni
{"title":"A vision-based system for autonomous vertical landing of unmanned aerial vehicles","authors":"J. Wubben, Francisco Fabra, C. Calafate, Tomasz Krzeszowski, J. Márquez-Barja, Juan-Carlos Cano, P. Manzoni","doi":"10.1109/DS-RT47707.2019.8958701","DOIUrl":null,"url":null,"abstract":"Over the last few years, different researchers have been developing protocols and applications in order to land unmanned aerial vehicles (UAVs) autonomously. However, most of the proposed protocols rely on expensive equipment or do not satisfy the high precision needs of some UAV applications, such as package retrieval and delivery. Therefore, in this paper, we present a solution for high precision landing based on the use of ArUco markers. In our solution, a UAV equipped with a camera is able to detect ArUco markers from an altitude of 20 meters. Once the marker is detected, the UAV changes its flight behavior in order to land on the exact position where the marker is located. We evaluated our proposal using our own UAV simulation platform (ArduSim), and validated it using real UAVs. 
The results show an average offset of only 11 centimeters, which vastly improves the landing accuracy compared to the traditional GPS-based landing, that typically deviates from the intended target by 1 to 3 meters.","PeriodicalId":377914,"journal":{"name":"2019 IEEE/ACM 23rd International Symposium on Distributed Simulation and Real Time Applications (DS-RT)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE/ACM 23rd International Symposium on Distributed Simulation and Real Time Applications (DS-RT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DS-RT47707.2019.8958701","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Over the last few years, different researchers have been developing protocols and applications in order to land unmanned aerial vehicles (UAVs) autonomously. However, most of the proposed protocols rely on expensive equipment or do not satisfy the high-precision needs of some UAV applications, such as package retrieval and delivery. Therefore, in this paper, we present a solution for high-precision landing based on the use of ArUco markers. In our solution, a UAV equipped with a camera is able to detect ArUco markers from an altitude of 20 meters. Once the marker is detected, the UAV changes its flight behavior in order to land on the exact position where the marker is located. We evaluated our proposal using our own UAV simulation platform (ArduSim), and validated it using real UAVs. The results show an average offset of only 11 centimeters, which vastly improves the landing accuracy compared to traditional GPS-based landing, which typically deviates from the intended target by 1 to 3 meters.
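The paper itself details the detection and control pipeline; as a rough illustration of the core geometric step it relies on, the sketch below converts a detected marker's pixel position into a lateral offset in meters using a simple pinhole-camera model. This is not the authors' implementation: the function name, the default horizontal field of view, and the assumption of a downward-facing camera with square pixels are all illustrative choices.

```python
import math

def marker_offset_m(marker_px, image_size, altitude_m, hfov_deg=62.2):
    """Estimate the UAV's lateral offset (in meters) from a marker
    detected at pixel coordinates `marker_px` in a downward-facing
    camera image. `hfov_deg` is the camera's horizontal field of view
    (62.2 degrees is a hypothetical default, not from the paper).

    Pinhole model: at altitude h, the image spans a ground width of
    2 * h * tan(hfov / 2), so one pixel covers that width / image width.
    """
    cx, cy = marker_px
    w, h = image_size
    # Meters of ground covered per pixel at the current altitude.
    m_per_px = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0) / w
    # Offset of the marker from the image center, scaled to meters.
    dx = (cx - w / 2.0) * m_per_px
    dy = (cy - h / 2.0) * m_per_px
    return dx, dy
```

In a landing loop, the UAV would feed these offsets to its position controller and descend as they approach zero; for example, with a 90-degree field of view at 10 m altitude, a marker at the right edge of a 640-pixel-wide image corresponds to a 10 m lateral offset.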