Visual odometry with failure detection for the Aegis UAV

Jose F. Roger-Verdeguer, Mikael Mannberg, A. Savvaris
{"title":"Visual odometry with failure detection for the aegis UAV","authors":"Jose F. Roger-Verdeguer, Mikael Mannberg, A. Savvaris","doi":"10.1109/IST.2012.6295501","DOIUrl":null,"url":null,"abstract":"In this paper, a visual odometry system that has been developed to help solve the navigation problem on a UAV is presented. This system is part of the vision-based positioning system that will be used on a new UAV currently in development at Cranfield University. Using images captured from a single camera, the ego-motion of the aircraft is estimated and the relative position is updated every time a new frame is processed. To understand how this is achieved, each of the steps in the implemented visual odometry algorithm is explained in detail, taking a look to the techniques that make it possible. In addition, a failure detection system based on the corner tracking error has also been added to the algorithm to make the system more robust and able to automatically deactivate in poor conditions. Following the description of the visual odometry system, its performance is evaluated using terrain images from Google Earth (GE) and also from a real aerial footage captured on board a Curtis Pitts aircraft. The influence on the error of several factors such as the altitude, the flight speed, the terrain type and the number of corners to be tracked is studied and explained in detail. Finally, the future work to integrate the visual odometry system with a geolocation system to produce the vision-based positioning system of the Aegis UAV is briefly outlined.","PeriodicalId":213330,"journal":{"name":"2012 IEEE International Conference on Imaging Systems and Techniques Proceedings","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE International Conference on Imaging Systems and Techniques Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IST.2012.6295501","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

In this paper, a visual odometry system developed to help solve the navigation problem on a UAV is presented. This system is part of the vision-based positioning system that will be used on a new UAV currently in development at Cranfield University. Using images captured from a single camera, the ego-motion of the aircraft is estimated and the relative position is updated every time a new frame is processed. To explain how this is achieved, each step of the implemented visual odometry algorithm is described in detail, examining the techniques that make it possible. In addition, a failure detection system based on the corner tracking error has been added to the algorithm to make the system more robust and able to automatically deactivate in poor conditions. Following the description of the visual odometry system, its performance is evaluated using terrain images from Google Earth (GE) and real aerial footage captured on board a Curtis Pitts aircraft. The influence of several factors, such as altitude, flight speed, terrain type, and the number of corners to be tracked, on the error is studied and explained in detail. Finally, the future work to integrate the visual odometry system with a geolocation system to produce the vision-based positioning system of the Aegis UAV is briefly outlined.
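To make the frame-to-frame pipeline concrete, the sketch below shows one common way such corner tracking with a tracking-error failure check can be implemented. It is a minimal illustration, assuming OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow; the thresholds (MIN_CORNERS, MAX_TRACK_ERROR) and the median-flow motion estimate are hypothetical choices for illustration, not the authors' implementation.

```python
# Illustrative sketch of frame-to-frame visual odometry with a corner-tracking
# failure check. Assumed components: OpenCV Shi-Tomasi corners, pyramidal
# Lucas-Kanade tracking, and hand-picked thresholds (not from the paper).
import cv2
import numpy as np

MIN_CORNERS = 50        # assumed minimum number of reliably tracked corners
MAX_TRACK_ERROR = 20.0  # assumed per-corner tracking error threshold (pixels)

def estimate_motion(prev_gray, curr_gray):
    """Return (dx, dy) image displacement, or None if tracking quality is poor."""
    # Detect Shi-Tomasi corners in the previous frame.
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
    if corners is None or len(corners) < MIN_CORNERS:
        return None  # failure: not enough trackable corners in the scene

    # Track the corners into the current frame with pyramidal Lucas-Kanade.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                     corners, None)
    good = (status.ravel() == 1) & (err.ravel() < MAX_TRACK_ERROR)
    if good.sum() < MIN_CORNERS:
        return None  # failure: corner tracking error too large, deactivate

    # Estimate the dominant image motion from the surviving corners.
    p0 = corners[good].reshape(-1, 2)
    p1 = next_pts[good].reshape(-1, 2)
    flow = np.median(p1 - p0, axis=0)  # median flow as a simple ego-motion proxy
    return float(flow[0]), float(flow[1])
```

In a full system, this per-frame image displacement would be combined with the camera intrinsics and the aircraft altitude to update the relative position each time a new frame is processed; a None return would flag the odometry output as invalid, mirroring the automatic deactivation in poor conditions described above.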