Integration of 3D and 2D imaging data for assured navigation in unknown environments

Evan Dill, M. U. de Haag
DOI: 10.1109/PLANS.2010.5507244
Published: 2010-05-04, IEEE/ION Position Location and Navigation Symposium (PLANS), proceedings
Citations: 7

Abstract

This paper discusses the development of a novel navigation method that integrates three-dimensional (3D) point cloud data, two-dimensional (2D) digital camera data, and data from an Inertial Measurement Unit (IMU). The target application is accurate position and attitude determination of unmanned aerial vehicles (UAVs) or autonomous ground vehicles (AGVs) in any urban or indoor environment, under any scenario. In some urban and indoor environments, GPS signals are available and usable for these target applications, but this is not always the case. GPS positioning may be unavailable not only due to shadowing, significant signal attenuation, or multipath, but also due to intentional denial or deception. In scenarios where GPS is not a viable or reliable option, a system must be developed that complements GPS and works in the environments where GPS encounters problems. The proposed algorithm demonstrates one possible method that such a complementary system could use. It extracts key features such as planar surfaces, lines, corners, and points from both the 3D (point-cloud) and 2D (intensity) imagery. Consecutive observations of corresponding features in the 3D and 2D image frames are then used to compute estimates of position and orientation changes. The use of 3D image features for positioning suffers from limited feature observability, which degrades position accuracy, while 2D imagery suffers from unknown depth when pose is estimated from consecutive image frames; it is therefore expected that integrating both data sets will alleviate the problems of the individual methods, resulting in a position and attitude determination procedure with a high level of assurance. The IMU is used to set up the tracking gates necessary to perform data association of the features in consecutive frames. Finally, the position and orientation change estimates can be used to correct for and mitigate IMU drift errors.
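The abstract describes computing position and orientation changes from consecutive observations of corresponding features. As an illustrative sketch only (the paper does not specify its estimator; the function name and interface here are assumptions), the standard closed-form Kabsch/Horn least-squares solution for the rigid transform between matched 3D feature points looks like this:

```python
import numpy as np

def pose_change_from_matches(P, Q):
    """Estimate rotation R and translation t such that Q[i] ~ R @ P[i] + t,
    given N matched 3D feature points P and Q (each of shape N x 3).
    Closed-form least-squares solution (Kabsch/Horn method)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

With noise-free matches the true rotation and translation are recovered exactly; with noisy real-world feature matches the same formula yields the least-squares optimal rigid transform, which is one reason closed-form solutions of this family are a common building block in point-cloud odometry.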
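The abstract also states that the IMU sets up tracking gates for data association of features across consecutive frames. A minimal sketch of gated nearest-neighbour association, assuming IMU-propagated feature predictions and a fixed spherical gate (both are illustrative assumptions, not details taken from the paper):

```python
import numpy as np

def gated_associate(pred, obs, gate_radius):
    """Match predicted feature positions (e.g. propagated forward with IMU
    data) to observed features in the next frame. A candidate is accepted
    only if it falls inside the spherical tracking gate and has not already
    been claimed. Returns a list of (predicted_idx, observed_idx) pairs."""
    pairs = []
    used = set()
    for i, p in enumerate(pred):
        d = np.linalg.norm(obs - p, axis=1)    # distances to all observations
        j = int(np.argmin(d))                  # nearest observed feature
        if d[j] <= gate_radius and j not in used:
            pairs.append((i, j))
            used.add(j)
    return pairs
```

The gate serves the role described in the abstract: it restricts the search region for each feature so that spurious matches (and hence pose-estimation outliers) are rejected before the position and orientation changes are computed.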