DGNSS-Vision Integration for Robust and Accurate Relative Spacecraft Navigation

V. Capuano, A. Harvard, Yvette Lin, Soon-Jo Chung
{"title":"DGNSS-Vision Integration for Robust and Accurate Relative Spacecraft Navigation","authors":"V. Capuano, A. Harvard, Yvette Lin, Soon-Jo Chung","doi":"10.33012/2019.16961","DOIUrl":null,"url":null,"abstract":"Relative spacecraft navigation based on Global Navigation Satellite System (GNSS) has been already successfully performed in low earth orbit (LEO). Very high accuracy, of the order of the millimeter, has been achieved in postprocessing using carrier phase differential GNSS (CDGNSS) and recovering the integer number of wavelength (Ambiguity) \nbetween the GNSS transmitters and the receiver. However the performance achievable on-board, in real time, \nabove LEO and the GNSS constellation would be significantly lower due to limited computational resources, weaker \nsignals, and worse geometric dilution of precision (GDOP). At the same time, monocular vision provides lower accuracy \nthan CDGNSS when there is significant spacecraft separation, and it becomes even lower for larger baselines and wider field of views (FOVs). In order to increase the robustness, continuity, and accuracy of a real-time on-board \nGNSS-based relative navigation solution in a GNSS degraded environment such as Geosynchronous and High Earth \nOrbits, we propose a novel navigation architecture based on a tight fusion of carrier phase GNSS observations and \nmonocular vision-based measurements, which enables fast autonomous relative pose estimation of cooperative spacecraft \nalso in case of high GDOP and low GNSS visibility, where the GNSS signals are degraded, weak, or cannot be \ntracked continuously. \nIn this paper we describe the architecture and implementation of a multi-sensor navigation solution and validate the \nproposed method in simulation. 
We use a dataset of images synthetically generated according to a chaser/target relative \nmotion in Geostationary Earth Orbit (GEO) and realistic carrier phase and code-based GNSS observations simulated \nat the receiver position in the same orbits. We demonstrate that our fusion solution provides higher accuracy, higher \nrobustness, and faster ambiguity resolution in case of degraded GNSS signal conditions, even when using high FOV \ncameras.","PeriodicalId":381025,"journal":{"name":"Proceedings of the 32nd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2019)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 32nd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2019)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.33012/2019.16961","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 11

Abstract

Relative spacecraft navigation based on the Global Navigation Satellite System (GNSS) has already been performed successfully in low Earth orbit (LEO). Very high accuracy, on the order of a millimeter, has been achieved in post-processing using carrier-phase differential GNSS (CDGNSS) and recovering the integer number of wavelengths (the ambiguity) between the GNSS transmitters and the receiver. However, the performance achievable on board, in real time, above LEO and the GNSS constellation would be significantly lower due to limited computational resources, weaker signals, and worse geometric dilution of precision (GDOP). At the same time, monocular vision provides lower accuracy than CDGNSS when there is significant spacecraft separation, and it degrades further for larger baselines and wider fields of view (FOVs). To increase the robustness, continuity, and accuracy of a real-time, on-board, GNSS-based relative navigation solution in GNSS-degraded environments such as geosynchronous and high Earth orbits, we propose a novel navigation architecture based on a tight fusion of carrier-phase GNSS observations and monocular vision-based measurements, which enables fast autonomous relative pose estimation of cooperative spacecraft even under high GDOP and low GNSS visibility, where the GNSS signals are degraded, weak, or cannot be tracked continuously. In this paper we describe the architecture and implementation of a multi-sensor navigation solution and validate the proposed method in simulation. We use a dataset of images synthetically generated according to a chaser/target relative motion in geostationary Earth orbit (GEO), together with realistic carrier-phase and code-based GNSS observations simulated at the receiver positions in the same orbit. We demonstrate that our fusion solution provides higher accuracy, higher robustness, and faster ambiguity resolution under degraded GNSS signal conditions, even when using high-FOV cameras.
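The GDOP degradation the abstract refers to can be made concrete: GDOP is a scalar derived from the receiver-to-satellite geometry matrix, and above the GNSS constellation (e.g., in GEO) the visible satellites cluster in a narrow cone, which inflates it. The following is a minimal sketch of the standard GDOP computation, not code from the paper; the function name and the example geometry are illustrative.

```python
import numpy as np

def gdop(sat_positions, rx_position):
    """Geometric dilution of precision from satellite geometry.

    sat_positions: (N, 3) array of satellite ECEF positions [m]
    rx_position:   (3,) receiver ECEF position [m]
    """
    los = sat_positions - rx_position                       # line-of-sight vectors
    units = los / np.linalg.norm(los, axis=1, keepdims=True)
    # Geometry matrix: unit LOS rows augmented with 1 for the receiver clock term
    G = np.hstack([units, np.ones((len(units), 1))])
    Q = np.linalg.inv(G.T @ G)                              # cofactor matrix
    return float(np.sqrt(np.trace(Q)))

# Well-spread geometry: four satellites at the vertices of a regular
# tetrahedron around the receiver (GDOP = sqrt(2.5) ≈ 1.58).
sats = 2.0e7 * np.array([[1.0, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]])
print(gdop(sats, np.zeros(3)))
```

For a receiver in GEO looking down at the GNSS constellation, the unit line-of-sight vectors are nearly parallel, so `G.T @ G` becomes ill-conditioned and the computed GDOP grows by orders of magnitude, which is why the paper augments the carrier-phase solution with vision measurements.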