Landmark-aware autonomous odometry correction and map pruning for planetary rovers

Impact Factor 3.1 · CAS Tier 2 (Physics and Astrophysics) · JCR Q1 (Engineering, Aerospace)
Chenxi Lu, Meng Yu, Hua Li, Hutao Cui
{"title":"用于行星漫游车的地标感知自主里程测量校正和地图修剪","authors":"Chenxi Lu ,&nbsp;Meng Yu ,&nbsp;Hua Li ,&nbsp;Hutao Cui","doi":"10.1016/j.actaastro.2024.10.025","DOIUrl":null,"url":null,"abstract":"<div><div>Planetary rover autonomous localization is paramount for a planetary surface exploration mission. However, existing methods demonstrate limited localization accuracy, mostly due to the unstructured texture characterization of planetary surface. In response, this study presents a novel Neural Radiance Field (NeRF) driven visual odometry correction method that allows for high-precision 6-DoF rover pose estimation and local map pruning. First, an innovative image saliency evaluation approach, combining binarization and feature detection, is introduced to meticulously select landmarks that are conducive to rover re-localization. Subsequently, we conduct 3D reconstruction and rendering of the chosen landmarks based on <em>a-priori</em> knowledge of planetary surface images and their Neural Radiance Field (NeRF) models. High-precision odometry correction is achieved through the optimization of photometric loss between NeRF rending images and real images. Simultaneously, the odometry correction mechanism is employed in an autonomous manner to refine the NeRF model of the corresponding landmark, leading to an improved local map and gradually enhanced rover localization accuracy. Numerical simulation and experiment trials are carried out to evaluate the performance of the proposed method, results of which demonstrate state-of-the-art rover re-localization accuracy and local map pruning.</div></div>","PeriodicalId":44971,"journal":{"name":"Acta Astronautica","volume":"226 ","pages":"Pages 86-96"},"PeriodicalIF":3.1000,"publicationDate":"2024-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Landmark-aware autonomous odometry correction and map pruning for planetary rovers\",\"authors\":\"Chenxi Lu ,&nbsp;Meng Yu ,&nbsp;Hua Li ,&nbsp;Hutao Cui\",\"doi\":\"10.1016/j.actaastro.2024.10.025\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Planetary rover autonomous localization is paramount for a planetary surface exploration mission. However, existing methods demonstrate limited localization accuracy, mostly due to the unstructured texture characterization of planetary surface. In response, this study presents a novel Neural Radiance Field (NeRF) driven visual odometry correction method that allows for high-precision 6-DoF rover pose estimation and local map pruning. First, an innovative image saliency evaluation approach, combining binarization and feature detection, is introduced to meticulously select landmarks that are conducive to rover re-localization. Subsequently, we conduct 3D reconstruction and rendering of the chosen landmarks based on <em>a-priori</em> knowledge of planetary surface images and their Neural Radiance Field (NeRF) models. High-precision odometry correction is achieved through the optimization of photometric loss between NeRF rending images and real images. Simultaneously, the odometry correction mechanism is employed in an autonomous manner to refine the NeRF model of the corresponding landmark, leading to an improved local map and gradually enhanced rover localization accuracy. 
Numerical simulation and experiment trials are carried out to evaluate the performance of the proposed method, results of which demonstrate state-of-the-art rover re-localization accuracy and local map pruning.</div></div>\",\"PeriodicalId\":44971,\"journal\":{\"name\":\"Acta Astronautica\",\"volume\":\"226 \",\"pages\":\"Pages 86-96\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2024-10-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Acta Astronautica\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S009457652400599X\",\"RegionNum\":2,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, AEROSPACE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Astronautica","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S009457652400599X","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, AEROSPACE","Score":null,"Total":0}
Citations: 0

Abstract

Planetary rover autonomous localization is paramount for a planetary surface exploration mission. However, existing methods demonstrate limited localization accuracy, mostly due to the unstructured texture of the planetary surface. In response, this study presents a novel Neural Radiance Field (NeRF) driven visual odometry correction method that allows for high-precision 6-DoF rover pose estimation and local map pruning. First, an innovative image saliency evaluation approach, combining binarization and feature detection, is introduced to carefully select landmarks that are conducive to rover re-localization. Subsequently, we conduct 3D reconstruction and rendering of the chosen landmarks based on a priori knowledge of planetary surface images and their NeRF models. High-precision odometry correction is achieved by optimizing the photometric loss between NeRF-rendered images and real images. Simultaneously, the odometry correction mechanism is employed in an autonomous manner to refine the NeRF model of the corresponding landmark, leading to an improved local map and gradually enhanced rover localization accuracy. Numerical simulation and experimental trials are carried out to evaluate the performance of the proposed method; the results demonstrate state-of-the-art rover re-localization accuracy and local map pruning.
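
The landmark-selection step described above (combining binarization with feature detection to score image saliency) lends itself to a compact illustration. The sketch below scores a candidate image patch using an Otsu-threshold cue and an ORB keypoint-density cue; the choice of Otsu, ORB, and the equal weighting are illustrative assumptions, not the paper's exact criterion.

```python
# Hedged sketch: score a grayscale patch for landmark saliency by combining a
# binarization cue with a feature-detection cue. All weights and thresholds
# here are illustrative assumptions.

import cv2
import numpy as np

def saliency_score(gray: np.ndarray) -> float:
    """Return a scalar saliency score for an 8-bit grayscale image patch."""
    # Binarization cue: Otsu threshold, then measure how much foreground
    # structure the patch contains (all-dark or all-bright patches score low).
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    fg_ratio = np.count_nonzero(binary) / binary.size
    structure = min(fg_ratio, 1.0 - fg_ratio)

    # Feature cue: density of ORB keypoints per 10k pixels.
    orb = cv2.ORB_create(nfeatures=500)
    keypoints = orb.detect(gray, None)
    density = len(keypoints) / (gray.shape[0] * gray.shape[1] / 1e4)

    # Equal weighting of the two cues is an arbitrary assumption.
    return 0.5 * structure + 0.5 * min(density / 50.0, 1.0)
```

Patches whose score exceeds a chosen threshold would be retained as candidate landmarks for NeRF reconstruction; the threshold itself would have to be tuned for the terrain imagery at hand.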
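The core correction step, minimizing a photometric loss between a NeRF rendering of a landmark and the live camera image, can likewise be sketched. Here `render_landmark` is a hypothetical differentiable renderer standing in for the per-landmark NeRF model, and the Adam-based descent over an se(3) correction is an assumed optimizer choice; the paper's actual formulation may differ.

```python
# Hedged sketch: refine an odometry pose by gradient descent on the photometric
# (L1) loss between a differentiable NeRF rendering and the real camera image.

import torch

def se3_exp(xi: torch.Tensor) -> torch.Tensor:
    """Map a 6-vector (rho, phi) to a 4x4 transform: exact Rodrigues rotation,
    first-order (direct) translation."""
    rho, phi = xi[:3], xi[3:]
    wx = torch.zeros(3, 3, dtype=xi.dtype, device=xi.device)  # skew(phi)
    wx[0, 1], wx[0, 2] = -phi[2], phi[1]
    wx[1, 0], wx[1, 2] = phi[2], -phi[0]
    wx[2, 0], wx[2, 1] = -phi[1], phi[0]
    theta = torch.sqrt(phi.pow(2).sum() + 1e-12)          # safe near zero
    A = torch.sin(theta) / theta
    B = (1.0 - torch.cos(theta)) / theta ** 2
    R = torch.eye(3, dtype=xi.dtype, device=xi.device) + A * wx + B * (wx @ wx)
    T = torch.eye(4, dtype=xi.dtype, device=xi.device)
    T[:3, :3] = R
    T[:3, 3] = rho
    return T

def refine_pose(render_landmark, image, T_init, iters=200, lr=1e-2):
    """Refine the odometry pose T_init (4x4) against a real camera image.

    render_landmark(T) -> HxWx3 tensor rendered by the landmark NeRF at camera
    pose T; it must be differentiable with respect to T (hypothetical interface).
    """
    xi = torch.zeros(6, requires_grad=True)               # se(3) correction
    opt = torch.optim.Adam([xi], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        T = se3_exp(xi) @ T_init                          # left-applied correction
        loss = torch.nn.functional.l1_loss(render_landmark(T), image)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return se3_exp(xi) @ T_init
```

Optimizing a small correction on top of the odometry prior, rather than the full pose, keeps the photometric objective in a basin where gradient descent is likely to converge; this mirrors the "correction" framing in the abstract, though the actual optimizer and loss in the paper may differ.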
Source journal
Acta Astronautica
Category: Engineering & Technology - Engineering: Aerospace
CiteScore: 7.20
Self-citation rate: 22.90%
Annual articles: 599
Review time: 53 days
Journal description: Acta Astronautica is sponsored by the International Academy of Astronautics. Content is based on original contributions in all fields of basic, engineering, life and social space sciences and of space technology related to: the peaceful scientific exploration of space; its exploitation for human welfare and progress; and the conception, design, development and operation of space-borne and Earth-based systems. In addition to regular issues, the journal publishes selected proceedings of the annual International Astronautical Congress (IAC), transactions of the IAA and special issues on topics of current interest, such as microgravity, space station technology, geostationary orbits, and space economics. Other subject areas include satellite technology, space transportation and communications, space energy, power and propulsion, astrodynamics, extraterrestrial intelligence and Earth observations.