3D reconstruction in orbital proximity operations

Martin Dziura, Tim Wiese, J. Harder
{"title":"轨道近距离操作中的三维重建","authors":"Martin Dziura, Tim Wiese, J. Harder","doi":"10.1109/AERO.2017.7943679","DOIUrl":null,"url":null,"abstract":"This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising novel technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situation awareness, docking efficiency and resource consumption. During this study a software framework was developed which implements a flexible real-time-capable toolchain to perform all necessary tasks for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g. stereo camera or combined visual camera and infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. As an output these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms or further processed to generate adequate surface models for visualization and inspection. This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver a rotating target satellite was observed by the sensors attached to the simulated chasing satellite. The software was executed on the embedded computer which is part of the facility. The cameras Kinect v2 and ZED produced adequate 3D reconstructions in intervals of less than 10 seconds. The Kinect v2 generates more accurate structures and includes more details, whereas the ZED results in a better color fidelity. Both cameras were sensitive to changes of lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation decreases the quality of the reconstruction significantly.","PeriodicalId":224475,"journal":{"name":"2017 IEEE Aerospace Conference","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"3D reconstruction in orbital proximity operations\",\"authors\":\"Martin Dziura, Tim Wiese, J. Harder\",\"doi\":\"10.1109/AERO.2017.7943679\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising novel technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situation awareness, docking efficiency and resource consumption. During this study a software framework was developed which implements a flexible real-time-capable toolchain to perform all necessary tasks for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g. stereo camera or combined visual camera and infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. 
As an output these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms or further processed to generate adequate surface models for visualization and inspection. This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver a rotating target satellite was observed by the sensors attached to the simulated chasing satellite. The software was executed on the embedded computer which is part of the facility. The cameras Kinect v2 and ZED produced adequate 3D reconstructions in intervals of less than 10 seconds. The Kinect v2 generates more accurate structures and includes more details, whereas the ZED results in a better color fidelity. Both cameras were sensitive to changes of lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation decreases the quality of the reconstruction significantly.\",\"PeriodicalId\":224475,\"journal\":{\"name\":\"2017 IEEE Aerospace Conference\",\"volume\":\"48 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-03-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE Aerospace Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AERO.2017.7943679\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE Aerospace Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AERO.2017.7943679","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising novel technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situation awareness, docking efficiency and resource consumption. During this study, a software framework was developed that implements a flexible, real-time-capable toolchain to perform all necessary tasks for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g. a stereo camera, or a combined visual camera and infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. As an output, these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms, or further processed to generate adequate surface models for visualization and inspection. This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver, the rotating target satellite was observed by the sensors attached to the simulated chasing satellite, and the software was executed on the embedded computer that is part of the facility. The Kinect v2 and ZED cameras produced adequate 3D reconstructions at intervals of less than 10 seconds. The Kinect v2 generates more accurate structures and captures more detail, whereas the ZED achieves better color fidelity. Both cameras were sensitive to changes in lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation significantly degrades the quality of the reconstruction.
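To make the data flow described in the abstract concrete, below is a minimal, self-contained sketch (not the authors' framework) of two of the toolchain stages: a driver-style filter that masks implausible depth readings, and the back-projection of a filtered depth map into a 3D point cloud via the standard pinhole camera model. The intrinsics, depth range, and frame size are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of two toolchain stages described in the abstract:
# (1) driver-style filtering of a raw depth frame, (2) back-projection of the
# depth map into a 3D point cloud. All numeric parameters are illustrative
# assumptions, not values taken from the paper or from a specific sensor SDK.
import numpy as np

# Assumed pinhole intrinsics of a depth sensor (hypothetical values).
FX, FY = 365.0, 365.0            # focal lengths in pixels
CX, CY = 256.0, 212.0            # principal point in pixels
DEPTH_MIN, DEPTH_MAX = 0.3, 8.0  # assumed valid depth range in metres


def filter_depth(depth_m: np.ndarray) -> np.ndarray:
    """Driver-style filtering: zero out depth readings outside the valid range."""
    valid = (depth_m > DEPTH_MIN) & (depth_m < DEPTH_MAX)
    return np.where(valid, depth_m, 0.0)


def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (metres) into an N x 3 point cloud.

    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels


if __name__ == "__main__":
    # Stand-in for one frame delivered by the sensor driver.
    fake_depth = np.random.uniform(0.0, 10.0, size=(424, 512))
    cloud = depth_to_point_cloud(filter_depth(fake_depth))
    print(f"reconstructed {cloud.shape[0]} points from one frame")
```

In the full toolchain, successive per-frame clouds would additionally be registered into a common frame by the SLAM module and fused into a mesh; as the abstract notes, pose-estimation drift over long acquisitions is the main factor degrading that fused reconstruction.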