{"title":"轨道近距离操作中的三维重建","authors":"Martin Dziura, Tim Wiese, J. Harder","doi":"10.1109/AERO.2017.7943679","DOIUrl":null,"url":null,"abstract":"This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising novel technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situation awareness, docking efficiency and resource consumption. During this study a software framework was developed which implements a flexible real-time-capable toolchain to perform all necessary tasks for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g. stereo camera or combined visual camera and infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. As an output these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms or further processed to generate adequate surface models for visualization and inspection. This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver a rotating target satellite was observed by the sensors attached to the simulated chasing satellite. The software was executed on the embedded computer which is part of the facility. The cameras Kinect v2 and ZED produced adequate 3D reconstructions in intervals of less than 10 seconds. 
The Kinect v2 generates more accurate structures and includes more details, whereas the ZED results in a better color fidelity. Both cameras were sensitive to changes of lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation decreases the quality of the reconstruction significantly.","PeriodicalId":224475,"journal":{"name":"2017 IEEE Aerospace Conference","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"3D reconstruction in orbital proximity operations\",\"authors\":\"Martin Dziura, Tim Wiese, J. Harder\",\"doi\":\"10.1109/AERO.2017.7943679\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising novel technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situation awareness, docking efficiency and resource consumption. During this study a software framework was developed which implements a flexible real-time-capable toolchain to perform all necessary tasks for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g. stereo camera or combined visual camera and infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. As an output these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms or further processed to generate adequate surface models for visualization and inspection. 
This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver a rotating target satellite was observed by the sensors attached to the simulated chasing satellite. The software was executed on the embedded computer which is part of the facility. The cameras Kinect v2 and ZED produced adequate 3D reconstructions in intervals of less than 10 seconds. The Kinect v2 generates more accurate structures and includes more details, whereas the ZED results in a better color fidelity. Both cameras were sensitive to changes of lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation decreases the quality of the reconstruction significantly.\",\"PeriodicalId\":224475,\"journal\":{\"name\":\"2017 IEEE Aerospace Conference\",\"volume\":\"48 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-03-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE Aerospace Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AERO.2017.7943679\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE Aerospace 
Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AERO.2017.7943679","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
This paper presents the application of 3D object reconstruction in orbital proximity operations. This promising technology is proposed to improve both Human Machine Interfaces (HMI) and autonomous algorithms for Guidance, Navigation and Control (GNC) in terms of situational awareness, docking efficiency, and resource consumption. During this study, a software framework was developed that implements a flexible, real-time-capable toolchain to perform all tasks necessary for 3D object reconstruction. A driver module reads and filters the data stream from a given optical sensor (e.g., a stereo camera, or a visual camera combined with an infrared time-of-flight sensor). Image maps and depth information are then provided to computer vision algorithms for Simultaneous Localization and Mapping (SLAM) and algorithms for 3D reconstruction. As output, these algorithms generate a 3D point cloud and a 3D mesh that can be displayed to the human operator, fed into GNC algorithms, or further processed to generate adequate surface models for visualization and inspection. This concept was verified in the Robotic Actuation and On-Orbit Navigation Laboratory (RACOON-Lab), a simulation environment for end-to-end technology development and evaluation for close-range proximity operations. A sub-scale hardware mock-up of a geostationary target satellite attached to the RACOON-Lab facility was successfully reconstructed using the described setup. During the simulated maneuver, the rotating target satellite was observed by the sensors attached to the simulated chasing satellite. The software was executed on the embedded computer that is part of the facility. The Kinect v2 and ZED cameras both produced adequate 3D reconstructions in intervals of less than 10 seconds. The Kinect v2 generates more accurate structures and captures more detail, whereas the ZED yields better color fidelity. Both cameras were sensitive to changes in lighting conditions. For longer acquisition times, drift caused by uncertainties in the pose estimation significantly degrades the quality of the reconstruction.
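The toolchain the abstract describes — a driver module feeding filtered sensor frames to a SLAM stage, whose pose estimates drive integration into a global point cloud — can be sketched as follows. This is a minimal illustrative skeleton, not the authors' framework: every class and function name is hypothetical, the SLAM tracker is a trivial dead-reckoning stub, and the "depth map" is a flat list of samples rather than a real image.

```python
# Hypothetical sketch of the driver -> SLAM -> reconstruction pipeline
# described in the abstract. All names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Frame:
    """One RGB-D frame from the driver module (color and depth stubs)."""
    rgb: list = field(default_factory=list)    # placeholder for the color image
    depth: list = field(default_factory=list)  # placeholder for depth samples


@dataclass
class Pose:
    """Estimated sensor pose (translation only, for brevity)."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0


class SensorDriver:
    """Reads and filters the raw sensor stream (stubbed with canned frames)."""
    def __init__(self, frames):
        self._frames = frames

    def stream(self):
        yield from self._frames


class Slam:
    """Tracks the sensor pose frame-to-frame (dead-reckoning stand-in)."""
    def __init__(self):
        self.pose = Pose()

    def track(self, frame):
        self.pose.x += 0.1  # stand-in for real pose estimation
        return self.pose


class Reconstructor:
    """Fuses depth samples into a global point cloud using the SLAM pose."""
    def __init__(self):
        self.cloud = []

    def integrate(self, frame, pose):
        for d in frame.depth:
            self.cloud.append((pose.x, pose.y, pose.z + d))


def run_pipeline(frames):
    driver, slam, recon = SensorDriver(frames), Slam(), Reconstructor()
    for frame in driver.stream():
        pose = slam.track(frame)        # pose estimate for this frame
        recon.integrate(frame, pose)    # fuse depth into the global cloud
    return recon.cloud


frames = [Frame(depth=[1.0, 1.2]) for _ in range(3)]
cloud = run_pipeline(frames)
print(len(cloud))  # 3 frames x 2 depth samples = 6 points
```

Note how the structure also exposes the failure mode the abstract reports: any error in `Slam.track` is baked into every point integrated afterward, so pose-estimation drift accumulates directly into the reconstruction over longer acquisitions.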