FastFusion: Deep stereo-LiDAR fusion for real-time high-precision dense depth sensing
Haitao Meng, Changcai Li, Chonghao Zhong, Jianfeng Gu, Gang Chen, Alois Knoll
Journal of Field Robotics, Volume 40, Issue 7, pages 1804-1816
Published: 2023-06-08 (Journal Article)
DOI: 10.1002/rob.22216 — https://onlinelibrary.wiley.com/doi/10.1002/rob.22216
Citations: 0
Abstract
Light detection and ranging (LiDAR) and stereo cameras are two widely used solutions for perceiving 3D information. The complementary properties of these two sensor modalities motivate a fusion to derive practicable depth sensing for real-world applications. Driven by deep neural network (DNN) techniques, recent works achieve superior accuracy. However, the complex architectures and sheer number of DNN parameters often lead to poor generalization capacity and non-real-time computing. In this paper, we present FastFusion, a three-stage stereo-LiDAR deep fusion scheme, which integrates LiDAR priors into each step of the classical stereo-matching taxonomy, achieving high-precision dense depth sensing in real time. We integrate stereo-LiDAR information by taking advantage of a compact binary neural network, and utilize the proposed cross-based LiDAR trust aggregation to further fuse the sparse LiDAR measurements in the back-end of stereo matching. To align the photometric appearance of the input image with the estimated depth, we introduce a refinement network that guarantees consistency. More importantly, we present a graphics processing unit (GPU)-based acceleration framework that provides a low-latency implementation of FastFusion, gaining both an accuracy improvement and real-time responsiveness. In the experiments, we demonstrate the effectiveness and practicability of FastFusion, which obtains a significant speedup over state-of-the-art baselines while achieving comparable accuracy on depth sensing. A video demo of FastFusion's real-time depth estimation in a real-world driving scenario is available at https://youtu.be/nP7cls2BA8s.
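The abstract's core idea of fusing sparse LiDAR measurements into the back-end of stereo matching can be illustrated with a toy sketch. This is not the paper's cross-based LiDAR trust aggregation; it is a minimal, assumed illustration in which the matching cost of a stereo cost volume is simply discounted near each LiDAR-measured disparity with a Gaussian trust profile, so that winner-take-all disparity selection is pulled toward the LiDAR prior. The function names, the Gaussian weighting, and the parameters `sigma` and `weight` are all illustrative assumptions, not FastFusion's actual formulation.

```python
import numpy as np

def fuse_lidar_prior(cost_volume, lidar_disp, sigma=1.0, weight=0.5):
    """Bias a stereo cost volume toward sparse LiDAR disparities.

    Illustrative sketch only -- FastFusion's cross-based LiDAR trust
    aggregation is more elaborate. Here we lower the matching cost near
    each LiDAR-measured disparity with a Gaussian trust profile.

    cost_volume: (H, W, D) array of matching costs, lower = better match.
    lidar_disp:  (H, W) array of LiDAR disparities, NaN where no return exists.
    """
    H, W, D = cost_volume.shape
    disp_axis = np.arange(D, dtype=np.float64)      # candidate disparities 0..D-1
    fused = cost_volume.astype(np.float64).copy()
    ys, xs = np.nonzero(~np.isnan(lidar_disp))      # pixels with a LiDAR measurement
    for y, x in zip(ys, xs):
        # Gaussian trust centred on the LiDAR disparity: ~1 at the
        # measurement, falling off with distance in disparity space.
        trust = np.exp(-0.5 * ((disp_axis - lidar_disp[y, x]) / sigma) ** 2)
        fused[y, x] *= (1.0 - weight * trust)       # discount cost near the prior
    return fused

def wta_disparity(cost_volume):
    """Winner-take-all: pick the lowest-cost disparity per pixel."""
    return np.argmin(cost_volume, axis=2)
```

In this toy setting, a pixel whose raw stereo cost favors one disparity can be flipped to the LiDAR-measured disparity once the prior's discount outweighs the stereo evidence, which conveys the qualitative role of a fusion back-end.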
Journal description:
The Journal of Field Robotics seeks to promote scholarly publications dealing with the fundamentals of robotics in unstructured and dynamic environments.
The Journal focuses on experimental robotics and encourages publication of work that has both theoretical and practical significance.