MonoPCFlow: Enabling Efficient Scene Flow Estimation From Monocular View
Chichao Cheng; Guangming Wang; Yin-Dong Zheng; Lu Liu; Hesheng Wang
IEEE Transactions on Instrumentation and Measurement, vol. 74, pp. 1-10
DOI: 10.1109/TIM.2025.3600732 | Published: 2025-09-03
https://ieeexplore.ieee.org/document/11147169/
Citations: 0
Abstract
Scene flow captures the dynamic changes of points in a 3-D scene and is essential for understanding motion in physical environments. Light detection and ranging (LiDAR)-based scene flow estimation methods face challenges related to resolution, refresh rate, and cost. In contrast, monocular image-based methods estimate optical flow and depth separately at different stages. This fragmented approach inevitably compromises spatial-temporal consistency and introduces error accumulation. We propose monocular point cloud FlowNet (MonoPCFlow), a novel framework for scene flow estimation directly from a pair of consecutive monocular images. We integrate pseudo-LiDAR representations with dense 3-D scene flow estimation, effectively bridging the 2-D-to-3-D domain gap for monocular motion analysis. We develop a depth-enhanced refinement module that mitigates information loss in pseudo-LiDAR generation, preserving critical geometric and appearance features to improve scene flow accuracy. Experimental validation demonstrates MonoPCFlow's superior performance, achieving relative end-point error (EPE) reductions of 37.0% on FlyingThings3D and 39.7% on KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) compared to contemporary benchmarks.
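The abstract's pseudo-LiDAR step refers to the common practice of back-projecting a predicted dense depth map into a 3-D point cloud using the camera intrinsics, so that a point-cloud scene flow network can operate on monocular input. The sketch below illustrates that generic back-projection only; the function name and interface are assumptions for illustration, not MonoPCFlow's actual module, and the paper's depth-enhanced refinement is not reproduced here.

```python
import numpy as np

def depth_to_pseudo_lidar(depth, fx, fy, cx, cy):
    """Back-project a dense depth map of shape (H, W) into an (N, 3) pseudo-LiDAR
    point cloud via the pinhole camera model. Generic sketch, not the paper's code."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinate grid
    z = depth
    x = (u - cx) * z / fx  # X in camera coordinates
    y = (v - cy) * z / fy  # Y in camera coordinates
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid / zero-depth pixels

# Two consecutive frames' depth maps yield two pseudo point clouds; a dense 3-D
# scene flow is then a per-point displacement estimated between these clouds.
```

In this formulation, errors in the predicted depth propagate directly into the pseudo point cloud, which is why the paper pairs the conversion with a refinement module rather than estimating optical flow and depth in separate, disconnected stages.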
Journal Description
Papers are sought that address innovative solutions to the development and use of electrical and electronic instruments and equipment to measure, monitor and/or record physical phenomena for the purpose of advancing measurement science, methods, functionality and applications. The scope of these papers may encompass: (1) theory, methodology, and practice of measurement; (2) design, development and evaluation of instrumentation and measurement systems and components used in generating, acquiring, conditioning and processing signals; (3) analysis, representation, display, and preservation of the information obtained from a set of measurements; and (4) scientific and technical support to the establishment and maintenance of technical standards in the field of Instrumentation and Measurement.