Bootstrapped real-time ego motion estimation and scene modeling

Xiang Zhang, Yakup Genç
{"title":"Bootstrapped real-time ego motion estimation and scene modeling","authors":"Xiang Zhang, Yakup Genç","doi":"10.1109/3DIM.2005.25","DOIUrl":null,"url":null,"abstract":"Estimating the motion of a moving camera in an unknown environment is essential for a number of applications ranging from as-built reconstruction to augmented reality. It is a challenging problem especially when real-time performance is required. Our approach is to estimate the camera motion while reconstructing the shape and appearance of the most salient visual features in the scene. In our 3D reconstruction process, correspondences are obtained by tracking the visual features from frame to frame with optical flow tracking. Optical-flow-based tracking methods have limitations in tracking the salient features. Often larger translational motions and even moderate rotational motions can result in drifts. We propose to augment flow-based tracking by building a landmark representation around reliably reconstructed features. A planar patch around the reconstructed feature point provides matching information that prevents drifts in flow-based feature tracking and allows establishment of correspondences across the frames with large baselines. Selective and periodic such correspondence mappings drastically improve scene and motion reconstruction while adhering to the real-time requirements. 
The method is experimentally tested to be both accurate and computational efficient.","PeriodicalId":170883,"journal":{"name":"Fifth International Conference on 3-D Digital Imaging and Modeling (3DIM'05)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Fifth International Conference on 3-D Digital Imaging and Modeling (3DIM'05)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/3DIM.2005.25","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 2

Abstract

Estimating the motion of a moving camera in an unknown environment is essential for a number of applications, ranging from as-built reconstruction to augmented reality. This is a challenging problem, especially when real-time performance is required. Our approach estimates the camera motion while reconstructing the shape and appearance of the most salient visual features in the scene. In our 3D reconstruction process, correspondences are obtained by tracking the visual features from frame to frame with optical flow. Optical-flow-based tracking methods have limitations in tracking salient features: even moderate rotational motions, and larger translational motions in particular, can result in drift. We propose to augment flow-based tracking by building a landmark representation around reliably reconstructed features. A planar patch around each reconstructed feature point provides matching information that prevents drift in flow-based feature tracking and allows correspondences to be established across frames with large baselines. Applying such correspondence mappings selectively and periodically drastically improves scene and motion reconstruction while adhering to real-time requirements. Experiments show the method to be both accurate and computationally efficient.
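The drift-correction idea in the abstract, re-localizing a tracked feature by matching a stored planar patch around the current flow estimate, can be sketched as a normalized cross-correlation (NCC) search. This is an illustrative reconstruction, not the paper's implementation: the function name `ncc_match`, the search-window parameterization, and the synthetic data are all assumptions for the sketch.

```python
import numpy as np

def ncc_match(frame, patch, guess, radius):
    """Re-localize a landmark patch near a (possibly drifted) flow estimate.

    frame : 2D grayscale image array
    patch : 2D template stored when the landmark was first reconstructed
    guess : (row, col) top-left position predicted by optical flow
    radius: half-width of the square search window around the guess

    Returns the best-matching top-left position and its NCC score in [-1, 1].
    """
    ph, pw = patch.shape
    gy, gx = guess
    p = patch - patch.mean()           # zero-mean template
    pn = np.linalg.norm(p)
    best_score, best_pos = -2.0, guess
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = gy + dy, gx + dx
            # skip candidate windows that fall outside the frame
            if y < 0 or x < 0 or y + ph > frame.shape[0] or x + pw > frame.shape[1]:
                continue
            win = frame[y:y + ph, x:x + pw]
            w = win - win.mean()       # zero-mean candidate window
            denom = pn * np.linalg.norm(w)
            score = float((p * w).sum() / denom) if denom > 1e-9 else -1.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

# Synthetic check: plant a patch, offset the "flow" guess by drift,
# and verify the NCC search snaps back to the true location.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
patch = frame[20:28, 30:38].copy()     # landmark patch stored earlier
pos, score = ncc_match(frame, patch, guess=(22, 28), radius=5)
print(pos, round(score, 3))
```

In a full system this search would run only periodically on selected landmarks (as the abstract suggests), with the patch warped by the estimated camera pose to account for the planar landmark's appearance change before matching.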