{"title":"Estimating camera motion precisely from omni-directional images","authors":"Shigang Li, M. Chiba, S. Tsuji","doi":"10.1109/IROS.1994.407472","DOIUrl":null,"url":null,"abstract":"A human being often looks around to locate him/herself from the arrangement of surrounding objects. Here, the authors propose an idea to determine camera motion by line correspondence in omni-directional images. An omni-directional image has a 360 degree view and contains much more information than those taken by a conventional imaging method. Since the camera motion is estimated from line displacements in images and the line displacements change due to line arrangement in space and camera motion, an omni-directional view makes it possible to select a better line combination for estimating more accurate motion. Experimental results show that camera motion can be estimated more accurately using lines surrounding the camera in omni-directional images than using lines in a narrow view. First, the authors use lines surrounding the camera to determine the camera rotational component and find the true solution from multiple ones by voting from multiple line combinations. Then, the authors select an optimal line combination to estimate the translation component by examining their linear independence. Simulation and real-world experimental results are given.<<ETX>>","PeriodicalId":437805,"journal":{"name":"Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'94)","volume":"105 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'94)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IROS.1994.407472","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
A human being often looks around to locate him/herself from the arrangement of surrounding objects. Here, the authors propose a method for determining camera motion from line correspondences in omni-directional images. An omni-directional image has a 360 degree view and contains far more information than an image taken with a conventional camera. Since camera motion is estimated from line displacements in the images, and these displacements depend on both the spatial arrangement of the lines and the camera motion, an omni-directional view makes it possible to select a better line combination and thus estimate the motion more accurately. Experimental results show that camera motion can be estimated more accurately using lines surrounding the camera in omni-directional images than using lines in a narrow view. First, the authors use the lines surrounding the camera to determine the rotational component of the camera motion, finding the true solution among multiple candidates by voting over multiple line combinations. Then, they select an optimal line combination for estimating the translational component by examining its linear independence. Simulation and real-world experimental results are given.
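The sketch below is only a rough illustration of the two steps named in the abstract, not the authors' actual algorithm, whose details are not given here. It assumes each line is observed as a unit direction vector in the camera frame before and after the motion; candidate rotations are computed from pairs of corresponding directions, a simple consensus vote picks the rotation supported by the most lines, and the best-conditioned line triple (largest smallest singular value, i.e. strongest linear independence) is selected for the translation step. Helper names such as `rotation_from_pair` are hypothetical.

```python
# Minimal sketch of rotation voting and line-combination selection (assumed,
# simplified stand-ins for the procedures described in the abstract).
import itertools
import numpy as np


def rotation_from_pair(a1, a2, b1, b2):
    """Rotation aligning directions (a1, a2) to (b1, b2) via orthogonal Procrustes."""
    A = np.column_stack([a1, a2, np.cross(a1, a2)])
    B = np.column_stack([b1, b2, np.cross(b1, b2)])
    U, _, Vt = np.linalg.svd(B @ A.T)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against reflection
    return U @ D @ Vt


def vote_rotation(dirs_before, dirs_after, angle_tol_deg=2.0):
    """Pick the candidate rotation consistent with the most line directions."""
    candidates = [rotation_from_pair(dirs_before[i], dirs_before[j],
                                     dirs_after[i], dirs_after[j])
                  for i, j in itertools.combinations(range(len(dirs_before)), 2)]
    cos_tol = np.cos(np.deg2rad(angle_tol_deg))

    def support(R):
        # Count lines whose rotated direction agrees with the observation.
        return sum(float(np.dot(R @ d0, d1)) > cos_tol
                   for d0, d1 in zip(dirs_before, dirs_after))

    return max(candidates, key=support)


def best_conditioned_triple(dirs):
    """Choose the triple of line directions that is most linearly independent."""
    def min_singular_value(idx):
        M = np.column_stack([dirs[i] for i in idx])
        return np.linalg.svd(M, compute_uv=False)[-1]

    return max(itertools.combinations(range(len(dirs)), 3),
               key=min_singular_value)
```

In an omni-directional image the candidate lines surround the camera, so the selection step has a much larger pool of well-separated directions to choose from than in a narrow field of view, which is the intuition the paper exploits.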