{"title":"Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots","authors":"T. Yata, A. Ohya, S. Yuta","doi":"10.1109/ROBOT.2000.845343","DOIUrl":null,"url":null,"abstract":"This paper propose a new method of sensor fusion of an omni-directional ultrasonic sensor and an omnidirectional vision sensor. A new omni-directional sonar, which we developed, can measure accurate distance and direction of reflecting points, and an omnidirectional vision can give direction to edges of segment. We propose a sensor fusion method using both the reflecting points measured by the sonar and the edges of segment measured by the vision, based on the angles. Those data are different in character, so they compensate each other in the proposed method, and it becomes possible to get better information which are useful for environment recognition of mobile robots. We describe the proposed method and an experimental result to show its potential.","PeriodicalId":286422,"journal":{"name":"Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2000-12-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROBOT.2000.845343","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 14
Abstract
This paper proposes a new method for sensor fusion of an omni-directional ultrasonic sensor and an omni-directional vision sensor. A new omni-directional sonar, which we developed, can measure the distance and direction of reflecting points accurately, and an omni-directional vision sensor can give the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data differ in character, so in the proposed method they compensate for each other, making it possible to obtain richer information that is useful for environment recognition of mobile robots. We describe the proposed method and an experimental result that demonstrates its potential.
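To make the angle-based fusion concrete, here is a minimal illustrative sketch, not the authors' implementation: it pairs each vision edge bearing with the sonar reflecting point nearest in angle, so the vision contributes a sharp bearing and the sonar contributes the range that vision alone cannot measure. The data structures, the nearest-angle matching rule, and the 3-degree gating threshold are all assumptions made for illustration; the paper does not specify them in the abstract.

```python
# Hypothetical sketch of angle-based fusion of omni-directional sonar
# and omni-directional vision. All names, the matching rule, and the
# threshold are illustrative assumptions, not the paper's method.

import math
from dataclasses import dataclass


@dataclass
class SonarPoint:
    bearing: float   # direction of a reflecting point, radians
    range_m: float   # measured distance to the reflecting point, meters


@dataclass
class VisionEdge:
    bearing: float   # direction of a segment edge, radians (no range)


def angular_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in radians."""
    d = (a - b + math.pi) % (2.0 * math.pi) - math.pi
    return abs(d)


def fuse_by_angle(sonar: list[SonarPoint],
                  edges: list[VisionEdge],
                  max_diff: float = math.radians(3.0)) -> list[tuple[float, float]]:
    """Pair each vision edge with the sonar point nearest in angle.

    Returns (bearing, range) landmark estimates: the edge supplies the
    precise bearing, the matched reflecting point supplies the range.
    Edges with no sonar point within max_diff are left unfused.
    """
    fused = []
    for edge in edges:
        best = min(sonar,
                   key=lambda p: angular_diff(p.bearing, edge.bearing),
                   default=None)
        if best is not None and angular_diff(best.bearing, edge.bearing) <= max_diff:
            fused.append((edge.bearing, best.range_m))
    return fused


if __name__ == "__main__":
    sonar = [SonarPoint(math.radians(10.0), 1.5),
             SonarPoint(math.radians(95.0), 2.2)]
    edges = [VisionEdge(math.radians(11.0)),
             VisionEdge(math.radians(200.0))]
    # Only the first edge has a sonar point within 3 degrees, so one
    # fused (bearing, range) landmark is produced.
    print(fuse_by_angle(sonar, edges))
```

The gating by angular distance reflects the complementary character of the two sensors described above: each match yields a landmark with both an accurate direction and an accurate range, which neither sensor provides on its own.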