{"title":"基于实时视觉的室内自主移动机器人导航与三维深度估计","authors":"F Vázquez, E Paz, R Marín","doi":"10.1016/0066-4138(94)90069-8","DOIUrl":null,"url":null,"abstract":"<div><p>A Transputer based artificial vision system that allows estimating a mobile robot's absolute position (navigation), it's motion parameters (egomotion) and a depth map of the sight of view is presented. Navigation is achieved by tracking expected features present in basic 3D CAD information of the environment. Egomotion and relative depth are estimated using optic flow calculated on closed contours of points in the scene presenting high spatio-temporal gradients and require no a-priori knowledge.</p></div>","PeriodicalId":100097,"journal":{"name":"Annual Review in Automatic Programming","volume":"19 ","pages":"Pages 221-226"},"PeriodicalIF":0.0000,"publicationDate":"1994-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/0066-4138(94)90069-8","citationCount":"3","resultStr":"{\"title\":\"Real-time vision-based navigation and 3D depth estimation for an indoor autonomous mobile robot\",\"authors\":\"F Vázquez, E Paz, R Marín\",\"doi\":\"10.1016/0066-4138(94)90069-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>A Transputer based artificial vision system that allows estimating a mobile robot's absolute position (navigation), it's motion parameters (egomotion) and a depth map of the sight of view is presented. Navigation is achieved by tracking expected features present in basic 3D CAD information of the environment. Egomotion and relative depth are estimated using optic flow calculated on closed contours of points in the scene presenting high spatio-temporal gradients and require no a-priori knowledge.</p></div>\",\"PeriodicalId\":100097,\"journal\":{\"name\":\"Annual Review in Automatic Programming\",\"volume\":\"19 \",\"pages\":\"Pages 221-226\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1016/0066-4138(94)90069-8\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Annual Review in Automatic Programming\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/0066413894900698\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Annual Review in Automatic Programming","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/0066413894900698","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Real-time vision-based navigation and 3D depth estimation for an indoor autonomous mobile robot
A Transputer-based artificial vision system is presented that estimates a mobile robot's absolute position (navigation), its motion parameters (egomotion), and a depth map of the field of view. Navigation is achieved by tracking expected features present in basic 3D CAD information of the environment. Egomotion and relative depth are estimated from optic flow computed on closed contours of scene points exhibiting high spatio-temporal gradients, and require no a priori knowledge.
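The abstract does not detail the flow-to-depth computation, but the general idea can be illustrated with a minimal sketch. The code below is not the paper's Transputer implementation: it assumes pure forward translation with the focus of expansion (FOE) at the image centre, uses OpenCV's dense Farneback flow as a stand-in for the paper's contour-based flow, and all function names and thresholds are illustrative placeholders.

```python
# Hedged sketch: relative (up-to-scale) depth from optic flow under an
# assumed pure forward translation. Not the authors' algorithm.
import numpy as np
import cv2


def relative_depth_from_flow(frame_prev, frame_next):
    """Return an up-to-scale depth map estimated from two consecutive frames."""
    prev_gray = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(frame_next, cv2.COLOR_BGR2GRAY)

    # Dense optic flow (stand-in for the paper's flow on closed contours).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    foe_x, foe_y = w / 2.0, h / 2.0  # assumed FOE: image centre

    # For translation along the optical axis, flow magnitude at a pixel is
    # proportional to its distance from the FOE divided by its depth, so
    # relative depth ~ |p - FOE| / |flow(p)|.
    dist_to_foe = np.hypot(xs - foe_x, ys - foe_y)
    flow_mag = np.hypot(flow[..., 0], flow[..., 1])
    rel_depth = dist_to_foe / (flow_mag + 1e-6)

    # Keep only points with strong spatial gradients, loosely mirroring the
    # paper's restriction to high spatio-temporal gradient points.
    grad = cv2.Laplacian(prev_gray, cv2.CV_32F)
    rel_depth[np.abs(grad) < 5.0] = 0.0
    return rel_depth
```

The depth values are only defined up to an unknown scale factor, since monocular flow alone cannot recover the translational speed; this matches the abstract's claim of estimating relative, not absolute, depth.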