{"title":"实时定位和地图与可穿戴的主动视觉","authors":"A. Davison, W. Mayol-Cuevas, D. W. Murray","doi":"10.1109/ISMAR.2003.1240684","DOIUrl":null,"url":null,"abstract":"We present a general method for real-time, vision-only single-camera simultaneous localisation and mapping (SLAM) - an algorithm which is applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or \"management\" of a workspace which the robot needs to have in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment the wearer is working in.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"137","resultStr":"{\"title\":\"Real-time localization and mapping with wearable active vision\",\"authors\":\"A. Davison, W. Mayol-Cuevas, D. W. Murray\",\"doi\":\"10.1109/ISMAR.2003.1240684\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a general method for real-time, vision-only single-camera simultaneous localisation and mapping (SLAM) - an algorithm which is applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or \\\"management\\\" of a workspace which the robot needs to have in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment the wearer is working in.\",\"PeriodicalId\":296266,\"journal\":{\"name\":\"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. 
Proceedings.\",\"volume\":\"59 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-10-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"137\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISMAR.2003.1240684\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2003.1240684","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Real-time localization and mapping with wearable active vision
We present a general method for real-time, vision-only single-camera simultaneous localisation and mapping (SLAM) - an algorithm which is applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or "management" of a workspace which the robot needs to have in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment the wearer is working in.
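To make the idea concrete, below is a minimal, heavily simplified sketch of an EKF-style single-camera SLAM step in the spirit of the paper's approach. It is an illustration under stated assumptions, not the authors' implementation: the camera state here is 3D position plus velocity only (the paper estimates full 6-DoF pose), orientation is held at identity, the focal length is a hypothetical value, the Jacobian is computed numerically, and feature initialisation, active search, and the real fixation controller are all omitted. The small `fixation_angles` helper at the end is likewise hypothetical; it only shows how a pan/tilt target for the active camera falls out of the estimated camera position and a mapped 3D point.

```python
# Minimal sketch of an EKF-style single-camera SLAM step (simplified;
# see the caveats in the lead-in paragraph). Requires numpy.
import numpy as np

F_PIX = 500.0  # assumed focal length in pixels (hypothetical value)

def project(cam_pos, point):
    """Pinhole projection of a 3D point into a camera at cam_pos
    (identity orientation assumed for brevity)."""
    rel = point - cam_pos
    return F_PIX * np.array([rel[0] / rel[2], rel[1] / rel[2]])

def predict(x, P, dt, accel_noise=1.0):
    """Constant-velocity motion model: camera drifts, map points stay fixed."""
    n = len(x)
    F = np.eye(n)
    F[0:3, 3:6] = dt * np.eye(3)                       # position += velocity * dt
    Q = np.zeros((n, n))
    Q[3:6, 3:6] = (accel_noise * dt) ** 2 * np.eye(3)  # process noise on velocity
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, feat_idx, pix_noise=2.0):
    """EKF update with one measured image feature (numerical Jacobian)."""
    def h(state):
        cam = state[0:3]
        pt = state[6 + 3 * feat_idx : 9 + 3 * feat_idx]
        return project(cam, pt)
    n, eps = len(x), 1e-6
    h0 = h(x)
    H = np.zeros((2, n))
    for i in range(n):                   # finite-difference measurement Jacobian
        dx = np.zeros(n); dx[i] = eps
        H[:, i] = (h(x + dx) - h0) / eps
    R = pix_noise ** 2 * np.eye(2)
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - h0)
    P = (np.eye(n) - K @ H) @ P
    return x, P

def fixation_angles(cam_pos, target):
    """Pan/tilt angles that point the active camera at a mapped target
    (hypothetical helper; assumes identity head orientation)."""
    rel = target - cam_pos
    pan = np.arctan2(rel[0], rel[2])
    tilt = np.arctan2(rel[1], np.hypot(rel[0], rel[2]))
    return pan, tilt

# Toy run: camera at the origin, one mapped point 2 m ahead.
x = np.array([0., 0., 0.,  0., 0., 0.,  0.1, 0.0, 2.0])  # cam pos, vel, point
P = np.eye(9) * 0.01
x, P = predict(x, P, dt=1 / 30)
x, P = update(x, P, z=project(x[0:3], np.array([0.1, 0.0, 2.0])), feat_idx=0)
print(fixation_angles(x[0:3], x[6:9]))
```

Because every mapped point and the camera pose live in one jointly-estimated state, the same estimate that registers graphics to the scene also supplies the gaze target for the active camera, which is what lets fixation persist through arbitrary wearer motion and shift to objects currently outside the field of view.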