A human-robot interaction system for navigation supervision based on augmented reality
R. Nunez, J. R. Bandera, J. M. Pérez-Lorenzo, Francisco Sandoval
DOI: 10.1109/MELCON.2006.1653133
Published in: MELECON 2006 - 2006 IEEE Mediterranean Electrotechnical Conference, 2006-05-16
Citations: 12
Abstract
This paper proposes an innovative human-robot interaction mechanism that allows users to interact intuitively with an autonomous mobile robot whose localisation problem is solved using a new and fast feature extraction method. To enable this human-robot interaction, we use an augmented reality (AR) display. This mechanism makes it possible to overlay planning, world-model and sensory data provided by the robot on the same field of view. The camera pose in the AR system is determined using this novel feature-based localisation method. Thus, the human user can intuitively help to build a topological map of an unknown environment by setting and manipulating map nodes, and can visualize and correct the robot's path planning.
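The abstract describes two interacting pieces: a feature-based localiser that yields the camera pose, and an AR overlay that re-projects the robot's plan and topological map nodes into the user's field of view. The snippet below is a minimal illustrative sketch of that overlay step only, assuming a pinhole camera model and a pose estimate supplied by some localisation step; the intrinsics, pose values and node names are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): re-project planned path
# nodes, given in world coordinates, into the AR camera image using an
# assumed pinhole model and an assumed camera pose (R, t).
import numpy as np

def project_points(points_world, R, t, K):
    """Project Nx3 world points to pixel coordinates for camera pose (R, t)
    and intrinsics K; points behind the camera are discarded."""
    cam = (R @ points_world.T + t.reshape(3, 1)).T   # world -> camera frame
    cam = cam[cam[:, 2] > 0]                         # keep points in front of the camera
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]                  # perspective divide -> pixels

if __name__ == "__main__":
    # Illustrative topological map: nodes the user could set or move in the AR view.
    nodes = {"n0": np.array([0.0, 0.0, 0.0]),
             "n1": np.array([1.5, 0.0, 0.0]),
             "n2": np.array([1.5, 2.0, 0.0])}
    edges = [("n0", "n1"), ("n1", "n2")]             # planned route n0 -> n2

    K = np.array([[500.0,   0.0, 320.0],             # assumed camera intrinsics
                  [  0.0, 500.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                                    # assumed pose from the localiser
    t = np.array([0.0, -0.5, 3.0])                   # camera 3 m back, 0.5 m above the floor

    path = np.stack([nodes[a] for a, _ in edges] + [nodes[edges[-1][1]]])
    print(project_points(path, R, t, K))             # pixel coordinates to draw on the frame
```

Drawing the returned pixel coordinates onto the live camera frame, and letting the user drag or add nodes there, gives the kind of superimposed path and map display the abstract describes.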