T. Sasai, Yousuke Takahashi, M. Kotani, A. Nakamura
2011 IEEE International Conference on Mechatronics and Automation, August 15, 2011. DOI: 10.1109/ICMA.2011.5985849
Development of a guide robot interacting with the user using information projection — Basic system
This paper presents a guide robot with an information projection interface. The robot detects multiple persons around it and selects the closest one as its user, using an omnidirectional camera and a laser range finder. The robot also carries a projector on a pan-tilt mechanism, so it can project information anywhere in the environment and guide a person. The user inputs commands with simple foot gestures on a dialog box projected onto the floor. Once the destination is decided, the robot guides the user by its own motion and by information projected onto the floor, wall, or ceiling, with the displayed information changing according to the user's position. Experiments verified that the robot could detect a user and guide him or her.
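The user-selection step described above (choosing the closest detected person) can be sketched minimally as follows. This is a hypothetical illustration, not the paper's implementation: person positions `(x, y)` in meters relative to the robot are assumed to come from the omnidirectional camera and laser range finder; the detection itself is not shown.

```python
import math

def closest_person(detections):
    """Return the (x, y) position of the nearest detected person, or None.

    detections: list of (x, y) tuples in meters, robot-centered frame.
    """
    if not detections:
        return None
    # Euclidean distance from the robot at the origin.
    return min(detections, key=lambda p: math.hypot(p[0], p[1]))

people = [(2.0, 1.5), (-0.8, 0.6), (3.2, -2.1)]
print(closest_person(people))  # → (-0.8, 0.6)
```

In practice the paper's system would also need to track the chosen user over time as he or she moves, so the projected information can follow the user's position.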