The Design of an Augmented Reality System for Urban Search and Rescue
Runze Wang, Huimin Lu, Junhao Xiao, Yi Li, Qihang Qiu
2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), August 2018
DOI: 10.1109/IISR.2018.8535823
Citations: 5
Abstract
Most robots in urban search and rescue (USAR) fulfill tasks teleoperated by human operators, such as controlling the robot's movement to avoid obstacles and explore unknown environments. The operator has to know the location of the robot and find the position of the target (victim). This paper presents an augmented reality system using a Kinect sensor on a custom-designed rescue robot. First, Simultaneous Localization and Mapping (SLAM) with an RGB-D camera is run to obtain the position and orientation of the robot. Second, a deep learning method is adopted to detect the location of the target. Finally, we place an AR marker for the target in the global coordinate frame and display it on the operator's screen, indicating the target even when it is outside the camera's field of view. The experimental results show that the proposed system can help human operators interact with the robot.
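The core geometric step the abstract describes — fixing the detected target in the global frame, then re-projecting it into the operator's current view so a marker can be drawn even when the target leaves the field of view — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pinhole intrinsics, the edge-clamping cue for out-of-view targets, and all parameter names are assumptions.

```python
import numpy as np

def marker_on_screen(p_world, R_wc, t_wc, fx, fy, cx, cy, width, height):
    """Project a world-frame target into the current camera view.

    R_wc, t_wc: camera pose in the world frame (e.g. from RGB-D SLAM).
    fx, fy, cx, cy: assumed pinhole intrinsics of the camera.
    Returns (u, v, in_view); when the target is outside the image or
    behind the camera, the indicator is clamped to the nearest screen
    border so the operator still gets a direction cue (our assumption,
    one plausible way to indicate an out-of-view target).
    """
    # World frame -> camera frame: p_cam = R^T (p_world - t).
    p_cam = np.asarray(R_wc, float).T @ (
        np.asarray(p_world, float) - np.asarray(t_wc, float))
    X, Y, Z = p_cam
    if Z > 1e-6:
        # Pinhole projection into pixel coordinates.
        u, v = fx * X / Z + cx, fy * Y / Z + cy
        if 0.0 <= u < width and 0.0 <= v < height:
            return u, v, True  # target visible: draw the AR marker here
        dx, dy = u - cx, v - cy
    else:
        # Target behind the camera: only its lateral direction is usable.
        dx, dy = X, Y
        if dx == 0.0 and dy == 0.0:
            dy = 1.0
    # Out of view: clamp the cue to the screen border along the ray
    # from the image center toward the (projected) target.
    s = min((cx if dx < 0 else width - 1 - cx) / abs(dx) if dx else np.inf,
            (cy if dy < 0 else height - 1 - cy) / abs(dy) if dy else np.inf)
    return cx + s * dx, cy + s * dy, False
```

A target straight ahead projects to the image center and is flagged visible; a target far off to the side yields a border-clamped indicator with `in_view=False`, which the UI can render as a directional arrow.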