Joshua Samuel P. Siy, Robert Keith C. Chan, R. Baldovino
{"title":"全自主城市搜索机器人中基于外观的实时映射实现","authors":"Joshua Samuel P. Siy, Robert Keith C. Chan, R. Baldovino","doi":"10.1109/HNICEM.2018.8666380","DOIUrl":null,"url":null,"abstract":"The paper describes the findings of using a real time 3d mapping program, the real-time appearance-based mapping (RTAB-Map) used for a certain project. Coming from a designed wirelessly controlled mobile robotic platform capable of navigating an unknown environment, creating a 3D reconstruction of said environment, while detecting human sources it could find while performing real time 3d reconstruction of the given environment. The purpose of the project is to assist rescue operatives by providing computer generated model of the environment and allow rescue teams to analyze and learn the general layout of the area without being expendable to following hazards from the unknown environments as well as identifying the presence of humans and damage done and possible effects of hazardous reaction. Giving the team to search and route giving priority to saving as much survivors possible. The project was done by using a mobile platform and an RGBD camera, through autonomous navigation or manual control, to generate a graphical representation of the traversed area of the mobile platform and running a face detection algorithm which could alarm operators by audio cue or sound from the video. Upon examination of the results, it was found that the fabricated platform is, indeed, capable of exploring an unknown area, developing a close approximation of the area using a 3D map, detecting faces scattered throughout the area, all while performing wirelessly, separate from the viewing terminal despite its limitations. 
This research shows the potential that robots possess in assisting rescue operations.","PeriodicalId":426103,"journal":{"name":"2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology,Communication and Control, Environment and Management (HNICEM)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Implementation of a Real-time Appearance-based Mapping in a Fully Autonomous Urban Search Robot\",\"authors\":\"Joshua Samuel P. Siy, Robert Keith C. Chan, R. Baldovino\",\"doi\":\"10.1109/HNICEM.2018.8666380\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The paper describes the findings of using a real time 3d mapping program, the real-time appearance-based mapping (RTAB-Map) used for a certain project. Coming from a designed wirelessly controlled mobile robotic platform capable of navigating an unknown environment, creating a 3D reconstruction of said environment, while detecting human sources it could find while performing real time 3d reconstruction of the given environment. The purpose of the project is to assist rescue operatives by providing computer generated model of the environment and allow rescue teams to analyze and learn the general layout of the area without being expendable to following hazards from the unknown environments as well as identifying the presence of humans and damage done and possible effects of hazardous reaction. Giving the team to search and route giving priority to saving as much survivors possible. The project was done by using a mobile platform and an RGBD camera, through autonomous navigation or manual control, to generate a graphical representation of the traversed area of the mobile platform and running a face detection algorithm which could alarm operators by audio cue or sound from the video. 
Upon examination of the results, it was found that the fabricated platform is, indeed, capable of exploring an unknown area, developing a close approximation of the area using a 3D map, detecting faces scattered throughout the area, all while performing wirelessly, separate from the viewing terminal despite its limitations. This research shows the potential that robots possess in assisting rescue operations.\",\"PeriodicalId\":426103,\"journal\":{\"name\":\"2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology,Communication and Control, Environment and Management (HNICEM)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology,Communication and Control, Environment and Management (HNICEM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HNICEM.2018.8666380\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology,Communication and Control, Environment and Management (HNICEM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HNICEM.2018.8666380","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Implementation of a Real-time Appearance-based Mapping in a Fully Autonomous Urban Search Robot
This paper describes the findings of using a real-time 3D mapping program, Real-Time Appearance-Based Mapping (RTAB-Map), in an urban search robot. The project designed a wirelessly controlled mobile robotic platform capable of navigating an unknown environment, creating a 3D reconstruction of that environment, and detecting any humans it finds while the reconstruction runs. The purpose of the project is to assist rescue operatives by providing a computer-generated model of the environment, allowing rescue teams to analyze and learn the general layout of an area without being exposed to the hazards of an unknown environment, while also identifying the presence of humans, the damage done, and the possible effects of hazardous reactions. This lets a team plan its search routes to prioritize saving as many survivors as possible. The project used a mobile platform with an RGB-D camera, under autonomous navigation or manual control, to generate a graphical representation of the area the platform traversed while running a face detection algorithm that can alert operators with an audio cue from the video feed. Examination of the results showed that the fabricated platform is indeed capable of exploring an unknown area, building a close approximation of the area as a 3D map, and detecting faces scattered throughout the area, all while operating wirelessly, separate from the viewing terminal, despite its limitations. This research shows the potential of robots in assisting rescue operations.
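The abstract mentions that the face detector runs on the video stream and alerts the operator with an audio cue. The paper does not specify how detections are turned into alerts, but one common approach is to require several consecutive frames with a detected face before sounding the alarm, so a single-frame false positive does not distract the operator. The sketch below illustrates that gating logic; the `FaceAlarm` class, its parameter names, and the three-frame threshold are assumptions for illustration, not details taken from the paper, and the per-frame face count would come from whatever detector the platform runs (e.g. a Haar cascade on the RGB-D camera's color stream).

```python
class FaceAlarm:
    """Hypothetical debounce logic for the operator audio alert.

    Feed it the number of faces detected in each video frame; it fires
    exactly once when faces have been seen in `confirm_frames`
    consecutive frames, then stays quiet until the streak resets.
    (Assumed design -- the paper does not describe its alert logic.)
    """

    def __init__(self, confirm_frames: int = 3):
        self.confirm_frames = confirm_frames  # consecutive detections required
        self.streak = 0                       # current run of frames with faces

    def update(self, num_faces: int) -> bool:
        """Process one frame's face count; return True when the alarm
        should sound for the operator."""
        if num_faces > 0:
            self.streak += 1
        else:
            self.streak = 0  # a face-free frame resets the streak
        # Fire only at the moment the streak reaches the threshold,
        # so the alarm does not repeat on every subsequent frame.
        return self.streak == self.confirm_frames
```

In a real pipeline, `update()` would be called once per frame with the length of the detector's result list, and a `True` return would trigger the audio cue on the viewing terminal.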