"Optimizing autonomous surveillance route solutions from minimal human-robot interaction"
Christopher M. Reardon, Fei Han, Hao Zhang, Jonathan R. Fink
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088165
Abstract: Resource-constrained surveillance tasks represent a promising domain for autonomous robotic systems in a variety of real-world applications. In particular, we consider tasks where the system must maximize the probability of detecting a target while traversing an environment subject to resource constraints that make full coverage infeasible. To perform well, accurate knowledge of the underlying distribution of the surveillance targets is essential, but this is typically not available to robots. To successfully address surveillance route planning in human-robot teams, the design and optimization of human-robot interaction is critical, since the human often possesses essential knowledge of the mission, environment, or other agents. In this paper, we introduce a new approach named Human-robot Autonomous Route Planning (HARP) that explores the space of surveillance solutions to maximize task performance using information provided through interactions with humans. Experimental results show that, with minimal interaction, we can leverage human knowledge to create more successful surveillance routes under resource constraints.
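The budgeted-coverage problem this abstract describes can be illustrated with a toy sketch. This is not the HARP optimizer itself (the paper's algorithm is not specified in the abstract); the function names and the greedy probability-per-cost rule below are illustrative assumptions.

```python
def greedy_route(detection_prob, travel_cost, budget, start):
    """Greedily build a surveillance route: repeatedly visit the
    reachable cell with the best detection-probability-per-cost
    ratio until the travel budget is exhausted.
    Assumes strictly positive travel costs between distinct cells."""
    route = [start]
    remaining = budget
    unvisited = set(detection_prob) - {start}
    while unvisited:
        best, best_ratio = None, 0.0
        for cell in unvisited:
            cost = travel_cost(route[-1], cell)
            if cost <= remaining and detection_prob[cell] / cost > best_ratio:
                best, best_ratio = cell, detection_prob[cell] / cost
        if best is None:
            break  # no affordable cell left
        remaining -= travel_cost(route[-1], best)
        route.append(best)
        unvisited.remove(best)
    return route
```

Human input of the kind the paper leverages would, in this toy model, amount to reshaping `detection_prob` before planning.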
{"title":"Robotic teleoperation: Mediated and supported by virtual testbeds","authors":"T. Cichon, J. Roßmann","doi":"10.1109/SSRR.2017.8088140","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088140","url":null,"abstract":"In disaster scenarios or search and rescue applications an efficient, constructive and safe mission needs more than just robotic hardware and control. One major element is the interface between operator and robot. In this contribution, we present the approach of VTB-mediated teleoperation. This comprises (i) using standardized interfaces to integrate existing knowledge and libraries of the robotics community, (ii) the modular integration of necessary functionalities into the VTB, (iii) data handling and visualization in 3D simulation, (iv) direct control and feedback possibilities of real and virtual robotic systems, and (v) an overall modularity in terms of input and output modalities, internal and external libraries, as well as real and virtual data. This results in a holistic system of operator, VTB, and robotic system for training, support, prediction, and analysis before, after, and also during the mission.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114900094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual servoing for teleoperation using a tethered UAV","authors":"Xuesu Xiao, J. Dufek, R. Murphy","doi":"10.1109/SSRR.2017.8088155","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088155","url":null,"abstract":"This paper presents a visual servoing approach for robotic teleoperation using a tethered unmanned aerial vehicle (UAV). When teleoperating a robot, human operator's perception of the remote situation is limited by the robot's onboard camera. This deteriorates situational awareness and poses challenges on operation precision and efficiency. Tele- operated visual assistants are used in practice. For example, in Fukushima Daiichi nuclear disaster decommissioning, a secondary ground robot is used to follow and watch the primary robot. However, this requires two robots and 2-4 operators to perform one task. Furthermore, it introduces more problems, such as extra teamwork demand, miscommunication risk, suboptimal viewpoints. This work proposes to use a tethered UAV to replace the extra ground robot and human operators. In order to visually assist the primary robot autonomously, a visual servoing algorithm is developed and implemented based on a fiducial marker mounted on the primary robot, representing the operator's point of interest. Visual servoing configuration is controlled using 6 Degrees of Freedom of the fiducial. Servoing performances from physical experiments are analyzed. 
This paper lays the groundwork for and points out the direction of further visual assisting research.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"202 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133671360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
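As a minimal illustration of marker-based servoing (not the authors' controller; the pose convention, gain, and function name below are assumptions), a proportional law can map the observed fiducial pose error to a velocity command:

```python
import numpy as np

def servo_command(fiducial_pose, desired_pose, gain=0.5):
    """Proportional visual-servoing law: command a velocity that drives
    the observed fiducial pose toward the desired viewpoint.
    Poses are (x, y, z, yaw) of the marker in the camera frame."""
    error = np.asarray(desired_pose, float) - np.asarray(fiducial_pose, float)
    # wrap the yaw error to [-pi, pi] so the UAV turns the short way
    error[3] = (error[3] + np.pi) % (2 * np.pi) - np.pi
    return gain * error
```

A full 6-DoF version would add roll and pitch of the fiducial; the structure is the same.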
"Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments"
Paul Fritsche, B. Zeise, Patrick Hemme, Bernardo Wagner
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088146
Abstract: Nowadays, mobile robots are widely used to support fire brigades in search and rescue missions. The utilization of these robots, especially under low visibility conditions due to smoke, fog, or dust, is limited: under these circumstances, environmental perception is still a major challenge. In this work, we present an approach to using LiDAR, radar, and thermal imaging to detect hazards that are potentially harmful to the robot or to firefighters. We show the benefits of fusing LiDAR and radar before projecting temperatures recorded with a thermal imaging camera onto the range scans. Additionally, a hotspot detection method using the tempered range scans is presented. We demonstrate the functionality of our approach by teleoperating a robot through a smoky room.
"Reliable real-time change detection and mapping for 3D LiDARs"
Lorenz Wellhausen, Renaud Dubé, A. Gawel, R. Siegwart, César Cadena
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088144
Abstract: A common scenario in search and rescue robotics is to map and patrol a disaster site to assess the situation and plan potential missions of rescue teams. Particular importance has to be given to changes in the environment, as these may correspond to critical events such as building collapses or movement of objects. This paper presents a change detection pipeline for LiDAR-equipped robots to assist humans in detecting those changes. The local 3D point cloud data is compared to an octree-based occupancy map representation of the environment by computing the Mahalanobis distance to the closest voxel in the map. The thresholded distance is processed by a clustering algorithm to obtain a set of change candidates. Finally, outliers in these sets are filtered using a random forest classifier. Changes are continuously mapped during a sortie based on their classification score and number of occurrences, and are reported in real time during robot operation.
{"title":"A review on cybersecurity vulnerabilities for unmanned aerial vehicles","authors":"C. Krishna, R. Murphy","doi":"10.1109/SSRR.2017.8088163","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088163","url":null,"abstract":"This paper surveys the scientific and trade literature on cybersecurity for unmanned aerial vehicles (UAV), concentrating on actual and simulated attacks, and the implications for small UAVs. The review is motivated by the increasing use of small UAVs for inspecting critical infrastructures such as the electric utility transmission and distribution grid, which could be a target for terrorism. The paper presents a modified taxonomy to organize cyber attacks on UAVs and exploiting threats by Attack Vector and Target. It shows that, by Attack Vector, there has been one physical attack and ten remote attacks. By Target, there have been six attacks on GPS (two jamming, four spoofing), two attacks on the control communications stream (a deauthentication attack and a zero-day vulnerabilities attack), and two attacks on data communications stream (two intercepting the data feed, zero executing a video replay attack). The paper also divides and discusses the findings by large or small UAVs, over or under 25 kg, but concentrates on small UAVs. The survey concludes that UAV-related research to counter cybersecurity threats focuses on GPS Jamming and Spoofing, but ignores attacks on the controls and data communications stream. 
The gap in research on attacks on the data communications stream is concerning, as an operator can see a UAV flying off course due to a control stream attack but has no way of detecting a video replay attack (substitution of a video feed).","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127670444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"3D registration of aerial and ground robots for disaster response: An evaluation of features, descriptors, and transformation estimation"
A. Gawel, Renaud Dubé, H. Surmann, Juan I. Nieto, R. Siegwart, César Cadena
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088136
Abstract: Global registration of heterogeneous ground and aerial mapping data is a challenging task. This is especially difficult in disaster response scenarios, where we have no prior information on the environment and cannot assume the regular order of man-made environments or meaningful semantic cues. In this work, we extensively evaluate different approaches to globally registering UGV-generated 3D point cloud data from LiDAR sensors with UAV-generated point cloud maps from vision sensors. The approaches are realizations of different selections for: (a) local features: key-points or segments; (b) descriptors: FPFH, SHOT, or ESF; and (c) transformation estimation: RANSAC or FGR. Additionally, we compare the results against standard approaches such as applying ICP after a good prior transformation has been given. The evaluation criteria include the distance a UGV needs to travel to successfully localize, the registration error, and the computational cost. In this context, we report our findings on effectively performing the task on two new search and rescue datasets. Our results can help the community make informed decisions when registering point cloud maps from ground robots to those from aerial robots.
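The transformation-estimation step that both RANSAC-based registration and ICP repeat has a closed form once correspondences are fixed. A sketch of that inner step (standard Kabsch/SVD alignment, not the paper's full pipeline):

```python
import numpy as np

def rigid_transform(src, dst):
    """Closed-form least-squares rigid transform aligning corresponding
    3D points src -> dst (Kabsch algorithm). Returns (R, t) such that
    dst ~= src @ R.T + t. RANSAC repeats this on sampled descriptor
    matches; ICP repeats it on nearest-neighbour matches."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    # guard against a reflection solution (det = -1)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

The evaluated methods differ mainly in how they propose the correspondences fed into this step, not in the step itself.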
"A decentralized multi-agent unmanned aerial system to search, pick up, and relocate objects"
Rik Bähnemann, Dominik Schindler, Mina Kamel, R. Siegwart, Juan I. Nieto
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088150
Abstract: We present a fully integrated autonomous multi-robot aerial system for finding and collecting moving and static objects with unknown locations. This task addresses multiple relevant problems in search and rescue (SAR) robotics, such as multi-agent aerial exploration, object detection and tracking, and aerial gripping. The community usually tackles these problems individually, but integration into a working system generates extra complexity that is rarely addressed. We show that this task can be solved reliably using only simple components. Our decentralized system uses accurate global state estimation, reactive collision avoidance, and sweep planning for multi-agent exploration. Objects are detected, tracked, and picked up using blob detection, inverse 3D projection, Kalman filtering, visual servoing, and a magnetic gripper. We evaluate the individual components of our system on the real platform. The full system has been deployed successfully in various public demonstrations, field tests, and the Mohamed Bin Zayed International Robotics Challenge 2017 (MBZIRC), where it performed reliably and reached second place out of 17 contestants in the individual challenge.
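The Kalman-filtering component mentioned above is a textbook building block; a one-dimensional constant-velocity sketch (the filter dimensions, time step, and noise values are illustrative assumptions, not the authors' tuning):

```python
import numpy as np

def kf_step(x, P, z, dt=0.1, q=1e-2, r=1e-1):
    """One predict+update step of a constant-velocity Kalman filter
    tracking a 1D object position; state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
    H = np.array([[1.0, 0.0]])             # we only measure position
    Q = q * np.eye(2)                      # process noise
    R = np.array([[r]])                    # measurement noise
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with measurement z
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

In the full system, a 3D version of this filter would smooth the blob detections and predict where a moving object will be at grasp time.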
"SLAM auto-complete: Completing a robot map using an emergency map"
Malcolm Mielle, Martin Magnusson, Henrik Andreasson, A. Lilienthal
2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). DOI: 10.1109/SSRR.2017.8088137
Abstract: In search and rescue missions, time is an important factor; fast navigation and quickly acquiring situation awareness can be matters of life and death. Hence, the use of robots in such scenarios has been restricted by the time needed to explore and build a map. One way to speed up exploration and mapping is to reason about unknown parts of the environment using prior information. While previous research on using external priors for robot mapping has mainly focused on accurate maps or aerial images, such data are not always available, especially indoors. We focus on emergency maps as priors for robot mapping, since they are easy to obtain and already extensively used by firefighters in rescue missions. However, those maps can be outdated, information might be missing, and the scales of rooms are typically not consistent. We have developed a formulation of graph-based SLAM that incorporates information from an emergency map. The graph SLAM is optimized using a combination of robust kernels, fusing the emergency map and the robot map into one map even when faced with scale inaccuracies and inexact start poses. We typically have more than 50% wrong correspondences in the settings studied in this paper, and the proposed method handles them correctly. Experiments in an office environment show that we can handle up to 70% wrong correspondences and still obtain the expected result. The robot can navigate and explore while taking into account places it has not yet seen. We demonstrate this in a test scenario and also show that the emergency map is enhanced by adding information not previously represented, such as closed doors or new walls.
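The robust-kernel idea can be illustrated in one dimension: Huber-weighted least squares down-weights wrong correspondences instead of letting them dominate the estimate. This is only a scalar illustration under assumed names and noise levels; the paper's formulation optimizes a full pose graph with a combination of kernels.

```python
import numpy as np

def huber_weight(residual, delta=1.0):
    """Huber kernel weight for iteratively reweighted least squares:
    weight 1 (quadratic loss) for small residuals, linear influence
    beyond delta, which tames wrong map-to-scan correspondences."""
    a = abs(residual)
    return 1.0 if a <= delta else delta / a

def robust_offset(measurements, iters=10, delta=1.0):
    """Estimate a single offset from measurements, some of which are
    wrong correspondences, via Huber-weighted least squares."""
    est = float(np.mean(measurements))
    for _ in range(iters):
        w = np.array([huber_weight(m - est, delta) for m in measurements])
        est = float(np.sum(w * np.asarray(measurements)) / np.sum(w))
    return est
```

With 30% gross outliers the plain mean is ruined while the Huber estimate stays near the inlier cluster; handling the paper's reported 70% wrong correspondences additionally relies on the graph structure and kernel combination.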