{"title":"Visibility-based UAV path planning for surveillance in cluttered environments","authors":"Vengatesan Govindaraju, G. Leng, Qian Zhang","doi":"10.1109/SSRR.2014.7017660","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017660","url":null,"abstract":"This paper focusses on the problem of determining near-optimal observation locations for an effective close-range UAV surveillance in terrains cluttered with buildings and trees. Use of Small-Unmanned Aerial Vehicles (S-UAVs) in civil defence applications has increased due to their portability and low operational costs. In close-range S-UAV surveillance in cluttered environments, there are two significant occlusions to visibility: complete (terrain) and partial (vegetation). However, in the existing literatures, the partial occlusions are generally neglected. In this paper, a probabilistic visibility model is proposed which considers both complete and partial occlusions to determine near-optimal surveillance path to enhance visibility of the desired regions on the ground using a two-step approach. In the first step, the waypoints are deployed in regions which provide near-uniform visibility of the desired target regions. This step involves finding the visibility space (region of space from which the desired target regions are visible) using the Fast Marching Method (FMM) and then deploying the waypoints in this region using Centroidal Voronoi tessellation (CVT). In the second step, flyable paths are constructed along the waypoints using an improved clustered spiral-alternating algorithm. Visibility with the proposed method is simulated for a synthetically generated terrain that resembles a residential area with buildings and trees. 
The results show the effectiveness of the proposed surveillance method in improving the visibility of the desired target regions.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127269782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development and calibration of KaRoLa, a compact, high-resolution 3D laser scanner","authors":"L. Pfotzer, Jan Oberländer, A. Rönnau, R. Dillmann","doi":"10.1109/SSRR.2014.7017677","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017677","url":null,"abstract":"We present KaRoLa, a new rotating 3D laser scanner with a modular and flexible hardware design and an integrated control software stack implemented in the ROS framework. Based on our requirements - light-weight and compact hardware, high resolution and accuracy - we compare different 2D laser range finders which are commercially available. We describe the hardware design, including the mechanical and electrical components, and the included software stack in detail. Furthermore, we present a particle swarm based calibration method to compensate mounting offsets between the 2D laser scanner and the rotational axis. The calibration significantly improves the overall accuracy and lowers the requirements for the mounting precision. Field studies for evaluating KaRoLa in real-world application scenarios such as planetary exploration and search and rescue missions complete this article.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127881334","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A new fault tolerance method for field robotics through a self-adaptation architecture","authors":"Yanzhe Cui, Joshua T. Lane, R. Voyles, Akshay Krishnamoorthy","doi":"10.1109/SSRR.2014.7017646","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017646","url":null,"abstract":"Fault tolerance is increasingly important for urban search and rescue robotic systems because any failure mode may affect the reliability of the robot in meeting mission objectives. To support convenient development of fault tolerant robotic systems, this paper proposes ReFrESH, which is a self-adaptive framework that provides systemic self-diagnosis and self-maintenance mechanisms in the presence of unanticipated situations. Specifically, ReFrESH augments the port-based object by attaching performance evaluation and estimation methods to each functional component so that the robot can easily detect and locate faults. In conjunction, a task level decision mechanism interacts with these fault detection elements in order to dynamically find an optimal solution to mitigate faults. A demonstrative application of ReFrESH illustrates its applicability through a task for visual servoing to a target deployed on a multi-robot system.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115841782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Aerial manipulator aimed for door opening mission","authors":"H. Tsukagoshi, T. Hamada, M. Watanabe, Ryuma Iizuka, Dameitry Ashlih","doi":"10.1109/SSRR.2014.7017685","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017685","url":null,"abstract":"This paper describes an aerial robot with a manipulator to realize door opening mission. Although general aerial robots have advantages of flying in the three-dimensional space freely, they don't have any capability of moving to another room when the door is closed. To overcome this problem, we propose a new configuration of an aerial manipulator with perching function, knob-twisting function, and door-pushing function. With regard to knob-twisting function, the design concept of a light-weight manipulator generating large enough force is introduced, which is composed of an airbag actuator with variable restriction to perform the arbitrary curved trajectory. On the other hand, the door pushing force is aimed to be generated by the lift of the propeller, which is helpful to avoid gaining the additional weight. The validity of the proposed methods is experimentally verified by using the developed prototype.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133648247","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Directed Exploration for Goal Oriented Navigation in unknown environments","authors":"P. Senarathne, Danwei W. Wang","doi":"10.1109/SSRR.2014.7017657","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017657","url":null,"abstract":"Autonomously navigating a robot in an unknown environment filled with obstacles to a goal location using a directed exploration strategy is presented. The exploration is directed towards the goal by iteratively selecting intermediate target points in the environment that reduces the distance estimate to the goal from robot's current position, until the goal is reached. A repetitive rechecking approach is also introduced to quickly detect and recover from dead-ends present in the environment. The proposed method alleviates the need to have access to the grid cell representing the goal, at the start of the navigation mission, as required by many dynamic grid path planning algorithms such as D*. In addition, the proposed method inherently supports any mapping application that outputs a grid of mapped information unlike many dynamic planning methods that require additional changes to support particle filter based mapping systems due to map orientation changes. 
Simulation experiments conducted in multiple environment conditions reveal that the proposed method is comparable to the optimal performance of dynamic path planning algorithms while requiring lesser number of decision steps to reach the goal.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120960954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Training and Support system in the Cloud for improving the situational awareness in Search and Rescue (SAR) operations","authors":"M. Pelka, K. Majek, J. Będkowski, P. Musialik, A. Maslowski, G. D. Cubber, Haris Balta, A. Coelho, R. Gonçalves, R. Baptista, Jose M. Sanchez, S. Govindaraj","doi":"10.1109/SSRR.2014.7017644","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017644","url":null,"abstract":"In this paper, a Training and Support system for Search and Rescue operations is described. The system is a component of the ICARUS project (http://www.fp7-icarus.eu) which has a goal to develop sensor, robotic and communication technologies for Human Search And Rescue teams. The support system for planning and managing complex SAR operations is implemented as a command and control component that integrates different sources of spatial information, such as maps of the affected area, satellite images and sensor data coming from the unmanned robots, in order to provide a situation snapshot to the rescue team who will make the necessary decisions. Support issues will include planning of frequency resources needed for given areas, prediction of coverage conditions, location of fixed communication relays, etc. The training system is developed for the ICARUS operators controlling UGVs (Unmanned Ground Vehicles), UAVs (Unmanned Aerial Vehicles) and USVs (Unmanned Surface Vehicles) from a unified Remote Control Station (RC2). The Training and Support system is implemented in SaaS model (Software as a Service). Therefore, its functionality is available over the Ethernet. SAR ICARUS teams from different countries can be trained simultaneously on a shared virtual stage. In this paper we will show the multi-robot 3D mapping component (aerial vehicle and ground vehicles). We will demonstrate that these 3D maps can be used for Training purpose. 
Finally we demonstrate current approach for ICARUS Urban SAR (USAR) and Marine SAR (MSAR) operation training.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126999926","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Teleoperation of mobile robots using hybrid communication system in unreliable radio communication environments","authors":"Ryohei Tsuzuki, Genki Yamauchi, K. Nagatani, Kazuya Yoshida","doi":"10.1109/SSRR.2014.7017649","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017649","url":null,"abstract":"When an active volcano erupts, it is important to have visual images of the area to be able to forecast debris floods and/or pyroclastic flows. However, restricted zones are usually established within a radius of a few kilometers of the crater because of the direct danger to humans. Therefore, we propose an observation system based on a teleoperated mobile robot that is controlled using radio communication during volcanic activity. To evaluate the system, we conducted field tests using a 3G cellular phone inside certain volcanoes. During the experiments, we faced several dangerous situations where the robot stopped all motion because of the weakness of the 3G signal. To solve this problem, we developed a hybrid communication system with multiple robots that employs two radio communication links. In the proposed system, each robot is controlled via 3G communication signals. However, if any of the robots lose the 3G link, the control signal is relayed by another neighboring robot using a local communication link. 
In this paper, we explain the system, introduce our newly designed robots, and present results of our operation tests.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"89 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134274459","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"UAS deployment and data processing during the Balkans flooding","authors":"G. D. Cubber, Haris Balta, D. Doroftei, Y. Baudoin","doi":"10.1109/SSRR.2014.7017670","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017670","url":null,"abstract":"This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkan in spring 2014. Un Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122807326","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time moving objects tracking for mobile-robots using motion information","authors":"M. A. Mohamed, Christoph Böddeker, B. Mertsching","doi":"10.1109/SSRR.2014.7017674","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017674","url":null,"abstract":"The perceptional of the motion of objects is a key problem for a mobile robot to perform tasks in a dynamic environment. Thus, we present a real-time approach for tracking multiple moving objects. The proposed algorithm initially detects moving regions and a dense optical flow technique is exclusively applied to those regions between two consecutive frames. Afterwards, the moving objects in each region are determined based on the planar parallax motion by assuming that independently moving objects undergo pure translation. For subsequent frames, the detected moving objects are tracked based on the orientation of the flow fields, while the new position is updated. In turn, the new detected objects are modeled during a tracking period. The proposed algorithm has been tested with various scenarios and the experimental results demonstrate that the proposed algorithm works properly. It can be shown that there is a significant reduction in the overall processing time for detecting and tracking multiple moving objects in a scene.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128842455","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The HCUAV project: Electronics and software development for medium altitude remote sensing","authors":"A. Amanatiadis, E. Karakasis, Loukas Bampis, Themistoklis Giitsidis, P. Panagiotou, G. Sirakoulis, A. Gasteratos, P. Tsalides, A. Goulas, K. Yakinthos","doi":"10.1109/SSRR.2014.7017668","DOIUrl":"https://doi.org/10.1109/SSRR.2014.7017668","url":null,"abstract":"The continuous increase of illegal migration flows to southern European countries has been recently in the spotlight of European Union due to numerous deadly incidents. Another common issue that the aforementioned countries share is the Mediterranean wildfires which are becoming more frequent due to the warming climate and increasing magnitudes of droughts. Different ground early warning systems have been funded and developed across these countries separately for these incidents, however they have been proved insufficient mainly because of the limited surveyed areas and challenging Mediterranean shoreline and landscape. In 2011, the Greek Government along with European Commission, decided to support the development of the first Hellenic Civil Unmanned Aerial Vehicle (HCUAV), which will provide solutions to both illegal migration and wildfires. This paper presents the challenges in the electronics and software design, and especially the under development solutions for detection of human and fire activity, image mosaicking and orthorectification using commercial off-the-shelf sensors. 
Preliminary experimental results of the HCUAV medium altitude remote sensing algorithms, show accurate and adequate results using low cost sensors and electronic devices.","PeriodicalId":267630,"journal":{"name":"2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)","volume":"146 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122603379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}