{"title":"Fusion of stereo and Lidar data for dense depth map computation","authors":"Hugo Courtois, N. Aouf","doi":"10.1109/RED-UAS.2017.8101664","DOIUrl":"https://doi.org/10.1109/RED-UAS.2017.8101664","url":null,"abstract":"Creating a map is a necessity in many robotic applications, and depth maps are a way to estimate the position of other objects or obstacles. In this paper, an algorithm to compute depth maps is proposed. It operates by fusing information from two types of sensors: a stereo camera and a LIDAR scanner. The strategy is to reliably estimate the disparities of a sparse set of points, then use a bilateral filter to interpolate the missing disparities. Finally, the interpolation is refined. Our method is tested on the KITTI dataset and compared against several other methods which fuse these modalities, or are extended to perform this fusion. These tests show that our method is competitive with other fusion methods.","PeriodicalId":299104,"journal":{"name":"2017 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130793991","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Keeping a moving target within the field of view of a Drone's onboard camera via stochastic estimation","authors":"Beatriz Hernandez-Hernandez, J. Martínez-Carranza, J. Rangel-Magdaleno","doi":"10.1109/RED-UAS.2017.8101659","DOIUrl":"https://doi.org/10.1109/RED-UAS.2017.8101659","url":null,"abstract":"The use of drones in areas such as cinema, sports, social events, and even video selfies has been increasing due to their flexibility to capture video in scenarios where there is an interest in keeping a target within the field of view of the drone's onboard camera. To remove dependence on the pilot that controls the drone, in this work we present an autonomous flight control system whose goal is to keep a target within the field of view of the drone's onboard camera. Images captured by the onboard camera are combined with a stochastic estimator of the target states, based on the Unscented Kalman Filter, to generate control commands so that the drone performs the recording autonomously. The system was validated in real-time tests involving different targets moving along different trajectories, and it was compared against human pilots. Our approach kept the target within the field of view with a 96.6% success rate, compared to 83.3% for human pilots. The latter indicates that our approach has the potential to be used in applications where autonomous drones record aerial video, with a special interest in keeping a target within the field of view of the drone's camera.","PeriodicalId":299104,"journal":{"name":"2017 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130858987","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Contingency path planning for hybrid-electric UAS","authors":"A. Hovenburg, Fabio Augusto de Alcantara Andrade, Christopher Dahlin Rodin, T. Johansen, R. Storvold","doi":"10.1109/RED-UAS.2017.8101640","DOIUrl":"https://doi.org/10.1109/RED-UAS.2017.8101640","url":null,"abstract":"This article presents a path planning optimization method which aims to mitigate the risks in the event of a critical engine or generator failure in hybrid-electric UAS. This is achieved through continuous determination of the optimum flight path, based on the remaining battery range and expected local wind conditions. The result is a dynamically adjusting flight path which ensures that the aircraft remains within range of pre-specified safe landing spots. The developed algorithm uses the particle swarm optimization technique to optimize the flight path, and incorporates regional wind information in order to increase the accuracy of the expected in-flight performance of the aircraft.","PeriodicalId":299104,"journal":{"name":"2017 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130938273","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Efficient real-time video stabilization for UAVs using only IMU data","authors":"M. Odelga, N. Kochanek, H. Bülthoff","doi":"10.1109/RED-UAS.2017.8101668","DOIUrl":"https://doi.org/10.1109/RED-UAS.2017.8101668","url":null,"abstract":"While some unmanned aerial vehicles (UAVs) have the capacity to carry mechanically stabilized camera equipment, weight limits or other constraints may make mechanical stabilization impractical. As a result, many UAVs rely on fixed cameras to provide a video stream to an operator or observer. With a fixed camera, the video stream is often unsteady due to the multirotor's movement from wind and acceleration. These video streams are often analyzed by both humans and machines, and the unwanted camera movement can cause problems for both: for a human observer, it may simply make the video harder to follow, while for computer algorithms, it may severely impair the algorithm's intended function. There has been significant research on stabilizing video using feature tracking to determine camera movement, which in turn is used to manipulate frames and stabilize the camera stream. We believe, however, that this process can be greatly simplified by using data from a UAV's on-board inertial measurement unit (IMU) to stabilize the camera feed. In this paper we present an algorithm for video stabilization based only on IMU data from a UAV platform. Our results show that our algorithm successfully stabilizes the camera stream with the added benefit of requiring less computational power.","PeriodicalId":299104,"journal":{"name":"2017 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126357659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}