Title: Heterogeneous sensor fusion based omnidirectional object detection
Authors: Hyunjee Ryu, Inhwan Wee, Taeyeon Kim, D. Shim
Venue: 2020 20th International Conference on Control, Automation and Systems (ICCAS), pp. 924-927
Published: 2020-10-13
DOI: 10.23919/ICCAS50221.2020.9268431
Citations: 4
Abstract
The importance of object recognition for aerial vehicles has grown in recent years, and many studies have been conducted on it. For Urban Aerial Mobility (UAM), it is important to recognize other vehicles, such as drones and birds, and to avoid collisions in flight. In this paper, two sensors are fused for object detection: a camera, which is lightweight and low-power, and a lidar sensor, which offers high near-field reliability and provides the position of an object. Radar is typically used to recognize objects on airplanes, but it is replaced here by lidar for research on a drone platform. By exploiting the complementary features of the two sensors, the recognition rate for objects at both short and long range is increased, and reliability is improved through sensor redundancy. In addition, system optimization enabled real-time operation on an embedded board. Existing vehicles recognize other vehicles using radar and communication; the sensor fusion presented in this paper, however, increases the object recognition rate in stand-alone situations.
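The abstract does not detail the fusion scheme, but a common way to combine a camera detector with lidar is to project the lidar points into the image plane and attach a range estimate to each 2D detection. The sketch below illustrates that idea only; the extrinsic transform, intrinsics, and bounding-box format are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project 3D lidar points (N, 3) into the image plane.

    T_cam_lidar: 4x4 extrinsic transform from lidar to camera frame (assumed known).
    K: 3x3 camera intrinsic matrix.
    Returns pixel coordinates (N, 2) and camera-frame depths (N,).
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # points in camera frame
    depths = pts_cam[:, 2]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective divide
    return uv, depths

def fuse_detection(bbox, uv, depths):
    """Median depth of lidar points falling inside a camera bounding box.

    bbox: (x1, y1, x2, y2) from a hypothetical image-space detector.
    Returns None when no lidar return supports the detection.
    """
    x1, y1, x2, y2 = bbox
    mask = ((uv[:, 0] >= x1) & (uv[:, 0] <= x2) &
            (uv[:, 1] >= y1) & (uv[:, 1] <= y2) &
            (depths > 0))                        # keep only points in front of the camera
    if not mask.any():
        return None
    return float(np.median(depths[mask]))
```

In this toy setup a detection confirmed by lidar returns carries a range estimate, while a camera-only detection (`None` range) can still be kept at long distances where the lidar is sparse, which matches the short-range/long-range complementarity the abstract describes.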