Pawarut Karaked, Watcharapol Saengphet, S. Tantrairatn
Title: Multi-Sensor Fusion with Extended Kalman Filter for Indoor Localization System of Multirotor UAV
Published in: 2022 19th International Joint Conference on Computer Science and Software Engineering (JCSSE), 2022-06-22
DOI: 10.1109/jcsse54890.2022.9836275
Citations: 0
Abstract
This research presents a method to improve the robustness of indoor UAV localization by fusing visual SLAM and Lidar SLAM with an Extended Kalman Filter (EKF). The two SLAM methods compensate for each other's pose errors: visual SLAM degrades under poor lighting, while Lidar SLAM degrades on reflective surfaces. In the experiment, a Lidar unit and a stereo camera, each running its own SLAM method, are mounted on the drone. Once started, both SLAM pipelines localize the vehicle and provide position and orientation data, which the EKF fuses into a single updated estimate. Consequently, if either SLAM method produces an error, the system continues to work properly. In the tests, the drone was flown in various situations designed to induce errors in each SLAM method, and the results show that the estimate obtained from the EKF remains consistent in all of them.
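The fusion scheme described above can be sketched in simplified form. The snippet below is an illustrative reduction, not the paper's implementation: the state is limited to 3-D position under a constant-position motion model, which makes the filter linear (the EKF Jacobians reduce to identity matrices), and each SLAM source is treated as a direct position measurement. All noise covariances and measurement values are made-up examples; in the actual system the state would also carry orientation and the models would be nonlinear.

```python
import numpy as np

def predict(x, P, Q):
    # Constant-position motion model: state unchanged, uncertainty grows by Q.
    return x, P + Q

def update(x, P, z, R):
    # Measurement model H = I: each SLAM source reports position directly.
    S = P + R                     # innovation covariance
    K = P @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - x)           # corrected state
    P = (np.eye(len(x)) - K) @ P  # corrected covariance
    return x, P

# Initial belief and noise parameters (illustrative values).
x = np.zeros(3)
P = np.eye(3)
Q = 0.01 * np.eye(3)
R_visual = 0.05 * np.eye(3)   # visual SLAM: less reliable in poor lighting
R_lidar = 0.02 * np.eye(3)    # Lidar SLAM: less reliable near reflections

# One fusion cycle: predict, then sequentially update with each source.
z_visual = np.array([1.02, 0.98, 0.50])
z_lidar = np.array([1.00, 1.01, 0.49])

x, P = predict(x, P, Q)
x, P = update(x, P, z_visual, R_visual)
x, P = update(x, P, z_lidar, R_lidar)
```

Because each source is applied as a separate update with its own covariance, a temporarily degraded source simply contributes less to the fused estimate, which matches the paper's claim that the system keeps working when one SLAM method fails.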