A Robust and Adaptive Sensor Fusion Approach for Indoor UAV Localization
S. Sajjadi, Jeremy Bittick, F. Janabi-Sharifi, I. Mantegh
2023 International Conference on Unmanned Aircraft Systems (ICUAS), published 2023-06-06
DOI: 10.1109/ICUAS57906.2023.10156526
Citations: 0
Abstract
Localization of uncrewed systems in indoor environments is challenging. The fundamental difficulty with indoor localization and navigation is that the Global Navigation Satellite Systems (GNSS) signal is either unavailable or not sufficiently accurate for state estimation. Uncrewed agents also commonly have to navigate through unstructured environments, which can be difficult given the absence of recognizable landmarks or patterns. Furthermore, in dynamic environments where the layout or obstacles may change frequently, the drone may need to continuously update its state estimate. In the absence of GNSS measurements, uncrewed systems rely on other onboard sensors for localization. However, each set of sensors carries its own uncertainty and is subject to occlusion or malfunction. Hence, the design and development of reliable multi-sensor fusion algorithms for localization are deemed necessary. This paper presents the implementation and performance evaluation of an adaptive and robust Moving Horizon Estimator (MHE) for improving the state estimation of a previously developed indoor localization framework using ArUco markers. The effectiveness of the proposed sensor fusion algorithm is evaluated using an experimental setup in comparison to the high-accuracy Vicon® motion tracking camera system.
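To illustrate the core idea behind moving horizon estimation, the sketch below is a minimal, hypothetical example and not the paper's implementation: it estimates a scalar random-walk state from noisy position measurements over a sliding window. The MHE cost combines an arrival cost on the first state in the window, measurement residuals weighted by 1/R, and process residuals weighted by 1/Q; for this linear-Gaussian case the minimizer satisfies a tridiagonal linear system, solved here with the Thomas algorithm. All names, noise values, and weights are illustrative assumptions.

```python
def mhe_scalar(y, x_prior, P, Q, R):
    """Moving-horizon estimate for a scalar random-walk state.

    Minimizes over the window of measurements y[0..n-1]:
        (x_0 - x_prior)^2 / P                 arrival cost
      + sum_k (y_k - x_k)^2 / R               measurement residuals
      + sum_k (x_{k+1} - x_k)^2 / Q           process residuals
    Setting the gradient to zero yields a tridiagonal linear
    system, solved with the Thomas algorithm (pure Python).
    """
    n = len(y)
    qi, ri, pi = 1.0 / Q, 1.0 / R, 1.0 / P

    # Tridiagonal system: a[k]*x[k-1] + b[k]*x[k] + c[k]*x[k+1] = d[k]
    b = [ri + 2.0 * qi] * n          # interior diagonal entries
    b[0] = pi + ri + qi              # first state also sees the prior
    b[-1] = ri + qi                  # last state has one process link
    if n == 1:
        b[0] = pi + ri               # single-sample window: no process term
    a = [-qi] * n                    # sub-diagonal (a[0] unused)
    c = [-qi] * n                    # super-diagonal (c[-1] unused)
    d = [ri * yk for yk in y]
    d[0] += pi * x_prior

    # Thomas algorithm: forward elimination ...
    for k in range(1, n):
        m = a[k] / b[k - 1]
        b[k] -= m * c[k - 1]
        d[k] -= m * d[k - 1]
    # ... then back substitution.
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for k in range(n - 2, -1, -1):
        x[k] = (d[k] - c[k] * x[k + 1]) / b[k]
    return x


if __name__ == "__main__":
    # Synthetic window: true state is 5.0, measurements are noisy.
    y = [5.4, 4.7, 5.2, 4.9, 5.3]
    estimates = mhe_scalar(y, x_prior=5.0, P=1.0, Q=0.05, R=0.1)
    print(estimates)  # smoothed, much less scattered than y
```

With a small process weight Q the estimator trusts the random-walk model and heavily smooths the measurements; increasing Q lets the estimate track the raw data more closely. The paper's estimator operates on a full multi-sensor UAV state and adds robustness and adaptation on top of this basic windowed least-squares structure.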