Collaborative Visual-Inertial Localization of Teams With Floorplan Extraction
Sándor Gazdag, Dániel Pasztornicky, Zsolt Jankó, T. Szirányi, A. Majdik
2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW), 2023-06-04
DOI: 10.1109/ICASSPW59220.2023.10192967
Citations: 0
Abstract
This paper showcases a real-world system that achieves collaborative localization and mapping of multiple agents within a building. The system processes the odometry and 3D point-cloud data collected by the agents moving around the building to automatically generate the building's floorplan, on which the agent trajectories are overlaid. The wearable hardware consists of a low-cost, passive, integrated sensor combining a camera and an IMU (Inertial Measurement Unit), together with an embedded compute unit. The system's capabilities are demonstrated through real-world experiments.
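The abstract does not detail how the floorplan is derived from the agents' 3D point clouds. A common approach for this kind of task is to project the points onto a 2D occupancy grid and keep cells with enough hits; the sketch below illustrates that idea only. The function name, parameters, and thresholds are illustrative assumptions, not taken from the paper (a real pipeline would also filter points by height and fuse data across agents).

```python
import numpy as np

def floorplan_from_points(points, cell_size=0.1, min_hits=3):
    """Project a 3D point cloud (N x 3 array, metres) onto a 2D occupancy
    grid. Cells receiving at least `min_hits` points are marked occupied.
    Returns the boolean grid and the (x, y) world origin of cell (0, 0).
    NOTE: illustrative sketch, not the method from the paper."""
    xy = points[:, :2]                     # drop the z coordinate
    origin = xy.min(axis=0)                # anchor grid at the min corner
    idx = np.floor((xy - origin) / cell_size).astype(int)
    shape = idx.max(axis=0) + 1
    counts = np.zeros(shape, dtype=int)
    # np.add.at accumulates correctly even when indices repeat
    np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)
    return counts >= min_hits, origin
```

Agent trajectories (from the visual-inertial odometry) could then be overlaid by converting each pose's (x, y) position into grid coordinates with the same origin and cell size.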