{"title":"User-Aided Global Registration Method using Geospatial 3D Data for Large-Scale Mobile Outdoor Augmented Reality","authors":"S. Burkard, F. Fuchs-Kittowski","doi":"10.1109/ISMAR-Adjunct51615.2020.00041","DOIUrl":null,"url":null,"abstract":"Accurate global camera registration is a key requirement for precise AR visualizations in large-scale outdoor AR applications. Existing approaches mostly use complex image-based registration methods requiring large pre-registered databases of geo-referenced images or point clouds that are hardly applicable to large-scale areas. In this paper, we present a simple yet effective user-aided registration method that utilizes common geospatial 3D data to globally register mobile devices. For this purpose, text-based 3D geospatial data including digital 3D terrain and city models is processed into small-scale 3D meshes and displayed in a live AR view. Via two common mobile touch gestures the generated virtual models can be aligned manually to match the actual perception of the real-world environment. Experimental results show that - combined with a robust local visual-inertial tracking system - this approach enables an efficient and accurate global registration of mobile devices in various environments determining the camera attitude with less than one degree deviation while achieving a high degree of immersion through realistic occlusion behavior.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"140 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00041","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Accurate global camera registration is a key requirement for precise AR visualizations in large-scale outdoor AR applications. Existing approaches mostly rely on complex image-based registration methods that require large pre-registered databases of geo-referenced images or point clouds, which are hardly applicable to large-scale areas. In this paper, we present a simple yet effective user-aided registration method that utilizes common geospatial 3D data to globally register mobile devices. For this purpose, text-based 3D geospatial data, including digital 3D terrain and city models, is processed into small-scale 3D meshes and displayed in a live AR view. Using two common mobile touch gestures, the generated virtual models can be aligned manually to match the actual perception of the real-world environment. Experimental results show that, combined with a robust local visual-inertial tracking system, this approach enables efficient and accurate global registration of mobile devices in various environments, determining the camera attitude with less than one degree of deviation while achieving a high degree of immersion through realistic occlusion behavior.
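The abstract does not specify which gestures map to which pose parameters, so the following is a minimal, self-contained Kotlin sketch of the general idea only, not the authors' implementation: a coarse attitude from the compass/IMU is refined by user gestures that rotate the rendered geospatial mesh until it visually matches the camera image. The gesture-to-parameter mapping (horizontal drag corrects heading, two-finger vertical drag corrects pitch) and the `degPerPixel` sensitivity constant are assumptions for illustration.

```kotlin
// Sketch of user-aided attitude refinement (assumed gesture mapping,
// not the paper's exact method). The refined attitude would then be
// handed to a local visual-inertial tracker to keep it up to date.
data class Attitude(var yawDeg: Double, var pitchDeg: Double, var rollDeg: Double)

class UserAidedRegistration(
    private val coarse: Attitude,          // initial attitude from compass/IMU
    private val degPerPixel: Double = 0.05 // hypothetical gesture sensitivity
) {
    private val correction = Attitude(0.0, 0.0, 0.0)

    // Assumed gesture 1: one-finger horizontal drag rotates the virtual
    // model about the vertical axis, i.e. corrects the heading (yaw).
    fun onHorizontalDrag(deltaXPixels: Float) {
        correction.yawDeg += deltaXPixels * degPerPixel
    }

    // Assumed gesture 2: two-finger vertical drag tilts the model,
    // correcting the pitch estimate.
    fun onTwoFingerVerticalDrag(deltaYPixels: Float) {
        correction.pitchDeg += deltaYPixels * degPerPixel
    }

    // Refined global attitude = coarse sensor attitude + user correction.
    fun refinedAttitude() = Attitude(
        normalizeDeg(coarse.yawDeg + correction.yawDeg),
        coarse.pitchDeg + correction.pitchDeg,
        coarse.rollDeg
    )

    private fun normalizeDeg(deg: Double): Double {
        var d = deg % 360.0
        if (d < 0) d += 360.0
        return d
    }
}

fun main() {
    val reg = UserAidedRegistration(Attitude(yawDeg = 178.0, pitchDeg = -2.0, rollDeg = 0.0))
    reg.onHorizontalDrag(40f)          // user drags the mesh 40 px to the right
    reg.onTwoFingerVerticalDrag(-10f)  // slight downward tilt correction
    println(reg.refinedAttitude())     // refined global attitude estimate
}
```

In this reading, the user-aided step replaces the large geo-referenced image or point-cloud databases of image-based methods: the one-off manual alignment supplies the global correction, and the visual-inertial tracker maintains it afterwards.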