Cross-View Visual Geo-Localization for Outdoor Augmented Reality

Niluthpol Chowdhury Mithun, Kshitij Minhas, Han-Pang Chiu, T. Oskiper, Mikhail Sizintsev, S. Samarasekera, Rakesh Kumar
{"title":"Cross-View Visual Geo-Localization for Outdoor Augmented Reality","authors":"Niluthpol Chowdhury Mithun, Kshitij Minhas, Han-Pang Chiu, T. Oskiper, Mikhail Sizintsev, S. Samarasekera, Rakesh Kumar","doi":"10.1109/VR55154.2023.00064","DOIUrl":null,"url":null,"abstract":"Precise estimation of global orientation and location is critical to ensure a compelling outdoor Augmented Reality (AR) experience. We address the problem of geo-pose estimation by cross-view matching of query ground images to a geo-referenced aerial satellite image database. Recently, neural network-based methods have shown state-of-the-art performance in cross-view matching. However, most of the prior works focus only on location estimation, ignoring orientation, which cannot meet the requirements in outdoor AR applications. We propose a new transformer neural network-based model and a modified triplet ranking loss for joint location and orientation estimation. Experiments on several benchmark cross-view geo-localization datasets show that our model achieves state-of-the-art performance. Furthermore, we present an approach to extend the single image query-based geo-localization approach by utilizing temporal information from a navigation pipeline for robust continuous geo-localization. Experimentation on several large-scale real-world video sequences demonstrates that our approach enables high-precision and stable AR insertion.","PeriodicalId":346767,"journal":{"name":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR55154.2023.00064","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Precise estimation of global orientation and location is critical to a compelling outdoor Augmented Reality (AR) experience. We address geo-pose estimation by cross-view matching of query ground images against a geo-referenced aerial satellite image database. Recently, neural network-based methods have shown state-of-the-art performance in cross-view matching. However, most prior work focuses only on location estimation and ignores orientation, which does not meet the requirements of outdoor AR applications. We propose a new transformer-based model and a modified triplet ranking loss for joint location and orientation estimation. Experiments on several benchmark cross-view geo-localization datasets show that our model achieves state-of-the-art performance. Furthermore, we extend this single-image-query geo-localization method with temporal information from a navigation pipeline for robust continuous geo-localization. Experiments on several large-scale real-world video sequences demonstrate that our approach enables high-precision and stable AR insertion.
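To make the joint objective concrete, below is a minimal sketch of how a triplet ranking loss over cross-view embeddings can be combined with an orientation term. This is an illustrative assumption, not the paper's exact formulation: the function name `joint_geo_pose_loss`, the weight `lam`, and the choice of direct heading regression are all hypothetical.

```python
import torch
import torch.nn.functional as F

def joint_geo_pose_loss(q, pos, neg, ori_pred, ori_gt,
                        margin=0.2, lam=1.0):
    """Hypothetical sketch of a joint location + orientation loss.

    q, pos, neg : (B, D) embeddings of the ground query, the matching
                  satellite patch, and a non-matching satellite patch.
    ori_pred    : (B,) predicted heading in radians.
    ori_gt      : (B,) ground-truth heading in radians.
    """
    # L2-normalize so the dot product is cosine similarity.
    q, pos, neg = (F.normalize(t, dim=1) for t in (q, pos, neg))
    d_pos = 1.0 - (q * pos).sum(dim=1)  # cosine distance to positive
    d_neg = 1.0 - (q * neg).sum(dim=1)  # cosine distance to negative
    # Standard margin-based triplet ranking term for retrieval.
    ranking = F.relu(margin + d_pos - d_neg).mean()
    # Wrap-aware angular error: 359 deg vs 1 deg counts as 2 deg, not 358.
    ang_err = torch.atan2(torch.sin(ori_pred - ori_gt),
                          torch.cos(ori_pred - ori_gt)).abs().mean()
    return ranking + lam * ang_err
```

In this sketch the orientation head regresses heading directly; the paper's actual design may differ (for example, discretized orientation bins or an orientation-aware modification inside the ranking term itself).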