Wenwu Ou , Qingwu Hu , Mingyao Ai , Pengcheng Zhao , Shunli Wang , Xujie Zhang , Shuowen Huang
{"title":"一种基于360度全景视觉特征地图的实时图像检索与定位方法","authors":"Wenwu Ou , Qingwu Hu , Mingyao Ai , Pengcheng Zhao , Shunli Wang , Xujie Zhang , Shuowen Huang","doi":"10.1016/j.isprsjprs.2025.08.018","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate and reliable localization in large indoor environments without satellite signals remains a significant challenge. In recent years, visual localization has emerged as a popular indoor localization method. Its core idea is to pre-built a 3D sparse feature map database and estimate the 6-DoF pose of query images for precise localization. This technology holds great potential for applications such as augmented reality (AR) and AR navigation in large indoor scenes. However, the presence of weak textures and repetitive textures poses substantial challenges to the pre-built feature map database and image retrieval, severely affecting the accuracy and robustness of localization. In this paper, we propose a real-time image retrieval and localization method based on a 360-degree panoramic visual feature global map. The proposed method consists of three main components: 360° panoramic sparse feature map construction (PGFC); an image retrieval strategy based on point cloud overlap (PCO-IR); visual localization method enhanced by PCO-IR. Extensive experiments demonstrate that our approach surpasses both state-of-the-art research methods and commercial software (e.g., COLMAP, Metashape) in weak-texture and repetitive-texture regions. Across three distinct indoor scenarios, the PCO-IR enhancement yields significant accuracy gains: after optimization, PixLoc and HLOC achieve localization success rates of 95% and 97%, respectively, with mean pose errors reduced to 72% and 37% of their original values. The code for our proposed method can be found at <span><span>https://github.com/ouwenwu/pco_ir</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"229 ","pages":"Pages 351-365"},"PeriodicalIF":12.2000,"publicationDate":"2025-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A real-time image retrieval and localization method based on 360-degree panoramic visual feature maps\",\"authors\":\"Wenwu Ou , Qingwu Hu , Mingyao Ai , Pengcheng Zhao , Shunli Wang , Xujie Zhang , Shuowen Huang\",\"doi\":\"10.1016/j.isprsjprs.2025.08.018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurate and reliable localization in large indoor environments without satellite signals remains a significant challenge. In recent years, visual localization has emerged as a popular indoor localization method. Its core idea is to pre-built a 3D sparse feature map database and estimate the 6-DoF pose of query images for precise localization. This technology holds great potential for applications such as augmented reality (AR) and AR navigation in large indoor scenes. However, the presence of weak textures and repetitive textures poses substantial challenges to the pre-built feature map database and image retrieval, severely affecting the accuracy and robustness of localization. In this paper, we propose a real-time image retrieval and localization method based on a 360-degree panoramic visual feature global map. 
The proposed method consists of three main components: 360° panoramic sparse feature map construction (PGFC); an image retrieval strategy based on point cloud overlap (PCO-IR); visual localization method enhanced by PCO-IR. Extensive experiments demonstrate that our approach surpasses both state-of-the-art research methods and commercial software (e.g., COLMAP, Metashape) in weak-texture and repetitive-texture regions. Across three distinct indoor scenarios, the PCO-IR enhancement yields significant accuracy gains: after optimization, PixLoc and HLOC achieve localization success rates of 95% and 97%, respectively, with mean pose errors reduced to 72% and 37% of their original values. The code for our proposed method can be found at <span><span>https://github.com/ouwenwu/pco_ir</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"229 \",\"pages\":\"Pages 351-365\"},\"PeriodicalIF\":12.2000,\"publicationDate\":\"2025-09-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0924271625003296\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625003296","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
A real-time image retrieval and localization method based on 360-degree panoramic visual feature maps
Accurate and reliable localization in large indoor environments without satellite signals remains a significant challenge. In recent years, visual localization has emerged as a popular indoor localization method. Its core idea is to pre-build a 3D sparse feature map database and estimate the 6-DoF pose of query images for precise localization. This technology holds great potential for applications such as augmented reality (AR) and AR navigation in large indoor scenes. However, weak and repetitive textures pose substantial challenges to both the pre-built feature map database and image retrieval, severely affecting the accuracy and robustness of localization. In this paper, we propose a real-time image retrieval and localization method based on a 360-degree panoramic visual feature global map. The proposed method consists of three main components: 360° panoramic sparse feature map construction (PGFC); an image retrieval strategy based on point cloud overlap (PCO-IR); and a visual localization method enhanced by PCO-IR. Extensive experiments demonstrate that our approach surpasses both state-of-the-art research methods and commercial software (e.g., COLMAP, Metashape) in weak-texture and repetitive-texture regions. Across three distinct indoor scenarios, the PCO-IR enhancement yields significant accuracy gains: after optimization, PixLoc and HLOC achieve localization success rates of 95% and 97%, respectively, with mean pose errors reduced to 72% and 37% of their original values. The code for our proposed method can be found at https://github.com/ouwenwu/pco_ir.
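The abstract describes PCO-IR only at a high level. Purely to illustrate the general idea of re-ranking retrieval candidates by the 3D map points they share, the following minimal Python sketch first ranks database keyframes by global-descriptor similarity and then keeps only candidates whose observed map points overlap those of the top-ranked candidate. The function names, the IoU overlap measure, the greedy filtering strategy, and the thresholds are all assumptions made for illustration; they are not taken from the paper or its repository.

```python
# Hypothetical sketch of retrieval re-ranking by point-cloud overlap.
# Not the authors' PCO-IR implementation; names and thresholds are assumptions.
import numpy as np


def global_retrieval(query_desc: np.ndarray,
                     db_descs: np.ndarray,
                     top_k: int = 20) -> np.ndarray:
    """Rank database keyframes by cosine similarity of global descriptors."""
    sims = db_descs @ query_desc / (
        np.linalg.norm(db_descs, axis=1) * np.linalg.norm(query_desc) + 1e-12)
    return np.argsort(-sims)[:top_k]


def point_cloud_overlap(points_a: set[int], points_b: set[int]) -> float:
    """IoU of the 3D map-point IDs observed by two keyframes."""
    if not points_a or not points_b:
        return 0.0
    return len(points_a & points_b) / len(points_a | points_b)


def rerank_by_overlap(candidates: np.ndarray,
                      kf_points: list[set[int]],
                      min_overlap: float = 0.2) -> list[int]:
    """Keep candidates whose observed map points overlap the best-ranked
    candidate, so the shortlist covers one spatially consistent region."""
    if len(candidates) == 0:
        return []
    anchor = int(candidates[0])
    kept = [anchor]
    for idx in candidates[1:]:
        if point_cloud_overlap(kf_points[anchor], kf_points[int(idx)]) >= min_overlap:
            kept.append(int(idx))
    return kept
```

In this sketch, a query would supply its global descriptor together with the per-keyframe sets of observed 3D point IDs from the pre-built map; the surviving shortlist would then be passed to local feature matching and pose estimation, which is where the reported accuracy gains for PixLoc and HLOC would be measured.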
About the journal:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in photogrammetry, remote sensing, spatial information systems, computer vision, and related fields, facilitating communication and dissemination of advances in these disciplines while also serving as a comprehensive reference source and archive.
P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.