Yong Tang, Jingyi Zhang, Jianhua Gong, Yi Li, Banghui Yang
{"title":"基于地图匹配网络的城市级航空地理定位","authors":"Yong Tang , Jingyi Zhang , Jianhua Gong , Yi Li , Banghui Yang","doi":"10.1016/j.isprsjprs.2025.08.002","DOIUrl":null,"url":null,"abstract":"<div><div>Autonomous localization of aircraft relies on precise geo-localization, and under Global Navigation Satellite System (GNSS)-denied conditions, visual localization methods are among the most important techniques for aircraft autonomous localization. Global visual localization typically relies on pre-established 3D maps, which require significant storage and computational overhead, limiting the applicability of aerial visual localization. Therefore, we propose a visual localization method based on OpenStreetMap, an openly accessible 2D map. This method not only enables localization in the absence of GNSS but also has lower storage and computational requirements compared to 3D map-based visual methods. This makes our approach feasible for visual geo-localization at the urban scale. We designed a neural network model based on the Vision Transformer (ViT) to extract features from aerial images and 2D maps for fast matching and retrieval, thereby estimating the global geo-location of the aerial images. Additionally, we employ particle filtering to fuse location estimates from map retrieval, visual odometry, and GNSS, achieving higher-precision real-time localization. Moreover, we collected aerial images and map tiles covering over 1000 square kilometers from the urban and suburban areas of four large cities, creating a novel aerial image-to-map matching dataset. Experiments show that, compared to the current state-of-the-art methods, our map retrieval network achieves a higher average recall rate on the dataset. 
In GNSS-denied conditions, our multi-source fusion localization method can achieve real-time global geo-localization at the urban scale, and under weak GNSS signals, our method provides significantly higher localization accuracy than GNSS alone.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"229 ","pages":"Pages 65-77"},"PeriodicalIF":12.2000,"publicationDate":"2025-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"City-level aerial geo-localization based on map matching network\",\"authors\":\"Yong Tang , Jingyi Zhang , Jianhua Gong , Yi Li , Banghui Yang\",\"doi\":\"10.1016/j.isprsjprs.2025.08.002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Autonomous localization of aircraft relies on precise geo-localization, and under Global Navigation Satellite System (GNSS)-denied conditions, visual localization methods are among the most important techniques for aircraft autonomous localization. Global visual localization typically relies on pre-established 3D maps, which require significant storage and computational overhead, limiting the applicability of aerial visual localization. Therefore, we propose a visual localization method based on OpenStreetMap, an openly accessible 2D map. This method not only enables localization in the absence of GNSS but also has lower storage and computational requirements compared to 3D map-based visual methods. This makes our approach feasible for visual geo-localization at the urban scale. We designed a neural network model based on the Vision Transformer (ViT) to extract features from aerial images and 2D maps for fast matching and retrieval, thereby estimating the global geo-location of the aerial images. 
Additionally, we employ particle filtering to fuse location estimates from map retrieval, visual odometry, and GNSS, achieving higher-precision real-time localization. Moreover, we collected aerial images and map tiles covering over 1000 square kilometers from the urban and suburban areas of four large cities, creating a novel aerial image-to-map matching dataset. Experiments show that, compared to the current state-of-the-art methods, our map retrieval network achieves a higher average recall rate on the dataset. In GNSS-denied conditions, our multi-source fusion localization method can achieve real-time global geo-localization at the urban scale, and under weak GNSS signals, our method provides significantly higher localization accuracy than GNSS alone.</div></div>\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"229 \",\"pages\":\"Pages 65-77\"},\"PeriodicalIF\":12.2000,\"publicationDate\":\"2025-08-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0924271625003144\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote 
Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625003144","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
City-level aerial geo-localization based on map matching network
Autonomous localization of aircraft relies on precise geo-localization, and under Global Navigation Satellite System (GNSS)-denied conditions, visual localization methods are among the most important techniques for aircraft autonomous localization. Global visual localization typically relies on pre-established 3D maps, which require significant storage and computational overhead, limiting the applicability of aerial visual localization. Therefore, we propose a visual localization method based on OpenStreetMap, an openly accessible 2D map. This method not only enables localization in the absence of GNSS but also has lower storage and computational requirements compared to 3D map-based visual methods. This makes our approach feasible for visual geo-localization at the urban scale. We designed a neural network model based on the Vision Transformer (ViT) to extract features from aerial images and 2D maps for fast matching and retrieval, thereby estimating the global geo-location of the aerial images. Additionally, we employ particle filtering to fuse location estimates from map retrieval, visual odometry, and GNSS, achieving higher-precision real-time localization. Moreover, we collected aerial images and map tiles covering over 1000 square kilometers from the urban and suburban areas of four large cities, creating a novel aerial image-to-map matching dataset. Experiments show that, compared to the current state-of-the-art methods, our map retrieval network achieves a higher average recall rate on the dataset. In GNSS-denied conditions, our multi-source fusion localization method can achieve real-time global geo-localization at the urban scale, and under weak GNSS signals, our method provides significantly higher localization accuracy than GNSS alone.
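The paper's retrieval step matches a ViT embedding of the aerial image against embeddings of pre-computed map tiles. The sketch below illustrates only the retrieval stage with random vectors standing in for real network embeddings; the encoder itself, the embedding dimension, and the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Unit-normalize embeddings so a dot product equals cosine similarity.
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def retrieve_top_k(query_emb, tile_embs, k=5):
    """Return indices of the k map-tile embeddings most similar
    (by cosine similarity) to the aerial-image query embedding."""
    q = l2_normalize(query_emb)
    t = l2_normalize(tile_embs)
    sims = t @ q                      # (N,) cosine similarities
    return np.argsort(-sims)[:k], sims

# Toy stand-in for network output: 100 random 32-d "tile" embeddings.
# The query is a lightly perturbed copy of tile 42, so tile 42 should
# rank first in the retrieval result.
rng = np.random.default_rng(0)
tiles = rng.normal(size=(100, 32))
query = tiles[42] + 0.05 * rng.normal(size=32)
top_idx, sims = retrieve_top_k(query, tiles, k=5)
print(top_idx[0])  # → 42
```

In a real system the tile embeddings would be computed offline for the whole coverage area, which is what keeps the storage footprint far below that of a 3D map.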
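The fusion step uses a particle filter over the aircraft's position, with visual odometry driving the prediction and map retrieval or GNSS supplying absolute measurements. A minimal 2D sketch of one predict-update-resample cycle, assuming Gaussian motion and measurement noise (the noise parameters and toy trajectory are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, odom_delta, meas,
                         odom_sigma=1.0, meas_sigma=5.0):
    """One predict-update-resample cycle.
    particles:  (N, 2) candidate positions in metres.
    odom_delta: (2,) displacement reported by visual odometry.
    meas:       (2,) absolute fix from map retrieval or GNSS."""
    # Predict: shift every particle by the odometry delta plus motion noise.
    particles = particles + odom_delta + rng.normal(0, odom_sigma, particles.shape)
    # Update: weight particles by the Gaussian likelihood of the absolute fix.
    d2 = np.sum((particles - meas) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / meas_sigma ** 2)
    w /= w.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy run: the aircraft moves 10 m east per step; absolute fixes are noisy.
particles = rng.uniform(-50, 50, (500, 2))
true_pos = np.zeros(2)
for _ in range(20):
    true_pos = true_pos + np.array([10.0, 0.0])
    meas = true_pos + rng.normal(0, 5.0, 2)
    particles = particle_filter_step(particles, np.array([10.0, 0.0]), meas)
est = particles.mean(axis=0)  # fused position estimate near true_pos
```

The same structure lets the measurement source be swapped per step: a GNSS fix when the signal is usable, a map-retrieval fix when it is denied, which is how the multi-source fusion degrades gracefully.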
Journal overview:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in the disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advances in these disciplines, while also serving as a comprehensive archive and source of reference.
P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.