Multi-source geo-localization in urban built environments for crowd-sourced images by contrastive learning

IF 12.2 | Tier 1, Earth Science | Q1 GEOGRAPHY, PHYSICAL
Qianbao Hou, Ce Hou, Fan Zhang, Qihao Weng
DOI: 10.1016/j.isprsjprs.2025.09.024
ISPRS Journal of Photogrammetry and Remote Sensing, Volume 230, Pages 616-629
Published: 2025-10-09
Citations: 0

Abstract

Crowd-sourced images (CSIs) offer an unprecedented opportunity for gaining deeper insights into urban built environments. However, the lack of precise geographic information limits their effectiveness in various urban applications. Traditional geo-localization methods, which rely on matching CSIs with geo-tagged street-view images (SVIs), face significant challenges due to sparse coverage and temporal misalignment of reference data, especially in developing countries. To overcome these limitations, this paper proposes a novel contrastive learning framework that integrates SVIs and satellite images (SIs), utilizing a multi-scale channel attention module and InfoNCE loss to enhance the geo-localization accuracy of CSIs. Additionally, we leverage SIs to generate synthetic SVIs in areas where actual SVIs are unavailable or outdated, ensuring comprehensive coverage across diverse urban environments. A simple yet efficient data preprocessing method is proposed to align multi-view images for enhanced feature fusion. As part of our research efforts, we introduce a Multi-Source Geo-localization Dataset (MSGD) consisting of 500k geo-tagged pairs collected from 12 cities across six continents, encompassing diverse urban typologies from dense skyscraper districts to low-density areas, providing a valuable resource for future research and advancements in geo-localization methods. Our experiments show that the proposed method outperforms state-of-the-art approaches on the challenging MSGD dataset, highlighting the importance of incorporating SIs as a complementary data source for accurate geo-localization. Our code and dataset will be released at https://github.com/RCAIG/CrowdsourcingGeoLocalization.
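The framework's contrastive objective is the InfoNCE loss, which pulls each CSI embedding toward its matched street-view/satellite reference embedding while pushing it away from the other references in the batch. A minimal NumPy sketch of this objective follows; the function name, the temperature value, and the in-batch-negatives setup are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def info_nce_loss(query_emb, ref_emb, temperature=0.07):
    """InfoNCE loss for a batch of matched (query, reference) embeddings.

    query_emb, ref_emb: (N, D) arrays. Row i of each forms the positive
    pair; every other row in the batch acts as an in-batch negative.
    """
    # L2-normalize so the dot product becomes cosine similarity
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    r = ref_emb / np.linalg.norm(ref_emb, axis=1, keepdims=True)
    logits = q @ r.T / temperature  # (N, N) similarity matrix

    # Cross-entropy where the target class for row i is column i
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Lowering the temperature sharpens the softmax over similarities, penalizing hard negatives more strongly; correctly matched pairs drive the loss toward zero, while misaligned pairs keep it high.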
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing
Engineering & Technology – Imaging Science & Photographic Technology
CiteScore: 21.00
Self-citation rate: 6.30%
Articles per year: 273
Review time: 40 days
Aims & Scope: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive. P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields. In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.