Global urban high-resolution scene classification via uncertainty-aware domain generalization
Jingjun Yi, Yanfei Zhong, Yu Su, Ruiyi Yang, Yinhe Liu, Junjue Wang
ISPRS Journal of Photogrammetry and Remote Sensing, Volume 230, Pages 92-108. DOI: 10.1016/j.isprsjprs.2025.08.027
Abstract
Global urban scene classification is a crucial technology for global land use mapping and holds significant importance for advancing urban intelligence. When applying datasets constructed from urban scenes on a global scale, two serious problems arise. First, due to cultural, economic, and other factors, style differences exist across scenes from different cities, posing challenges for model generalization. Second, urban scene samples often follow a long-tailed distribution, which complicates the identification of tail categories with small sample volumes and impairs performance under domain generalization settings. To tackle these problems, the Uncertainty-aware Domain Generalization urban scene classification (UADG) framework is constructed. To mitigate city-related style differences among global cities, a city-related whitening is proposed: whitening operations are used to separate city-unrelated content features while adaptively preserving city-related information hidden in the style features, rather than directly removing style information, thereby aiding more robust representations. To address the marked accuracy decline in tail classes during domain generalization, estimated uncertainty is used to guide a mixture of experts, and reasonable expert assignment is conducted for hard samples to balance the model bias. To evaluate the proposed UADG framework under a practical scenario, the Domain Generalized Urban Scene (DGUS) dataset is curated for validation, with a training set comprising 42 classes of samples from 34 provincial capitals in China and test samples selected from representative cities across six continents. Extensive experiments demonstrate that our method achieves state-of-the-art performance, notably outperforming the baseline GAMMA by 9.79% and 7.42% in average OA and AA, respectively, on the unseen domains of DGUS. UADG greatly enhances the automation of global urban land use mapping.
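The abstract describes two mechanisms: style normalization of features via whitening (while retaining the removed style statistics rather than discarding them) and an uncertainty-guided weighting of expert heads. The sketch below is a minimal, hedged illustration of these general techniques in PyTorch; the function names, tensor shapes, and the entropy-based gating heuristic are illustrative assumptions and do not reproduce the authors' UADG implementation.

```python
# Illustrative sketch only (assumed shapes and heuristics), not the UADG code.
import torch


def instance_whiten(feat: torch.Tensor, eps: float = 1e-5):
    """Whiten a feature map per sample and return the removed style statistics.

    feat: (B, C, H, W) convolutional features.
    Returns style-normalized features plus the per-channel mean/std ("style"),
    which a style-aware module could re-inject instead of discarding.
    """
    b, c, h, w = feat.shape
    x = feat.view(b, c, -1)                       # (B, C, H*W)
    mu = x.mean(dim=-1, keepdim=True)             # per-channel mean (style)
    std = x.std(dim=-1, keepdim=True) + eps       # per-channel std  (style)
    whitened = ((x - mu) / std).view(b, c, h, w)  # content-like features
    return whitened, mu.squeeze(-1), std.squeeze(-1)


def uncertainty_weighted_experts(expert_logits: torch.Tensor) -> torch.Tensor:
    """Blend expert predictions, relying on more experts for uncertain samples.

    expert_logits: (B, E, K) logits from E expert heads over K classes.
    Heuristic: samples whose averaged prediction has high entropy get a
    flatter gate over experts; confident samples keep a sharper gate.
    """
    probs = expert_logits.softmax(dim=-1)                               # (B, E, K)
    mean_probs = probs.mean(dim=1)                                      # (B, K)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-8).log()).sum(-1)  # (B,)
    norm_entropy = entropy / torch.log(torch.tensor(float(expert_logits.shape[-1])))
    expert_conf = probs.max(dim=-1).values                              # (B, E)
    temperature = 1.0 + 4.0 * norm_entropy.unsqueeze(1)                 # (B, 1)
    gate = (expert_conf / temperature).softmax(dim=1)                   # (B, E)
    return (gate.unsqueeze(-1) * probs).sum(dim=1)                      # (B, K)


if __name__ == "__main__":
    feats = torch.randn(2, 16, 8, 8)
    whitened, mu, std = instance_whiten(feats)
    logits = torch.randn(2, 4, 42)  # 4 hypothetical experts, 42 classes as in DGUS
    fused = uncertainty_weighted_experts(logits)
    print(whitened.shape, mu.shape, fused.shape)
```

In this toy version, the style statistics returned by instance_whiten stand in for the "city-related information" the paper preserves, and the entropy-scaled gate stands in for uncertainty-guided expert assignment; the actual UADG modules are specified in the paper itself.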
Journal Introduction
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive.
P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.