Unlabeled data assisted domain adaptation for cross-scene image classification

IF 2.8 · JCR Q2 (Astronomy & Astrophysics) · CAS Tier 3 (Earth Science)
Advances in Space Research, Vol. 77, No. 6, pp. 6747-6759
Pub Date: 2026-03-15 (Epub: 2026-01-13) · DOI: 10.1016/j.asr.2026.01.028
Shuyue Wang, Jiawei Niu, Mohammed Bennamoun
Citations: 0

Abstract

Domain adaptation (DA) is crucial in cross-scene image classification, enabling models to generalize across domains with varying data distributions. Existing approaches rely on abundant and diverse labeled source data to learn discriminative and transferable features for cross-domain alignment. However, such labeled data are often expensive and limited in remote sensing applications. In contrast, abundant task-relevant unlabeled data are more accessible but remain underutilized, despite containing domain-specific feature distributions that can enhance feature learning. To address this gap, we propose an Unlabeled data Assisted Domain Adaptation (UADA) framework for cross-scene image classification. UADA incorporates task-relevant unlabeled data as an auxiliary source alongside labeled source data to enrich feature diversity and improve the model’s adaptability to the target domain. Specifically, we introduce a progressive pseudo-label optimization strategy that iteratively refines pseudo-labels for unlabeled data through confidence-aware self-labeling. We then employ weight-shared feature extractors to jointly encode labeled and unlabeled source data, enabling the model to learn a unified feature space that captures diverse semantic representations for robust feature alignment. Finally, we construct domain-specific classifiers for each source and adaptively fuse their predictions, effectively harnessing complementary semantic cues for robust target classification. Extensive experiments across multiple tasks show that UADA outperforms existing methods. The code will be released at https://github.com/Morrie0804/UADA.git upon acceptance.
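The confidence-aware self-labeling step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold schedule (`tau0`, `step`, `tau_max`) and the fusion weight `w` are illustrative assumptions, since the abstract does not give concrete values.

```python
import numpy as np

def select_pseudo_labels(probs, round_idx, tau0=0.6, step=0.1, tau_max=0.9):
    """Confidence-aware self-labeling: keep only unlabeled samples whose
    top-class probability exceeds a threshold that tightens each round.
    probs: (n_samples, n_classes) softmax outputs on the unlabeled data."""
    tau = min(tau0 + step * round_idx, tau_max)  # progressive threshold
    conf = probs.max(axis=1)                     # per-sample confidence
    labels = probs.argmax(axis=1)                # tentative pseudo-labels
    keep = conf >= tau                           # confidence mask
    return labels[keep], keep, tau

def fuse_predictions(p_labeled_src, p_unlabeled_src, w=0.5):
    """Adaptive fusion of the two domain-specific classifiers' outputs;
    here a simple convex combination stands in for the learned weighting."""
    return w * p_labeled_src + (1.0 - w) * p_unlabeled_src

probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],
                  [0.20, 0.80]])
labels, keep, tau = select_pseudo_labels(probs, round_idx=0)
# At round 0 (tau = 0.6) the low-confidence middle sample is rejected;
# later rounds raise tau, so only increasingly confident samples survive.
```

In later rounds the tightening threshold drops borderline samples, which is the intended effect of the progressive refinement: early rounds admit more (noisier) pseudo-labels, later rounds keep only high-confidence ones.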
Source journal
Advances in Space Research (Geosciences & Astronomy - Earth Science, comprehensive)
CiteScore: 5.20
Self-citation rate: 11.50%
Annual articles: 800
Review time: 5.8 months
Journal description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.