{"title":"Unlabeled data assisted domain adaptation for cross-scene image classification","authors":"Shuyue Wang , Jiawei Niu , Mohammed Bennamoun","doi":"10.1016/j.asr.2026.01.028","DOIUrl":null,"url":null,"abstract":"<div><div>Domain adaptation (DA) is crucial in cross-scene image classification, enabling models to generalize across domains with varying data distributions. Existing approaches rely on abundant and diverse labeled source data to learn discriminative and transferable features for cross-domain alignment. However, such labeled data are often expensive and limited in remote sensing applications. In contrast, abundant task-relevant unlabeled data are more accessible but remain underutilized, despite containing domain-specific feature distributions that can enhance feature learning. To address this gap, we propose an Unlabeled data Assisted Domain Adaptation (UADA) framework for cross-scene image classification. UADA incorporates task-relevant unlabeled data as an auxiliary source alongside labeled source data to enrich feature diversity and improve the model’s adaptability to the target domain. Specifically, we introduce a progressive pseudo-label optimization strategy that iteratively refines pseudo-labels for unlabeled data through confidence-aware self-labeling. We then employ weight-shared feature extractors to jointly encode labeled and unlabeled source data, enabling the model to learn a unified feature space that captures diverse semantic representations for robust feature alignment. Finally, we construct domain-specific classifiers for each source and adaptively fuse their predictions, effectively harnessing complementary semantic cues for robust target classification. Extensive experiments across multiple tasks show that UADA outperforms existing methods. 
The code will be released at <span><span>https://github.com/Morrie0804/UADA.git</span><svg><path></path></svg></span> upon acceptance.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"77 6","pages":"Pages 6747-6759"},"PeriodicalIF":2.8000,"publicationDate":"2026-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117726000530","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2026/1/13 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0
Abstract
Domain adaptation (DA) is crucial in cross-scene image classification, enabling models to generalize across domains with varying data distributions. Existing approaches rely on abundant and diverse labeled source data to learn discriminative and transferable features for cross-domain alignment. However, such labeled data are often expensive and limited in remote sensing applications. In contrast, abundant task-relevant unlabeled data are more accessible but remain underutilized, despite containing domain-specific feature distributions that can enhance feature learning. To address this gap, we propose an Unlabeled data Assisted Domain Adaptation (UADA) framework for cross-scene image classification. UADA incorporates task-relevant unlabeled data as an auxiliary source alongside labeled source data to enrich feature diversity and improve the model’s adaptability to the target domain. Specifically, we introduce a progressive pseudo-label optimization strategy that iteratively refines pseudo-labels for unlabeled data through confidence-aware self-labeling. We then employ weight-shared feature extractors to jointly encode labeled and unlabeled source data, enabling the model to learn a unified feature space that captures diverse semantic representations for robust feature alignment. Finally, we construct domain-specific classifiers for each source and adaptively fuse their predictions, effectively harnessing complementary semantic cues for robust target classification. Extensive experiments across multiple tasks show that UADA outperforms existing methods. The code will be released at https://github.com/Morrie0804/UADA.git upon acceptance.
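Two of the components described above — confidence-aware self-labeling of the unlabeled auxiliary data, and adaptive fusion of the domain-specific classifiers' predictions — can be sketched roughly as follows. This is an illustrative reconstruction from the abstract only, not the authors' released code: the function names, the confidence threshold, and the scalar fusion weight are all assumptions.

```python
import numpy as np

def refine_pseudo_labels(probs: np.ndarray, threshold: float) -> np.ndarray:
    """Confidence-aware self-labeling: keep a pseudo-label only when the
    classifier's max class probability exceeds the threshold; low-confidence
    samples stay unlabeled (marked -1) and can be revisited in a later,
    progressive iteration with an updated model.

    probs: (N, C) array of per-class probabilities for N unlabeled samples.
    """
    confidence = probs.max(axis=1)          # per-sample peak probability
    labels = probs.argmax(axis=1)           # tentative pseudo-labels
    labels[confidence < threshold] = -1     # reject uncertain samples
    return labels

def fuse_predictions(probs_a: np.ndarray, probs_b: np.ndarray,
                     weight_a: float) -> np.ndarray:
    """Adaptively fuse two domain-specific classifiers' outputs as a convex
    combination; in the paper the weight would be learned, here it is a
    fixed scalar for illustration."""
    return weight_a * probs_a + (1.0 - weight_a) * probs_b
```

In a full pipeline, `refine_pseudo_labels` would be called once per training round so the accepted pseudo-label set grows as the weight-shared feature extractors improve, and `fuse_predictions` would combine the labeled-source and unlabeled-source classifier heads at inference time.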
Journal Introduction:
The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc.
NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR).
All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.