{"title":"One-shot adaptation for cross-domain semantic segmentation in remote sensing images","authors":"Jiaojiao Tan , Haiwei Zhang , Ning Yao , Qiang Yu","doi":"10.1016/j.patcog.2025.111390","DOIUrl":null,"url":null,"abstract":"<div><div>Contemporary cross-domain remote sensing (RS) image segmentation has been successful in recent years. When the target domain data becomes scarce in some realistic scenarios, the performance of traditional domain adaptation (DA) methods significantly drops. In this paper, we tackle the problem of fast cross-domain adaptation by observing only one unlabeled target data. To deal with dynamic domain shift efficiently, this paper introduces a novel framework named Minimax One-shot AdapTation (<strong>MOAT</strong>) to perform cross-domain feature alignment in semantic segmentation. Specifically, MOAT alternately maximizes the cross-entropy to select the most informative source samples and minimizes the cross-entropy of obtained samples to make the model fit the target data. The selected source samples can effectively describe the target data distribution using the proposed uncertainty-based distribution estimation technique. We propose a memory-based feature enhancement strategy to learn domain-invariant decision boundaries to accomplish semantic alignment. Generally, we empirically demonstrate the effectiveness of the proposed MOAT. 
It achieves a new state-of-the-art performance on cross-domain RS image segmentation for conventional unsupervised domain adaptation and one-shot domain adaptation scenarios.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"162 ","pages":"Article 111390"},"PeriodicalIF":7.5000,"publicationDate":"2025-01-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320325000500","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Cross-domain remote sensing (RS) image segmentation has achieved notable success in recent years. However, when target-domain data become scarce, as happens in many realistic scenarios, the performance of traditional domain adaptation (DA) methods drops significantly. In this paper, we tackle the problem of fast cross-domain adaptation when only a single unlabeled target sample is available. To handle dynamic domain shift efficiently, we introduce a novel framework named Minimax One-shot AdapTation (MOAT) that performs cross-domain feature alignment for semantic segmentation. Specifically, MOAT alternately maximizes the cross-entropy to select the most informative source samples and minimizes the cross-entropy of the selected samples to fit the model to the target data. With the proposed uncertainty-based distribution estimation technique, the selected source samples effectively describe the target data distribution. We further propose a memory-based feature enhancement strategy that learns domain-invariant decision boundaries to accomplish semantic alignment. We empirically demonstrate the effectiveness of MOAT: it achieves new state-of-the-art performance on cross-domain RS image segmentation in both conventional unsupervised domain adaptation and one-shot domain adaptation scenarios.
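The alternating max/min step described in the abstract can be sketched in a minimal form. This is an illustrative toy only, not the paper's implementation: it uses a linear softmax classifier on random 2-D features, scores each source sample by its cross-entropy (the "maximize" step keeps the highest-loss, most informative ones), then takes one gradient step minimizing cross-entropy on that subset. All names, sizes, and the learning rate are hypothetical; MOAT's actual objective, uncertainty-based distribution estimation, and memory-based feature enhancement are not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # Per-sample negative log-likelihood of the true class.
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12)

# Toy labeled source pool: 200 samples, 2-D features, 3 classes,
# classified by a linear model with weight matrix W.
X = rng.normal(size=(200, 2))
y = rng.integers(0, 3, size=200)
W = rng.normal(scale=0.1, size=(2, 3))

# "Maximize" step: score every source sample by its cross-entropy under
# the current model and keep the k highest-loss (most informative) ones.
k = 16
losses = cross_entropy(softmax(X @ W), y)
idx = np.argsort(losses)[-k:]
Xs, ys = X[idx], y[idx]

# "Minimize" step: one gradient-descent update of W on the selected
# subset, reducing their cross-entropy so the model fits them.
probs = softmax(Xs @ W)
grad = probs.copy()
grad[np.arange(k), ys] -= 1.0          # d(mean CE)/d(logits) * k
W -= 0.5 * (Xs.T @ grad) / k

new_loss = cross_entropy(softmax(Xs @ W), ys).mean()
```

In MOAT these two steps alternate, so that each round re-selects the source samples that currently best expose the gap to the one observed target sample; here a single round suffices to show the mechanics.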
Journal overview:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.