Exploring the potential of multi-source unsupervised domain adaptation in crop mapping using Sentinel-2 images

Impact Factor: 6.0 · JCR Q1 (Geography, Physical) · CAS Tier 2 (Earth Science)
Yumiao Wang, Luwei Feng, Weiwei Sun, Zuxun Zhang, Hanyu Zhang, Gang Yang, Xiangchao Meng
Journal: GIScience & Remote Sensing, Vol. 59, No. 1, pp. 2247–2265
DOI: 10.1080/15481603.2022.2156123
Published: 2022-12-12 · Journal Article
Citations: 10

Abstract

Accurate crop mapping is critical for agricultural applications. Although studies have applied deep learning methods to time-series satellite images for crop classification with satisfactory results, most of them focused on supervised methods, which are usually applicable to a specific domain and lose their validity in new domains. Unsupervised domain adaptation (UDA) was proposed to address this limitation by transferring knowledge from source domains with labeled samples to target domains with unlabeled samples. In particular, multi-source UDA (MUDA) is a powerful extension that leverages knowledge from multiple source domains and can achieve better results in the target domain than single-source UDA (SUDA). However, few studies have explored the potential of MUDA for crop mapping. This study proposed a MUDA crop classification model (MUCCM) for unsupervised crop mapping. Specifically, 11 states in the U.S. were selected as the multi-source domains, and three provinces in Northeast China were selected as individual target domains. Ten spectral bands and five vegetation indices were collected at a 10-day interval from time-series Sentinel-2 images to build the MUCCM. Subsequently, a SUDA model, the Domain Adversarial Neural Network (DANN), and two direct transfer methods, namely the deep neural network and random forest, were constructed and compared with the MUCCM. The results indicated that the UDA models significantly outperformed the direct transfer models, and the MUCCM was superior to the DANN, achieving the highest classification accuracy (OA > 85%) in each target domain. In addition, the MUCCM also performed best in in-season forecasting and crop mapping. This study is the first to apply MUDA to crop classification and demonstrates a novel, effective solution for high-performance crop mapping in regions without labeled samples.
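The abstract's input setup (ten Sentinel-2 spectral bands plus five vegetation indices per 10-day composite) can be sketched as a per-pixel feature stack. The sketch below is a minimal illustration, not the paper's code: the band ordering, the choice of NDVI as one of the five indices, and the random reflectance values are all assumptions for demonstration. NDVI itself is the standard formula (NIR − Red) / (NIR + Red), with B4 as red and B8 as NIR on Sentinel-2.

```python
import numpy as np

# Hypothetical per-pixel time series: 12 ten-day composites of 10
# Sentinel-2 bands (e.g. B2-B8, B8A, B11, B12), reflectance in (0, 1).
rng = np.random.default_rng(0)
n_steps, n_bands = 12, 10
bands = rng.uniform(0.05, 0.5, size=(n_steps, n_bands))

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Assumed positions of B4 (red) and B8 (NIR) in the band stack.
RED, NIR = 2, 6
vi = ndvi(bands[:, RED], bands[:, NIR])

# Append the index as an extra feature channel per composite;
# the paper's full input would append five such indices.
features = np.concatenate([bands, vi[:, None]], axis=1)
print(features.shape)  # (12, 11): 10 bands + 1 index per time step
```

Flattening `features` over the time axis would give the fixed-length vector a classifier such as the MUCCM or the direct-transfer baselines could consume.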
Source journal metrics: CiteScore 11.20 · Self-citation rate 9.00% · Articles per year: 84 · Review time: 6 months
期刊介绍: GIScience & Remote Sensing publishes original, peer-reviewed articles associated with geographic information systems (GIS), remote sensing of the environment (including digital image processing), geocomputation, spatial data mining, and geographic environmental modelling. Papers reflecting both basic and applied research are published.