{"title":"基于统计差分表示的异构变化检测变压器。","authors":"Xinhui Cao, Minggang Dong, Xingping Liu, Jiaming Gong, Hanhong Zheng","doi":"10.3390/s25123740","DOIUrl":null,"url":null,"abstract":"<p><p>Heterogeneous change detection refers to using image data from different sensors or modalities to detect change information in the same region by comparing images of the same region at different time periods. In recent years, methods based on deep learning and domain adaptation have become mainstream, which can effectively improve the accuracy and robustness of heterogeneous image change detection through feature alignment and multimodal data fusion. However, a lack of credible labels has stopped most current learning-based heterogeneous change detection methods from being put into application. To overcome this limitation, a weakly supervised heterogeneous change detection framework with a structure similarity-guided sample generating (S3G2) strategy is proposed, which employs differential structure similarity to acquire prior information for iteratively generating reliable pseudo-labels. Moreover, a Statistical Difference representation Transformer (SDFormer) is proposed to lower the influence of modality difference between bitemporal heterogeneous imagery and better extract relevant change information. Extensive experiments have been carried out to fully investigate the influences of inner manual parameters and compare them with state-of-the-art methods in several public heterogeneous change detection data sets. The experimental results indicate that the proposed methods have shown competitive performance.</p>","PeriodicalId":21698,"journal":{"name":"Sensors","volume":"25 12","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Statistical Difference Representation-Based Transformer for Heterogeneous Change Detection.\",\"authors\":\"Xinhui Cao, Minggang Dong, Xingping Liu, Jiaming Gong, Hanhong Zheng\",\"doi\":\"10.3390/s25123740\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Heterogeneous change detection refers to using image data from different sensors or modalities to detect change information in the same region by comparing images of the same region at different time periods. In recent years, methods based on deep learning and domain adaptation have become mainstream, which can effectively improve the accuracy and robustness of heterogeneous image change detection through feature alignment and multimodal data fusion. However, a lack of credible labels has stopped most current learning-based heterogeneous change detection methods from being put into application. To overcome this limitation, a weakly supervised heterogeneous change detection framework with a structure similarity-guided sample generating (S3G2) strategy is proposed, which employs differential structure similarity to acquire prior information for iteratively generating reliable pseudo-labels. Moreover, a Statistical Difference representation Transformer (SDFormer) is proposed to lower the influence of modality difference between bitemporal heterogeneous imagery and better extract relevant change information. Extensive experiments have been carried out to fully investigate the influences of inner manual parameters and compare them with state-of-the-art methods in several public heterogeneous change detection data sets. 
The experimental results indicate that the proposed methods have shown competitive performance.</p>\",\"PeriodicalId\":21698,\"journal\":{\"name\":\"Sensors\",\"volume\":\"25 12\",\"pages\":\"\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2025-06-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Sensors\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.3390/s25123740\",\"RegionNum\":3,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"CHEMISTRY, ANALYTICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.3390/s25123740","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, ANALYTICAL","Score":null,"Total":0}
Statistical Difference Representation-Based Transformer for Heterogeneous Change Detection.
Heterogeneous change detection detects changes in a region by comparing images of that region acquired at different times by different sensors or in different modalities. In recent years, methods based on deep learning and domain adaptation have become mainstream; through feature alignment and multimodal data fusion, they can effectively improve the accuracy and robustness of heterogeneous image change detection. However, a lack of credible labels has kept most current learning-based heterogeneous change detection methods from practical application. To overcome this limitation, a weakly supervised heterogeneous change detection framework with a structure similarity-guided sample generating (S3G2) strategy is proposed, which uses differential structure similarity to acquire prior information for iteratively generating reliable pseudo-labels. Moreover, a Statistical Difference representation Transformer (SDFormer) is proposed to reduce the influence of modality differences between bitemporal heterogeneous images and to better extract relevant change information. Extensive experiments were carried out to investigate the influence of internal manually set parameters and to compare the proposed methods with state-of-the-art methods on several public heterogeneous change detection data sets. The experimental results indicate that the proposed methods achieve competitive performance.
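The abstract does not give implementation details for the S3G2 strategy, so the sketch below is only a hypothetical, minimal illustration of the general idea of structure-similarity-guided pseudo-label generation: compare each pixel's relation to its local neighbourhood within each modality, measure how similar those relations are across the two dates, and keep only the most confident pixels as pseudo-labels. The function names, patch-based descriptor, cosine-similarity comparison, and quantile thresholds (`patch_descriptors`, `differential_structure_similarity`, `generate_pseudo_labels`, `keep_ratio`) are assumptions for illustration, not the paper's algorithm.

```python
# Hypothetical sketch (NOT the authors' S3G2 algorithm): derive a differential
# structure-similarity map from bitemporal heterogeneous images and threshold
# it into confident pseudo-labels for weak supervision.
import numpy as np

def patch_descriptors(img, patch=5):
    """Per-pixel structure descriptor: differences between each pixel and its
    patch neighbours, computed inside a single modality so the descriptor stays
    comparable across sensors."""
    pad = patch // 2
    padded = np.pad(img.astype(np.float32), pad, mode="reflect")
    h, w = img.shape
    desc = []
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            shifted = padded[pad + dy:pad + dy + h, pad + dx:pad + dx + w]
            desc.append(shifted - img)          # relation to neighbour, not raw intensity
    return np.stack(desc, axis=-1)              # shape (H, W, patch*patch)

def differential_structure_similarity(img_t1, img_t2, patch=5):
    """Cosine similarity of the two per-pixel descriptors; low similarity
    suggests a structural change even though the modalities differ."""
    d1 = patch_descriptors(img_t1, patch)
    d2 = patch_descriptors(img_t2, patch)
    num = (d1 * d2).sum(-1)
    den = np.linalg.norm(d1, axis=-1) * np.linalg.norm(d2, axis=-1) + 1e-8
    return num / den                             # values in [-1, 1]

def generate_pseudo_labels(sim, keep_ratio=0.3):
    """Keep only the most confident pixels as pseudo-labels:
    1 = changed, 0 = unchanged, -1 = ignored during training."""
    lo, hi = np.quantile(sim, keep_ratio), np.quantile(sim, 1 - keep_ratio)
    labels = np.full(sim.shape, -1, dtype=np.int8)
    labels[sim <= lo] = 1                        # dissimilar structure -> likely changed
    labels[sim >= hi] = 0                        # similar structure   -> likely unchanged
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t1 = rng.random((64, 64))                    # e.g. optical band scaled to [0, 1]
    t2 = rng.random((64, 64))                    # e.g. SAR intensity scaled to [0, 1]
    sim = differential_structure_similarity(t1, t2)
    pseudo = generate_pseudo_labels(sim)
    print(pseudo.shape, np.unique(pseudo))
```

In such a scheme, pixels labelled -1 would simply be excluded from the training loss, and the pseudo-labels could be regenerated after each training round, loosely mirroring the iterative refinement of reliable pseudo-labels described in the abstract.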
Journal introduction:
Sensors (ISSN 1424-8220) provides an advanced forum for the science and technology of sensors and biosensors. It publishes reviews (including comprehensive reviews of complete sensor products), regular research papers, and short notes. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of papers. Full experimental details must be provided so that the results can be reproduced.