Continual Self-Supervised Learning With Masked Autoencoders in Remote Sensing

Lars Möllenbrok; Behnood Rasti; Begüm Demir
{"title":"遥感中基于掩模自编码器的持续自监督学习","authors":"Lars Möllenbrok;Behnood Rasti;Begüm Demir","doi":"10.1109/LGRS.2025.3579585","DOIUrl":null,"url":null,"abstract":"The development of continual learning (CL) methods, which aim to learn new tasks in a sequential manner from the training data acquired continuously, has gained great attention in remote sensing (RS). The existing CL methods in RS, while learning new tasks, enhance robustness toward catastrophic forgetting. This is achieved using a large number of labeled training samples, which is costly and not always feasible to gather in RS. To address this problem, we propose a novel continual self-supervised learning (SSL) method in the context of masked autoencoders (MAEs) (denoted as CoSMAE). The proposed CoSMAE consists of two components: 1) data mixup and 2)model mixup knowledge distillation. Data mixup is associated with retaining information on previous data distributions by interpolating images from the current task with those from the previous tasks. Model mixup knowledge distillation is associated with distilling knowledge from past models and the current model simultaneously by interpolating their model weights to form a teacher for knowledge distillation. The two components complement each other to regularize the MAE at the data and model levels to facilitate better generalization across tasks and reduce the risk of catastrophic forgetting. Experimental results show that CoSMAE achieves significant improvements of up to 4.94% over state-of-the-art CL methods applied to MAE. Our code is publicly available at: <uri>https://git.tu-berlin.de/rsim/CoSMAE</uri>","PeriodicalId":91017,"journal":{"name":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","volume":"22 ","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2025-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Continual Self-Supervised Learning With Masked Autoencoders in Remote Sensing\",\"authors\":\"Lars Möllenbrok;Behnood Rasti;Begüm Demir\",\"doi\":\"10.1109/LGRS.2025.3579585\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The development of continual learning (CL) methods, which aim to learn new tasks in a sequential manner from the training data acquired continuously, has gained great attention in remote sensing (RS). The existing CL methods in RS, while learning new tasks, enhance robustness toward catastrophic forgetting. This is achieved using a large number of labeled training samples, which is costly and not always feasible to gather in RS. To address this problem, we propose a novel continual self-supervised learning (SSL) method in the context of masked autoencoders (MAEs) (denoted as CoSMAE). The proposed CoSMAE consists of two components: 1) data mixup and 2)model mixup knowledge distillation. Data mixup is associated with retaining information on previous data distributions by interpolating images from the current task with those from the previous tasks. Model mixup knowledge distillation is associated with distilling knowledge from past models and the current model simultaneously by interpolating their model weights to form a teacher for knowledge distillation. The two components complement each other to regularize the MAE at the data and model levels to facilitate better generalization across tasks and reduce the risk of catastrophic forgetting. 
Experimental results show that CoSMAE achieves significant improvements of up to 4.94% over state-of-the-art CL methods applied to MAE. Our code is publicly available at: <uri>https://git.tu-berlin.de/rsim/CoSMAE</uri>\",\"PeriodicalId\":91017,\"journal\":{\"name\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"volume\":\"22 \",\"pages\":\"1-5\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-06-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11036167/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11036167/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

The development of continual learning (CL) methods, which aim to learn new tasks in a sequential manner from the training data acquired continuously, has gained great attention in remote sensing (RS). The existing CL methods in RS, while learning new tasks, enhance robustness toward catastrophic forgetting. This is achieved using a large number of labeled training samples, which is costly and not always feasible to gather in RS. To address this problem, we propose a novel continual self-supervised learning (SSL) method in the context of masked autoencoders (MAEs) (denoted as CoSMAE). The proposed CoSMAE consists of two components: 1) data mixup and 2) model mixup knowledge distillation. Data mixup is associated with retaining information on previous data distributions by interpolating images from the current task with those from the previous tasks. Model mixup knowledge distillation is associated with distilling knowledge from past models and the current model simultaneously by interpolating their model weights to form a teacher for knowledge distillation. The two components complement each other to regularize the MAE at the data and model levels to facilitate better generalization across tasks and reduce the risk of catastrophic forgetting. Experimental results show that CoSMAE achieves significant improvements of up to 4.94% over state-of-the-art CL methods applied to MAE. Our code is publicly available at: https://git.tu-berlin.de/rsim/CoSMAE
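The two components described in the abstract reduce to a few lines of tensor arithmetic. The following is a minimal PyTorch sketch of that idea, not the authors' implementation (their code is at the repository linked above): the toy SmallMAE module, the Beta(1, 1) mixing distribution, the replayed previous-task batch, and the 0.1 distillation weight are all illustrative assumptions.

```python
# Minimal sketch of the two CoSMAE components from the abstract:
# 1) data mixup across tasks, 2) model mixup to build a distillation teacher.
# All module names and hyperparameters here are assumptions for illustration.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallMAE(nn.Module):
    """Toy stand-in for a masked autoencoder; a real MAE masks image patches."""
    def __init__(self, dim=64):
        super().__init__()
        self.encoder = nn.Linear(dim, dim)
        self.decoder = nn.Linear(dim, dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

def data_mixup(x_curr, x_prev, beta=1.0):
    # Interpolate current-task images with replayed previous-task images,
    # retaining information on earlier data distributions (component 1).
    lam = torch.distributions.Beta(beta, beta).sample()
    return lam * x_curr + (1.0 - lam) * x_prev

def model_mixup_teacher(past_model, curr_model, alpha=0.5):
    # Interpolate past and current model weights to form a frozen teacher
    # for knowledge distillation (component 2).
    teacher = copy.deepcopy(curr_model)
    past, curr = past_model.state_dict(), curr_model.state_dict()
    mixed = {k: alpha * past[k] + (1.0 - alpha) * curr[k] for k in curr}
    teacher.load_state_dict(mixed)
    for p in teacher.parameters():
        p.requires_grad_(False)
    return teacher

# One hypothetical training step on a new task.
model = SmallMAE()
past_model = copy.deepcopy(model)   # frozen checkpoint from the previous task
x_curr = torch.randn(8, 64)         # current-task batch
x_prev = torch.randn(8, 64)         # replayed previous-task batch
x_mix = data_mixup(x_curr, x_prev)

teacher = model_mixup_teacher(past_model, model)
recon = model(x_mix)
loss_rec = F.mse_loss(recon, x_mix)       # MAE reconstruction term
with torch.no_grad():
    target = teacher(x_mix)
loss_kd = F.mse_loss(recon, target)       # distillation term against the teacher
loss = loss_rec + 0.1 * loss_kd           # 0.1 is an assumed distillation weight
loss.backward()
```

The sketch collapses patch masking into plain reconstruction for brevity; in the actual method the student is an MAE trained on masked patches, and the weight interpolation regularizes it at the model level while the mixed inputs regularize it at the data level.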