{"title":"Forget to Learn (F2L): Circumventing plasticity–stability trade-off in continuous unsupervised domain adaptation","authors":"Mohamed Abubakr Hassan, Chi-Guhn Lee","doi":"10.1016/j.patcog.2024.111139","DOIUrl":null,"url":null,"abstract":"<div><div>In continuous unsupervised domain adaptation (CUDA), deep learning models struggle with the stability-plasticity trade-off—where the model must forget old knowledge to acquire new one. This paper introduces the “Forget to Learn” (F2L), a novel framework that circumvents such a trade-off. In contrast to state-of-the-art methods that aim to balance the two conflicting objectives, stability and plasticity, F2L utilizes active forgetting and knowledge distillation to circumvent the conflict’s root causes. In F2L, dual-encoders are trained, where the first encoder – the ‘Specialist’ – is designed to actively forget, thereby boosting adaptability (i.e., plasticity) and generating high-accuracy pseudo labels on the new domains. Such pseudo labels are then used to transfer/accumulate the specialist knowledge to the second encoder—the ‘Generalist’ through conflict-free knowledge distillation. Empirical and ablation studies confirmed F2L’s superiority on different datasets and against different SOTAs. Furthermore, F2L minimizes the need for hyperparameter tuning, enhances computational and sample efficiency, and excels in problems with long domain sequences—key advantages for practical systems constrained by hardware limitations.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"159 ","pages":"Article 111139"},"PeriodicalIF":7.5000,"publicationDate":"2024-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320324008902","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
In continuous unsupervised domain adaptation (CUDA), deep learning models struggle with the stability-plasticity trade-off, in which the model must forget old knowledge to acquire new knowledge. This paper introduces "Forget to Learn" (F2L), a novel framework that circumvents this trade-off. In contrast to state-of-the-art methods that aim to balance the two conflicting objectives, stability and plasticity, F2L uses active forgetting and knowledge distillation to remove the root causes of the conflict. F2L trains dual encoders: the first encoder, the 'Specialist', is designed to actively forget, which boosts adaptability (i.e., plasticity) and yields high-accuracy pseudo-labels on new domains. These pseudo-labels are then used to transfer and accumulate the Specialist's knowledge in the second encoder, the 'Generalist', through conflict-free knowledge distillation. Empirical and ablation studies confirm F2L's superiority across different datasets and against different state-of-the-art methods. Furthermore, F2L minimizes the need for hyperparameter tuning, improves computational and sample efficiency, and excels on problems with long domain sequences, key advantages for practical systems constrained by hardware limitations.
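The dual-encoder idea in the abstract can be summarized in a short sketch: the Specialist adapts to each new domain without any stability constraint (approximating "active forgetting"), and its confident pseudo-labels then supervise the Generalist, so plasticity and stability are handled by separate models rather than balanced in one. The sketch below assumes a PyTorch-style setup; the function names, the entropy-minimization adaptation loss, and the confidence threshold are illustrative assumptions, not the paper's exact objectives.

```python
# Minimal sketch of the F2L idea described in the abstract (assumed PyTorch setup).
# All names and losses here are illustrative; the paper's exact training
# objectives and forgetting mechanism are not reproduced.
import copy
import torch
import torch.nn.functional as F


def adapt_specialist(specialist, prev_specialist, unlabeled_loader, optimizer):
    """Plasticity step: the Specialist adapts freely to the new domain.
    'Active forgetting' is approximated by resetting to the previous specialist
    and fine-tuning with no stability constraint (assumption)."""
    specialist.load_state_dict(copy.deepcopy(prev_specialist.state_dict()))
    specialist.train()
    for x in unlabeled_loader:
        logits = specialist(x)
        # Placeholder unsupervised adaptation loss: entropy minimization.
        probs = F.softmax(logits, dim=1)
        loss = -(probs * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return specialist


def distill_generalist(generalist, specialist, unlabeled_loader, optimizer, threshold=0.9):
    """Stability step: transfer the Specialist's knowledge to the Generalist
    through pseudo-labels, leaving the Specialist's weights untouched."""
    specialist.eval()
    generalist.train()
    for x in unlabeled_loader:
        with torch.no_grad():
            probs = F.softmax(specialist(x), dim=1)
            conf, pseudo = probs.max(dim=1)
        mask = conf > threshold  # keep only confident pseudo-labels
        if mask.sum() == 0:
            continue
        loss = F.cross_entropy(generalist(x)[mask], pseudo[mask])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return generalist
```

Because the Generalist is updated only through the Specialist's pseudo-labels, accumulating knowledge on new domains never requires the same parameters to both forget and retain, which is the conflict the abstract says F2L circumvents.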
Journal introduction:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.