Title: Overcome STSF phenomenon in catastrophic forgetting
Authors: Yifan Chang, Qifan Zhao
DOI: 10.1117/12.2674765
Published in: Conference on Computer Graphics, Artificial Intelligence, and Data Processing
Publication date: 2023-05-23
Citations: 0
Abstract
Catastrophic forgetting is an undesirable phenomenon in convolutional neural networks, and iCaRL is an effective algorithm for preventing it. However, this work explores a potential defect of iCaRL: more similar tasks result in more severe catastrophic forgetting (STSF). The cause of STSF is that similar tasks tend to occupy similar feature channels and produce similar feature representations, so they can easily be replaced by one another. Based on these findings, a novel method, Similar Margin Loss (SML), is proposed. SML aims to make the feature representations of samples from the same task compact while making feature representations from different tasks separable in the feature space. Experimental results show that SML is effective in alleviating STSF.
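The abstract describes SML's goal (intra-task compactness, inter-task separation) but not its exact formulation. As a rough illustration only, the sketch below implements a generic centroid-based margin loss in that spirit: samples are pulled toward their own task's centroid, and task centroids closer than a margin are pushed apart. The function name, centroid formulation, and margin hinge are assumptions for illustration, not the paper's actual loss.

```python
import numpy as np

def similar_margin_loss(embeddings, task_ids, margin=1.0):
    """Hypothetical margin-style loss in the spirit of SML (not the
    paper's formulation): intra-task compactness + inter-task separation."""
    embeddings = np.asarray(embeddings, dtype=float)
    task_ids = np.asarray(task_ids)
    tasks = np.unique(task_ids)

    # Intra-task term: mean squared distance of each sample to its task centroid.
    centroids = {t: embeddings[task_ids == t].mean(axis=0) for t in tasks}
    intra = np.mean([np.sum((e - centroids[t]) ** 2)
                     for e, t in zip(embeddings, task_ids)])

    # Inter-task term: hinge penalty when two task centroids lie
    # closer together than the margin.
    inter, pairs = 0.0, 0
    for i, a in enumerate(tasks):
        for b in tasks[i + 1:]:
            d = np.linalg.norm(centroids[a] - centroids[b])
            inter += max(0.0, margin - d) ** 2
            pairs += 1
    if pairs:
        inter /= pairs

    return intra + inter
```

Under this sketch, compact and well-separated task clusters yield a loss near zero, while overlapping task clusters (the STSF regime, where similar tasks share representations) are penalized by both terms.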