An Efficient Class-incremental Learning Strategy with Frozen Weights and Pseudo Exemplars
Been-Chian Chien, Yueh-Chia Hsu, T. Hong
Proceedings of the 8th Multidisciplinary International Social Networks Conference, 2021-11-15
DOI: 10.1145/3504006.3504023

Abstract: In this paper, we propose a novel and efficient class-incremental learning approach that does not require storing old data after each task is trained. The proposed approach uses the autoencoder's decoder to generate pseudo exemplars that consolidate the model, and selects a subset of relevant weights in the encoder layers to learn new knowledge while freezing the remaining weights. It therefore needs no extra storage space for old data. Experimental results demonstrate the performance of the proposed approach.
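The weight-freezing idea described in the abstract can be illustrated with a minimal sketch (hypothetical helper names, pure Python; not the authors' implementation): only a small selected subset of weights receives gradient updates, while all other weights stay frozen, which is what lets the model learn a new task without disturbing most of its old knowledge.

```python
import random

def select_trainable(weights, fraction, rng):
    """Pick a small subset of weight indices to train; the rest stay frozen."""
    n = max(1, int(len(weights) * fraction))
    return set(rng.sample(range(len(weights)), n))

def masked_update(weights, grads, trainable, lr=0.1):
    """Apply a gradient step only to the trainable subset; frozen weights are untouched."""
    return [w - lr * g if i in trainable else w
            for i, (w, g) in enumerate(zip(weights, grads))]

rng = random.Random(0)
weights = [1.0] * 10          # toy encoder-layer weights
grads = [0.5] * 10            # toy gradients from the new task
trainable = select_trainable(weights, fraction=0.2, rng=rng)
updated = masked_update(weights, grads, trainable)

# Only the selected 20% of weights move; the frozen 80% are unchanged.
frozen_unchanged = all(updated[i] == weights[i]
                       for i in range(10) if i not in trainable)
print(len(trainable), frozen_unchanged)  # 2 True
```

In the paper's setting, the training batch for each new task would additionally mix in pseudo exemplars produced by the autoencoder's decoder, so the masked update consolidates old classes without any stored old data.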