Efficient On-Device Incremental Learning by Weight Freezing
Zehao Wang, Zhenli He, Hui Fang, Yi-Xiong Huang, Ying Sun, Yu Yang, Zhi-Yuan Zhang, Di Liu
2022 27th Asia and South Pacific Design Automation Conference (ASP-DAC), published 2022-01-17
DOI: 10.1109/ASP-DAC52403.2022.9712563 (https://doi.org/10.1109/ASP-DAC52403.2022.9712563)
Citations: 2
Abstract
On-device learning has become a new trend for edge intelligence systems. In this paper, we investigate the on-device incremental learning problem, which aims to learn new classes on top of a well-trained model already deployed on the device. Incremental learning is known to suffer from catastrophic forgetting, i.e., a model learns new classes at the cost of forgetting the old ones. Inspired by model pruning techniques, we propose a new on-device incremental learning method based on weight freezing. Weight freezing plays two roles in our framework: 1) preserving the knowledge of the old classes; 2) speeding up the training procedure. Building on weight freezing, we construct an efficient incremental learning framework that uses knowledge distillation to fine-tune the new model. We conduct extensive experiments on CIFAR-100 and compare our method with two existing methods. The experimental results show that our method achieves higher accuracy after incrementally learning new classes.
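The abstract does not specify implementation details, but the two mechanisms it names (freezing weights that encode old-class knowledge, and distilling the old model's outputs during fine-tuning) can be illustrated with a short PyTorch sketch. Everything below is an assumption for illustration: the magnitude-based choice of which weights to freeze, the freeze ratio, the temperature T, the loss weight alpha, and the helper names magnitude_freeze_masks and train_step are all hypothetical, not the authors' method.

```python
import torch
import torch.nn.functional as F

def magnitude_freeze_masks(model, freeze_ratio=0.5):
    """Build per-layer binary masks. Largest-magnitude weights are treated
    as important for the old classes and frozen (mask == 0); the rest stay
    trainable (mask == 1). Magnitude is one pruning-style criterion; the
    paper's actual selection rule may differ."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() < 2:  # skip biases and norm parameters
            continue
        k = max(1, int(p.numel() * freeze_ratio))
        # (numel - k + 1)-th smallest |w| == k-th largest |w|: the freeze threshold
        thresh = p.detach().abs().flatten().kthvalue(p.numel() - k + 1).values
        masks[name] = (p.detach().abs() < thresh).float()
    return masks

def train_step(model, old_model, x, y, masks, optimizer, T=2.0, alpha=0.5):
    """One fine-tuning step: cross-entropy on the new classes plus a
    knowledge-distillation term that keeps the old-class outputs close
    to the frozen old model's predictions."""
    optimizer.zero_grad()
    logits = model(x)
    with torch.no_grad():
        old_logits = old_model(x)          # teacher: the pre-update model
    n_old = old_logits.size(1)
    ce = F.cross_entropy(logits, y)        # supervision on new data
    kd = F.kl_div(                          # distill old-class knowledge
        F.log_softmax(logits[:, :n_old] / T, dim=1),
        F.softmax(old_logits / T, dim=1),
        reduction="batchmean") * T * T
    (ce + alpha * kd).backward()
    # Zero the gradients of frozen weights so they are never updated;
    # with plain SGD (no momentum or weight decay) they stay exactly fixed.
    for name, p in model.named_parameters():
        if name in masks and p.grad is not None:
            p.grad.mul_(masks[name])
    optimizer.step()
```

Masking gradients rather than deleting parameters keeps the network architecture unchanged, which also hints at the training-speed benefit the abstract mentions: only the unmasked subset of weights participates in updates.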