{"title":"基于子集重播的自主系统可扩展改进持续学习","authors":"P. Brahma, Adrienne Othon","doi":"10.1109/CVPRW.2018.00154","DOIUrl":null,"url":null,"abstract":"While machine learning techniques have come a long way in showing astounding performance on various vision problems, the conventional way of training is not applicable for learning from a sequence of new data or tasks. For most real life applications like perception for autonomous vehicles, multiple stages of data collection are necessary to improve the performance of machine learning models over time. The newer observations may have a different distribution than the older ones and thus a simply fine-tuned model often overfits while forgetting the knowledge from past experiences. Recently, few lifelong or continual learning approaches have shown promising results towards overcoming this problem of catastrophic forgetting. In this work, we show that carefully choosing a small subset of the older data with the objective of promoting representativeness and diversity can also help in learning continuously. For large scale cloud based training, this can help in significantly reducing the amount of storage required along with lessening the computation and time for each retraining session.","PeriodicalId":150600,"journal":{"name":"2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)","volume":"AES-2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":"{\"title\":\"Subset Replay Based Continual Learning for Scalable Improvement of Autonomous Systems\",\"authors\":\"P. Brahma, Adrienne Othon\",\"doi\":\"10.1109/CVPRW.2018.00154\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"While machine learning techniques have come a long way in showing astounding performance on various vision problems, the conventional way of training is not applicable for learning from a sequence of new data or tasks. For most real life applications like perception for autonomous vehicles, multiple stages of data collection are necessary to improve the performance of machine learning models over time. The newer observations may have a different distribution than the older ones and thus a simply fine-tuned model often overfits while forgetting the knowledge from past experiences. Recently, few lifelong or continual learning approaches have shown promising results towards overcoming this problem of catastrophic forgetting. In this work, we show that carefully choosing a small subset of the older data with the objective of promoting representativeness and diversity can also help in learning continuously. 
For large scale cloud based training, this can help in significantly reducing the amount of storage required along with lessening the computation and time for each retraining session.\",\"PeriodicalId\":150600,\"journal\":{\"name\":\"2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)\",\"volume\":\"AES-2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"16\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CVPRW.2018.00154\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPRW.2018.00154","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Subset Replay Based Continual Learning for Scalable Improvement of Autonomous Systems
Although machine learning techniques now achieve remarkable performance on a wide range of vision problems, conventional training does not apply when a model must learn from a sequence of new data or tasks. For most real-life applications, such as perception for autonomous vehicles, multiple stages of data collection are needed to improve model performance over time. Newer observations may follow a different distribution than older ones, so a simply fine-tuned model often overfits to the new data while forgetting the knowledge gained from past experience. Recently, a few lifelong or continual learning approaches have shown promising results in overcoming this problem of catastrophic forgetting. In this work, we show that carefully choosing a small subset of the older data, with the objective of promoting representativeness and diversity, can also enable continual learning. For large-scale cloud-based training, this significantly reduces the storage required and lessens the computation and time of each retraining session.
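The selection step described in the abstract can be illustrated with a minimal sketch. The example below uses a herding-style greedy criterion on feature embeddings, which keeps exemplars whose running mean stays close to the mean of the old data; this is one common way to balance representativeness and coverage, not necessarily the exact objective used in the paper, and the function names, the 64-dimensional embeddings, and the budget value are assumptions for illustration only.

```python
# Hypothetical sketch of replay-subset selection (herding-style); the paper's
# actual selection objective may differ.
import numpy as np

def select_replay_subset(features: np.ndarray, budget: int) -> np.ndarray:
    """Greedily pick `budget` exemplar indices whose running mean best
    approximates the mean of all `features`."""
    mean = features.mean(axis=0)
    selected = []
    running_sum = np.zeros_like(mean)
    candidates = set(range(len(features)))
    for k in range(1, budget + 1):
        # Pick the sample that moves the running exemplar mean closest
        # to the mean of the full old dataset.
        best_idx, best_dist = None, np.inf
        for i in candidates:
            dist = np.linalg.norm(mean - (running_sum + features[i]) / k)
            if dist < best_dist:
                best_idx, best_dist = i, dist
        selected.append(best_idx)
        running_sum += features[best_idx]
        candidates.remove(best_idx)
    return np.array(selected)

# Usage sketch: embed the old data with the current model, retain only a small
# exemplar set, and mix it with the new data at each retraining session.
old_feats = np.random.randn(1000, 64)   # stand-in for old-data embeddings
keep = select_replay_subset(old_feats, budget=50)
print(keep.shape)                       # (50,) indices of retained exemplars
```

In this setup, only the retained exemplars need to be stored and replayed alongside the new data, which is what yields the storage and retraining-time savings the abstract refers to.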