Training a Dynamic Growing Mixture Model for Lifelong Learning

IF 8.9 · Region 1, Computer Science · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Fei Ye;Adrian G. Bors
DOI: 10.1109/TNNLS.2025.3569156
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 9, pp. 15836-15850
Publication date: 2025-06-09
Citations: 0

Abstract

Lifelong learning (LLL) defines a training paradigm that aims to continuously acquire and capture new concepts from a sequence of tasks without forgetting. Recently, dynamic expansion models (DEMs) have been proposed to address catastrophic forgetting under the LLL paradigm. However, the efficiency of DEMs lacks a thorough explanation based on theoretical analysis. In this article, we develop a new theoretical framework that interprets the forgetting process of the DEM as increasing the statistical discrepancy distance between the distribution of the probabilistic representation of the new data and the previously learned knowledge. The theoretical analysis shows that adding new components to a mixture model represents a trade-off between model complexity and its performance. Inspired by the theoretical analysis, we introduce a new DEM, called the growing mixture model (GMM), where generative data components are added according to the novelty of the incoming task information compared to what is already known. A new component selection mechanism considering the model’s already acquired knowledge is employed for updating new DEM’s components, promoting efficient future task learning. We also train a compact student model with samples drawn through the generative mechanisms of the GMM, aiming to accumulate cross-domain representations over time. By employing the student model, we can significantly reduce the number of parameters and make quick inferences during the testing phase.
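The core mechanism described above — adding a generative component only when incoming task data is sufficiently novel relative to already learned components, and otherwise refining an existing component — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the class name `GrowingMixture`, the fixed `novelty_threshold`, and the Mahalanobis-style distance used as a stand-in for the statistical discrepancy distance are all assumptions for the sake of the example.

```python
# Hypothetical sketch of novelty-gated expansion in a growing mixture model.
# Names and the discrepancy proxy are illustrative, not from the paper.
import numpy as np

class GrowingMixture:
    def __init__(self, novelty_threshold=2.0):
        self.components = []          # one (mean, cov) pair per learned task
        self.threshold = novelty_threshold

    def _discrepancy(self, data, comp):
        # Simple proxy for the statistical discrepancy distance:
        # Mahalanobis-style distance from the new data's mean to a component.
        mean, cov = comp
        diff = data.mean(axis=0) - mean
        return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

    def observe_task(self, data):
        """Grow the mixture only if the task is novel; otherwise
        update the closest existing component (the selection step)."""
        if not self.components:
            self._add(data)
            return "added"
        dists = [self._discrepancy(data, c) for c in self.components]
        i = int(np.argmin(dists))
        if dists[i] > self.threshold:
            self._add(data)           # novel task -> add a new component
            return "added"
        # familiar task -> refine the selected component (running average)
        mean, cov = self.components[i]
        self.components[i] = ((mean + data.mean(axis=0)) / 2.0, cov)
        return "updated"

    def _add(self, data):
        # Regularize the covariance so the discrepancy stays well-defined.
        cov = np.cov(data.T) + 1e-3 * np.eye(data.shape[1])
        self.components.append((data.mean(axis=0), cov))

rng = np.random.default_rng(0)
gm = GrowingMixture()
gm.observe_task(rng.normal(0, 1, (500, 2)))   # first task: component added
gm.observe_task(rng.normal(0, 1, (500, 2)))   # similar task: component updated
gm.observe_task(rng.normal(8, 1, (500, 2)))   # novel task: second component added
```

The trade-off the theoretical analysis points to is visible here: a lower threshold grows the mixture faster (more capacity, less interference between tasks) at the cost of more parameters, which is what motivates distilling the mixture into a compact student model for inference.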
Source journal
IEEE Transactions on Neural Networks and Learning Systems
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
CiteScore: 23.80
Self-citation rate: 9.60%
Articles per year: 2102
Review time: 3-8 weeks
Journal description: IEEE Transactions on Neural Networks and Learning Systems presents scholarly articles on the theory, design, and applications of neural networks and other learning systems, with an emphasis on technical and scientific research in this domain.