Resource-Constrained Multisource Instance-Based Transfer Learning.

Impact Factor 10.2 | CAS Region 1 (Computer Science) | JCR Q1 (Computer Science, Artificial Intelligence)
Mohammad Askarizadeh, Alireza Morsali, Kim Khoa Nguyen
{"title":"基于资源约束的多源实例迁移学习。","authors":"Mohammad Askarizadeh, Alireza Morsali, Kim Khoa Nguyen","doi":"10.1109/TNNLS.2023.3327248","DOIUrl":null,"url":null,"abstract":"<p><p>In today's machine learning (ML), the need for vast amounts of training data has become a significant challenge. Transfer learning (TL) offers a promising solution by leveraging knowledge across different domains/tasks, effectively addressing data scarcity. However, TL encounters computational and communication challenges in resource-constrained scenarios, and negative transfer (NT) can arise from specific data distributions. This article presents a novel focus on maximizing the accuracy of instance-based TL in multisource resource-constrained environments while mitigating NT, a key concern in TL. Previous studies have overlooked the impact of resource consumption in addressing the NT problem. To address these challenges, we introduce an optimization model named multisource resource-constrained optimized TL (MSOPTL), which employs a convex combination of empirical sources and target errors while considering feasibility and resource constraints. Moreover, we enhance one of the generalization error upper bounds in domain adaptation setting by demonstrating the potential to substitute the H ∆ H divergence with the Kullback-Leibler (KL) divergence. We utilize this enhanced error upper bound as one of the feasibility constraints of MSOPTL. Our suggested model can be applied as a versatile framework for various ML methods. Our approach is extensively validated in a neural network (NN)-based classification problem, demonstrating the efficiency of MSOPTL in achieving the desired trade-offs between TL's benefits and associated costs. This advancement holds tremendous potential for enhancing edge artificial intelligence (AI) applications in resource-constrained environments.</p>","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"PP ","pages":""},"PeriodicalIF":10.2000,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Resource-Constrained Multisource Instance-Based Transfer Learning.\",\"authors\":\"Mohammad Askarizadeh, Alireza Morsali, Kim Khoa Nguyen\",\"doi\":\"10.1109/TNNLS.2023.3327248\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>In today's machine learning (ML), the need for vast amounts of training data has become a significant challenge. Transfer learning (TL) offers a promising solution by leveraging knowledge across different domains/tasks, effectively addressing data scarcity. However, TL encounters computational and communication challenges in resource-constrained scenarios, and negative transfer (NT) can arise from specific data distributions. This article presents a novel focus on maximizing the accuracy of instance-based TL in multisource resource-constrained environments while mitigating NT, a key concern in TL. Previous studies have overlooked the impact of resource consumption in addressing the NT problem. To address these challenges, we introduce an optimization model named multisource resource-constrained optimized TL (MSOPTL), which employs a convex combination of empirical sources and target errors while considering feasibility and resource constraints. 
Moreover, we enhance one of the generalization error upper bounds in domain adaptation setting by demonstrating the potential to substitute the H ∆ H divergence with the Kullback-Leibler (KL) divergence. We utilize this enhanced error upper bound as one of the feasibility constraints of MSOPTL. Our suggested model can be applied as a versatile framework for various ML methods. Our approach is extensively validated in a neural network (NN)-based classification problem, demonstrating the efficiency of MSOPTL in achieving the desired trade-offs between TL's benefits and associated costs. This advancement holds tremendous potential for enhancing edge artificial intelligence (AI) applications in resource-constrained environments.</p>\",\"PeriodicalId\":13303,\"journal\":{\"name\":\"IEEE transactions on neural networks and learning systems\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":10.2000,\"publicationDate\":\"2023-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on neural networks and learning systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/TNNLS.2023.3327248\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/TNNLS.2023.3327248","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract


In today's machine learning (ML), the need for vast amounts of training data has become a significant challenge. Transfer learning (TL) offers a promising solution by leveraging knowledge across different domains/tasks, effectively addressing data scarcity. However, TL encounters computational and communication challenges in resource-constrained scenarios, and negative transfer (NT) can arise from specific data distributions. This article presents a novel focus on maximizing the accuracy of instance-based TL in multisource resource-constrained environments while mitigating NT, a key concern in TL. Previous studies have overlooked the impact of resource consumption in addressing the NT problem. To address these challenges, we introduce an optimization model named multisource resource-constrained optimized TL (MSOPTL), which employs a convex combination of empirical sources and target errors while considering feasibility and resource constraints. Moreover, we enhance one of the generalization error upper bounds in domain adaptation setting by demonstrating the potential to substitute the H ∆ H divergence with the Kullback-Leibler (KL) divergence. We utilize this enhanced error upper bound as one of the feasibility constraints of MSOPTL. Our suggested model can be applied as a versatile framework for various ML methods. Our approach is extensively validated in a neural network (NN)-based classification problem, demonstrating the efficiency of MSOPTL in achieving the desired trade-offs between TL's benefits and associated costs. This advancement holds tremendous potential for enhancing edge artificial intelligence (AI) applications in resource-constrained environments.
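As a rough illustration of the kind of objective the abstract describes (not the authors' actual MSOPTL formulation), the sketch below forms a convex combination of the empirical target loss and multiple empirical source losses, and penalizes any source whose label marginal drifts too far from the target's, with a KL-divergence term standing in for the paper's feasibility constraint. All function and parameter names here (combined_objective, lambda_weights, kl_budget, and so on) are illustrative assumptions, not identifiers from the paper.

```python
# Hedged sketch of a convex combination of empirical target and multi-source losses
# for instance-based transfer learning, with a soft KL-divergence penalty.
import numpy as np

def cross_entropy(probs, labels):
    """Mean cross-entropy of predicted class probabilities vs. integer labels."""
    eps = 1e-12
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions (e.g., empirical label marginals)."""
    eps = 1e-12
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def combined_objective(target_probs, target_labels,
                       source_probs_list, source_labels_list,
                       lambda_weights, kl_budget=0.5, kl_coef=1.0):
    """
    Convex combination of empirical target and multi-source errors:
        J = lambda_0 * err_T + sum_k lambda_k * err_{S_k} + penalty,
    with lambda_0 + sum_k lambda_k = 1. The penalty grows when the KL divergence
    between a source's label marginal and the target's exceeds a budget
    (a stand-in for a distribution-discrepancy feasibility constraint).
    """
    lam = np.asarray(lambda_weights, dtype=float)
    assert np.isclose(lam.sum(), 1.0) and np.all(lam >= 0), "weights must be convex"

    target_marginal = np.bincount(target_labels, minlength=target_probs.shape[1])
    obj = lam[0] * cross_entropy(target_probs, target_labels)
    for k, (probs, labels) in enumerate(zip(source_probs_list, source_labels_list), 1):
        obj += lam[k] * cross_entropy(probs, labels)
        src_marginal = np.bincount(labels, minlength=probs.shape[1])
        gap = kl_divergence(src_marginal, target_marginal) - kl_budget
        obj += kl_coef * max(gap, 0.0)   # soft version of the KL feasibility constraint
    return obj
```

In this sketch the KL budget acts only as a soft penalty; the MSOPTL model described in the abstract treats feasibility and resource constraints within a formal optimization program, which this toy objective does not attempt to reproduce.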

Source Journal
IEEE Transactions on Neural Networks and Learning Systems
Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture
CiteScore: 23.80
Self-citation rate: 9.60%
Annual article volume: 2102
Review turnaround: 3-8 weeks
About the journal: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.