AdaFML: Adaptive Federated Meta Learning With Multi-Objectives and Context-Awareness in Dynamic Heterogeneous Networks

Impact Factor: 5.3 | CAS Tier 3, Computer Science | JCR Q1, Computer Science, Artificial Intelligence
Qiaomei Han;Xianbin Wang;Weiming Shen;Yanjun Shi
{"title":"AdaFML: Adaptive Federated Meta Learning With Multi-Objectives and Context-Awareness in Dynamic Heterogeneous Networks","authors":"Qiaomei Han;Xianbin Wang;Weiming Shen;Yanjun Shi","doi":"10.1109/TETCI.2025.3537940","DOIUrl":null,"url":null,"abstract":"Recent advancements in Federated Learning (FL) have enabled the widespread deployment of distributed computing resources across connected devices, enhancing data processing capabilities and facilitating collaborative decision-making while maintaining user privacy. However, in Internet of Things (IoT) systems, the heterogeneity of devices and unstable network connections present significant challenges to the effective and efficient execution of FL tasks in real-world environments. To address these challenges, we propose an Adaptive Federated Meta Learning Framework with Multi-Objectives and Context-Awareness (AdaFML). This framework aims to achieve multiple objectives, including improving the performance of the FL global model, optimizing time efficiency, and enabling local model adaptation in dynamic and heterogeneous environments. Specifically, AdaFML extracts contextual information from each device, including its data distribution, computation, and communication conditions, to train a multimodal model that optimizes the FL task and time cost estimation, enhancing global model performance and time efficiency. Moreover, AdaFML fine-tunes two critical meta-learning parameters: the mixture ratio between local and global models and the selection weights for model aggregation. This enables adaptive local model updates across different devices while improving global model performance. Experimental results demonstrate that AdaFML boosts the effectiveness, efficiency, and adaptability of FL task execution in dynamic and heterogeneous environments.","PeriodicalId":13135,"journal":{"name":"IEEE Transactions on Emerging Topics in Computational Intelligence","volume":"9 2","pages":"1428-1440"},"PeriodicalIF":5.3000,"publicationDate":"2025-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Emerging Topics in Computational Intelligence","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10891245/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Recent advancements in Federated Learning (FL) have enabled the widespread deployment of distributed computing resources across connected devices, enhancing data processing capabilities and facilitating collaborative decision-making while maintaining user privacy. However, in Internet of Things (IoT) systems, the heterogeneity of devices and unstable network connections present significant challenges to the effective and efficient execution of FL tasks in real-world environments. To address these challenges, we propose an Adaptive Federated Meta Learning Framework with Multi-Objectives and Context-Awareness (AdaFML). This framework aims to achieve multiple objectives, including improving the performance of the FL global model, optimizing time efficiency, and enabling local model adaptation in dynamic and heterogeneous environments. Specifically, AdaFML extracts contextual information from each device, including its data distribution, computation, and communication conditions, to train a multimodal model that optimizes the FL task and time cost estimation, enhancing global model performance and time efficiency. Moreover, AdaFML fine-tunes two critical meta-learning parameters: the mixture ratio between local and global models and the selection weights for model aggregation. This enables adaptive local model updates across different devices while improving global model performance. Experimental results demonstrate that AdaFML boosts the effectiveness, efficiency, and adaptability of FL task execution in dynamic and heterogeneous environments.
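The abstract singles out two meta-learning parameters at the core of AdaFML: a mixture ratio that blends each client's local model with the global model, and selection weights that govern model aggregation. The short Python sketch below illustrates these two mechanisms in isolation. The names (alpha_k, selection_weights), the plain weighted-average update rules, and the NumPy setting are assumptions made purely for illustration; they are not the paper's actual formulation, which additionally learns these quantities from device context (data distribution, computation, and communication conditions).

import numpy as np

def mix_local_global(local_params, global_params, alpha_k):
    """Blend a client's local model with the global model.

    alpha_k = 1.0 keeps the purely local model, alpha_k = 0.0 adopts the
    global model; intermediate values personalize the update.
    (alpha_k is an assumed name for the abstract's "mixture ratio".)
    """
    return {name: alpha_k * local_params[name] + (1.0 - alpha_k) * global_params[name]
            for name in global_params}

def aggregate(client_params, selection_weights):
    """Combine client models into a new global model using per-client
    selection weights (assumed here to be a simple weighted average)."""
    weights = np.asarray(selection_weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the weights sum to 1
    new_global = {}
    for name in client_params[0]:
        stacked = np.stack([params[name] for params in client_params])
        new_global[name] = np.tensordot(weights, stacked, axes=1)
    return new_global

# Toy round with two clients holding 3-parameter "models".
global_model = {"w": np.zeros(3)}
clients = [{"w": np.array([1.0, 0.0, 0.0])},
           {"w": np.array([0.0, 2.0, 0.0])}]
personalized = [mix_local_global(c, global_model, alpha_k=0.7) for c in clients]
print(aggregate(personalized, selection_weights=[0.6, 0.4])["w"])

In AdaFML, per the abstract, both quantities are tuned per device from contextual information rather than fixed by hand, which is what lets local models adapt while the global model continues to improve.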
Source Journal
CiteScore: 10.30
Self-citation rate: 7.50%
Annual publications: 147
About the journal: The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys. TETCI is an electronic-only publication and publishes six issues per year. Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for the IoT and Smart-X technologies.