SWIM: Sliding-Window Model contrast for federated learning

Impact Factor: 6.2 · CAS Tier 2 (Computer Science) · JCR Q1 (Computer Science, Theory & Methods)
Heng-Ru Zhang, Rui Chen, Shi-Huai Wen, Xiao-Qiang Bian
{"title":"SWIM: Sliding-Window Model contrast for federated learning","authors":"Heng-Ru Zhang ,&nbsp;Rui Chen ,&nbsp;Shi-Huai Wen ,&nbsp;Xiao-Qiang Bian","doi":"10.1016/j.future.2024.107590","DOIUrl":null,"url":null,"abstract":"<div><div>In federated learning, data heterogeneity leads to significant differences in the local models learned by the clients, thereby affecting the performance of the global model. To address this issue, contrast federated learning algorithms increase the comparison of positive and negative samples on the clients, bringing the local models closer to the global model. However, existing methods take the global model as the positive sample and the previous round of local models as the negative sample, resulting in insufficient utilization of historical local models. In this paper, we propose SWIM: Sliding-WIndow Model contrast method, which introduces more rounds of local models. First, we design and utilize a sliding window mechanism for collecting client representations of historical local models. Subsequently, we employ the cosine distance function as a discriminator to distinguish them into positive and negative samples. In addition, we introduce a dynamic coefficient that balances the federated classification learning and feature learning tasks. By adjusting the dynamic coefficient at different training rounds, the global model becomes more focused on feature learning in the early stages and classification learning in the later stages. Experiments are compared with four state-of-the-art federated learning algorithms on three datasets. The results show that the proposed algorithm outperforms the four state-of-the-art algorithms in terms of accuracy. Source code is available at <span><span>https://github.com/zhanghrswpu/SWIM</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":55132,"journal":{"name":"Future Generation Computer Systems-The International Journal of Escience","volume":null,"pages":null},"PeriodicalIF":6.2000,"publicationDate":"2024-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future Generation Computer Systems-The International Journal of Escience","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167739X24005545","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
引用次数: 0

Abstract

In federated learning, data heterogeneity leads to significant differences among the local models learned by the clients, which degrades the performance of the global model. To address this issue, contrastive federated learning algorithms compare positive and negative samples on the clients, pulling the local models closer to the global model. However, existing methods take the global model as the only positive sample and the previous round's local model as the only negative sample, leaving historical local models underutilized. In this paper, we propose SWIM, a Sliding-WIndow Model contrast method that draws on more rounds of local models. First, we design a sliding-window mechanism that collects the client representations produced by historical local models. Subsequently, we employ the cosine distance function as a discriminator to separate these representations into positive and negative samples. In addition, we introduce a dynamic coefficient that balances the federated classification-learning and feature-learning tasks; by adjusting this coefficient across training rounds, the global model focuses on feature learning in the early stages and on classification learning in the later stages. We compare SWIM with four state-of-the-art federated learning algorithms on three datasets, and the results show that it outperforms all four in terms of accuracy. Source code is available at https://github.com/zhanghrswpu/SWIM.
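The abstract describes three mechanisms: a sliding window of representations from historical local models, a cosine-similarity discriminator that sorts those representations into positives and negatives, and a round-dependent coefficient that shifts weight from feature (contrastive) learning to classification learning. The PyTorch-style sketch below illustrates one way these pieces could fit together based only on the abstract; the window size, temperature, gating threshold, coefficient schedule, and the names `swim_loss` and `dynamic_coefficient` are illustrative assumptions, not the authors' implementation (the linked GitHub repository is the reference code).

```python
# Illustrative sketch only: window size, temperature, threshold, and the
# coefficient schedule below are assumptions made for this example, not
# values taken from the paper or its repository.
from collections import deque

import torch
import torch.nn.functional as F

# Sliding window holding batch representations produced by historical local
# models; deque(maxlen=W) evicts the oldest round automatically.
WINDOW_SIZE = 5  # assumed W
window: deque = deque(maxlen=WINDOW_SIZE)


def swim_loss(z_local, z_global, hist_reps, tau=0.5, threshold=0.0):
    """Contrastive loss over a sliding window of historical representations.

    z_local   -- representations under the current local model, shape (B, d)
    z_global  -- representations under the received global model, shape (B, d)
    hist_reps -- iterable of (B, d) representations from historical local models
    """
    # The global representation is always treated as a positive sample.
    pos = [F.cosine_similarity(z_local, z_global, dim=-1)]
    neg = []
    for z_hist in hist_reps:
        # Cosine similarity acts as the discriminator: historical models whose
        # representations are close to the global one count as positives, the
        # rest as negatives (the exact gating rule here is an assumption).
        gate = F.cosine_similarity(z_hist, z_global, dim=-1).mean()
        sim = F.cosine_similarity(z_local, z_hist, dim=-1)
        (pos if gate > threshold else neg).append(sim)

    pos = torch.stack(pos, dim=1) / tau                      # (B, P)
    if not neg:                                              # window still empty
        return -pos.mean()
    neg = torch.stack(neg, dim=1) / tau                      # (B, N)
    # InfoNCE-style term averaged over positives: pull toward positives,
    # push away from negatives.
    denom = torch.logsumexp(torch.cat([pos, neg], dim=1), dim=1, keepdim=True)
    return -(pos - denom).mean()


def dynamic_coefficient(round_t, total_rounds):
    """Assumed linear schedule: emphasize feature (contrastive) learning early
    and classification learning late, as described in the abstract."""
    return 1.0 - round_t / max(total_rounds, 1)


if __name__ == "__main__":
    B, d, num_classes = 8, 128, 10
    z_local, z_global = torch.randn(B, d), torch.randn(B, d)
    for _ in range(3):                       # pretend three historical rounds
        window.append(torch.randn(B, d))
    logits = torch.randn(B, num_classes)     # classifier outputs of local model
    labels = torch.randint(0, num_classes, (B,))

    mu = dynamic_coefficient(round_t=10, total_rounds=100)
    total = mu * swim_loss(z_local, z_global, window) \
        + (1.0 - mu) * F.cross_entropy(logits, labels)
    print(f"mu={mu:.2f}, total loss={total.item():.4f}")
```

Using the similarity to the global representation as the gating signal mirrors the abstract's description of the cosine distance function as a discriminator; averaging the contrastive term over multiple positives is one reasonable way to handle a window that contributes more than one positive.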
Source journal
CiteScore: 19.90 · Self-citation rate: 2.70% · Articles published: 376 · Average review time: 10.6 months
Journal description: Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications. Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration. Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.