Slingshot: Globally Favorable Local Updates for Federated Learning

Jialiang Liu;Huawei Huang;Chun Wang;Sicong Zhou;Ruixin Li;Zibin Zheng
{"title":"弹弓:联盟学习的全局有利局部更新","authors":"Jialiang Liu;Huawei Huang;Chun Wang;Sicong Zhou;Ruixin Li;Zibin Zheng","doi":"10.1109/OJCS.2024.3356599","DOIUrl":null,"url":null,"abstract":"Federated Learning (FL), as a promising distributed learning paradigm, is proposed to solve the contradiction between the data hunger of modern machine learning and the increasingly stringent need for data privacy. However, clients naturally present different distributions of their local data and inconsistent local optima, which leads to poor model performance of FL. Many previous methods focus on mitigating objective inconsistency. Although local objective consistency can be guaranteed when the number of communication rounds is infinite, we should notice that the accumulation of global drift and the limitation on the potential of local updates are non-negligible in those previous methods. In this article, we study a new framework for data-heterogeneity FL, in which the local updates in clients towards the global optimum can accelerate FL. We propose a new approach called \n<italic>Slingshot</i>\n. Slingshot's design goals are twofold, i.e., i) to retain the potential of local updates, and ii) to combine local and global trends. Experimental results show that \n<italic>Slingshot</i>\n helps local updates become more globally favorable and outperforms other popular methods under various FL settings. For example, on CIFAR10, \n<italic>Slingshot</i>\n achieves 46.52% improvement in test accuracy and 48.21× speedup for a lightweight neural network named \n<italic>SqueezeNet</i>\n.","PeriodicalId":13205,"journal":{"name":"IEEE Open Journal of the Computer Society","volume":"5 ","pages":"39-49"},"PeriodicalIF":0.0000,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10411043","citationCount":"0","resultStr":"{\"title\":\"Slingshot: Globally Favorable Local Updates for Federated Learning\",\"authors\":\"Jialiang Liu;Huawei Huang;Chun Wang;Sicong Zhou;Ruixin Li;Zibin Zheng\",\"doi\":\"10.1109/OJCS.2024.3356599\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated Learning (FL), as a promising distributed learning paradigm, is proposed to solve the contradiction between the data hunger of modern machine learning and the increasingly stringent need for data privacy. However, clients naturally present different distributions of their local data and inconsistent local optima, which leads to poor model performance of FL. Many previous methods focus on mitigating objective inconsistency. Although local objective consistency can be guaranteed when the number of communication rounds is infinite, we should notice that the accumulation of global drift and the limitation on the potential of local updates are non-negligible in those previous methods. In this article, we study a new framework for data-heterogeneity FL, in which the local updates in clients towards the global optimum can accelerate FL. We propose a new approach called \\n<italic>Slingshot</i>\\n. Slingshot's design goals are twofold, i.e., i) to retain the potential of local updates, and ii) to combine local and global trends. Experimental results show that \\n<italic>Slingshot</i>\\n helps local updates become more globally favorable and outperforms other popular methods under various FL settings. 
For example, on CIFAR10, \\n<italic>Slingshot</i>\\n achieves 46.52% improvement in test accuracy and 48.21× speedup for a lightweight neural network named \\n<italic>SqueezeNet</i>\\n.\",\"PeriodicalId\":13205,\"journal\":{\"name\":\"IEEE Open Journal of the Computer Society\",\"volume\":\"5 \",\"pages\":\"39-49\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-01-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10411043\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Open Journal of the Computer Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10411043/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of the Computer Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10411043/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Federated Learning (FL), a promising distributed learning paradigm, has been proposed to reconcile the data hunger of modern machine learning with increasingly stringent data-privacy requirements. However, clients naturally hold differently distributed local data and therefore pursue inconsistent local optima, which degrades the performance of FL models. Many previous methods focus on mitigating this objective inconsistency. Although they can guarantee local objective consistency as the number of communication rounds tends to infinity, the global drift they accumulate and the limits they place on the potential of local updates are non-negligible. In this article, we study a new framework for data-heterogeneous FL, in which local updates that steer clients toward the global optimum accelerate FL. We propose a new approach called Slingshot. Slingshot's design goals are twofold: i) to retain the potential of local updates, and ii) to combine local and global trends. Experimental results show that Slingshot helps local updates become more globally favorable and outperforms other popular methods under various FL settings. For example, on CIFAR10, Slingshot achieves a 46.52% improvement in test accuracy and a 48.21× speedup for a lightweight neural network named SqueezeNet.
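The abstract does not spell out Slingshot's actual update rule. As a purely illustrative sketch of what "combining local and global trends" during local updates can look like, the toy simulation below runs FedAvg-style rounds on a synthetic quadratic problem with heterogeneous clients and blends each local gradient step with the server's most recent global update direction. The blending coefficient beta, the global_dir signal, and the quadratic client objectives are all assumptions made for illustration; this is not the paper's method.

    # Hypothetical sketch of "globally favorable" local updates in FL.
    # NOT Slingshot's algorithm: the abstract gives no update rule, so this
    # toy blends each client's local gradient with the latest global update
    # direction. beta and global_dir are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Heterogeneous clients: each holds a different local optimum of a quadratic.
    local_optima = [rng.normal(loc=c, scale=0.1, size=2) for c in (-2.0, 0.5, 3.0)]

    def local_grad(w, opt):
        """Gradient of f_i(w) = 0.5 * ||w - opt||^2 for one client."""
        return w - opt

    def local_update(w_global, opt, global_dir, steps=5, lr=0.1, beta=0.5):
        """Run local SGD, nudging each step along the global trend.

        beta is an assumed blending coefficient: beta=0 recovers plain
        FedAvg-style local training; beta>0 mixes in the global direction.
        """
        w = w_global.copy()
        for _ in range(steps):
            g = local_grad(w, opt)
            w -= lr * ((1 - beta) * g - beta * global_dir)
        return w

    w = np.zeros(2)           # global model
    global_dir = np.zeros(2)  # last global update direction (assumed signal)
    for rnd in range(50):
        new_w = np.mean(
            [local_update(w, opt, global_dir) for opt in local_optima], axis=0
        )
        global_dir = new_w - w  # server-side trend reused in the next round
        w = new_w

    print("global model:", w)
    print("mean of local optima:", np.mean(local_optima, axis=0))

With beta=0 each client drifts toward its own optimum before averaging; with beta>0 the local trajectories are pulled along the direction the global model last moved, which is one plausible reading of making local updates "globally favorable." Any correspondence to Slingshot's real design would need to be checked against the full paper.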