Multi-objective federated learning: Balancing global performance and individual fairness

IF 6.2 · CAS Zone 2 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, THEORY & METHODS)
Journal: Future Generation Computer Systems: The International Journal of eScience
DOI: 10.1016/j.future.2024.07.046
Published: 2024-07-31 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0167739X24004199
Citations: 0

Abstract


In federated learning, non-IID data not only diminishes the performance of the global model but also gives rise to a fairness problem, which manifests as increased variance of the global model's accuracy across clients. Fairness issues can cause the global model to perform poorly, or even fail, on certain clients. Existing methods for the fairness problem in federated learning tend to neglect the joint improvement of both the average performance and the fairness of the global model. To address this, this paper introduces FedMC, a multi-objective optimization method for fine-tuning the global gradient. The primary objective is the average loss function over all clients; the sub-objective fine-tunes the global gradient by reducing the conflict between the global gradient and the local gradients. Specifically, we refine the global gradient by incorporating a sub-optimization objective that alleviates the conflict between the global gradient and the local gradient with the largest deviation; we denote this method FedMC. FedMC can improve the performance and convergence rate of clients that initially perform poorly, albeit at the cost of the early convergence rate of clients that initially perform well; nevertheless, the latter still reach the accuracy level they achieved before fine-tuning. We further propose FedMC+, which builds three additional optimization mechanisms on top of the FedMC objective: hyperparameter decay, a sliding-window mechanism, and data-balanced client selection. We also present a theoretical analysis of the convergence rate of FedMC, demonstrating that it converges to a Pareto-stationary solution. Our experimental results confirm that FedMC+ achieves an average 4.5% improvement in accuracy and a 22% reduction in the degree of dispersion compared with state-of-the-art federated learning (FL) methods.
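To make the gradient fine-tuning step concrete, below is a minimal NumPy sketch of the kind of adjustment the abstract describes: average the local gradients, find the one that deviates most from the average, and reduce the conflict with it. The deviation measure (cosine similarity) and the conflict-removal step (a PCGrad-style projection) are assumptions made here for illustration; the paper formulates its own sub-optimization objective, which may differ.

```python
import numpy as np

def fedmc_finetune(local_grads: list[np.ndarray]) -> np.ndarray:
    """Sketch of a FedMC-style global-gradient fine-tuning step.

    Primary objective: the average of all clients' local gradients.
    Sub-objective: reduce the conflict between that average and the
    single local gradient that deviates from it the most.
    """
    # Primary objective: plain average of the clients' local gradients.
    g_global = np.mean(local_grads, axis=0)

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity; the small epsilon guards against zero norms.
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    # Local gradient with the largest deviation from the global gradient
    # (cosine similarity is our assumed deviation measure).
    worst = min(local_grads, key=lambda g_i: cosine(g_global, g_i))

    # If the two gradients conflict (negative inner product), remove the
    # conflicting component by projecting the global gradient onto the
    # normal plane of the worst local gradient (PCGrad-style projection,
    # used here for illustration only).
    if g_global @ worst < 0:
        g_global = g_global - (g_global @ worst) / (worst @ worst) * worst
    return g_global
```

Under this reading, clients whose gradients point away from the average stop being overridden by it, which matches the abstract's claim that initially poor-performing clients gain performance while well-performing clients temporarily converge more slowly.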
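The three FedMC+ mechanisms are only named in the abstract, not specified. The sketch below shows one plausible shape for each; every function name, signature, and formula here is a hypothetical illustration (exponential decay, a fixed-size gradient window, and size-weighted client sampling), not the paper's implementation.

```python
import random
from collections import deque

def decayed_coefficient(base: float, decay: float, round_t: int) -> float:
    """Hyperparameter decay: shrink the sub-objective's coefficient as
    training progresses (exponential decay assumed for illustration)."""
    return base * (decay ** round_t)

class SlidingWindow:
    """Sliding-window mechanism: retain only the most recent items
    (e.g., per-client gradients or accuracies), so the conflict check
    reflects recent rounds rather than the whole history."""
    def __init__(self, size: int):
        self.buffer = deque(maxlen=size)  # old items fall off automatically

    def push(self, item) -> None:
        self.buffer.append(item)

    def items(self) -> list:
        return list(self.buffer)

def data_balanced_selection(client_sizes: dict[int, int], k: int) -> list[int]:
    """Data-balanced client selection: sample k clients with probability
    proportional to local data size, so the selected cohort's combined
    data is less skewed (one plausible reading of 'data-balanced')."""
    ids = list(client_sizes)
    weights = [client_sizes[c] for c in ids]
    return random.choices(ids, weights=weights, k=k)
```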

Source journal
Future Generation Computer Systems: The International Journal of eScience
CiteScore: 19.90 · Self-citation rate: 2.70% · Articles per year: 376 · Review time: 10.6 months

About the journal: Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications. Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration. Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.