Federated Aggregation With Interlayer Personalized Contribution: Preference-Based Optimization Between Performance and Privacy

IF 8.9 · CAS Tier 1 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
Xiaoting Sun;Zhong Li;Changjun Jiang
DOI: 10.1109/TNNLS.2025.3552206
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 9, pp. 17071-17085
Published: 2025-04-30 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10981479/
Citations: 0

Abstract

Currently, because data are distributed differently across users, many personalized federated learning (PFL) methods have emerged to meet users' individual needs. However, existing methods have two problems: 1) the aggregation process does not account for the contributions of the individual layers inside the client model and 2) during training, it is difficult to match the quantitative weights assigned to user privacy protection and performance with users' qualitative preferences. We therefore first propose FedIPC, a framework for federated aggregation with interlayer personalized contribution, which aggregates models according to the contribution of each internal layer and improves client model performance. On top of this framework, we design FedAPI-NSGA-II, a multiobjective federated optimization method based on adaptive preference indicators and the nondominated sorting genetic algorithm II (NSGA-II). This method matches quantitative weights to qualitative user preferences and adaptively selects Pareto-optimal solutions during optimization. Extensive experiments on two image datasets and a tabular dataset show that the proposed method not only accelerates model convergence but also clearly improves model performance. In addition, the proposed method accurately matches users' qualitative preferences, balancing model performance and privacy protection according to those preferences.
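To make the first idea concrete: unlike FedAvg, which weights each client model with a single scalar, interlayer aggregation assigns each layer its own per-client weights. The sketch below is a minimal illustration of that scheme, not the paper's actual FedIPC algorithm; the layer names, the normalization step, and the contribution values are all hypothetical.

```python
import numpy as np

def layerwise_aggregate(client_layers, contributions):
    """Aggregate a global model layer by layer, with separate
    per-client weights for each layer (FedAvg uses one scalar
    per client for all layers instead).

    client_layers: list over clients, each a dict layer_name -> np.ndarray
    contributions: dict layer_name -> list of per-client contribution scores
    """
    agg = {}
    for name in client_layers[0]:
        w = np.asarray(contributions[name], dtype=float)
        w = w / w.sum()  # normalize contributions within this layer
        stacked = np.stack([c[name] for c in client_layers])
        # weighted sum over the client axis
        agg[name] = np.tensordot(w, stacked, axes=1)
    return agg

# toy example: 3 clients, 2 layers (names and values are illustrative)
clients = [{"fc1": np.ones((2, 2)) * k, "fc2": np.ones(2) * k}
           for k in (1.0, 2.0, 3.0)]
contrib = {"fc1": [0.2, 0.3, 0.5],   # layer fc1: client 3 contributes most
           "fc2": [1.0, 1.0, 1.0]}   # layer fc2: plain average
global_model = layerwise_aggregate(clients, contrib)
```

With these toy numbers, `fc1` becomes 0.2·1 + 0.3·2 + 0.5·3 = 2.3 in every entry, while `fc2` is the plain mean 2.0: the same three clients influence the two layers differently, which is the point of interlayer contribution.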
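The second idea, NSGA-II-based selection between performance and privacy, rests on non-dominated sorting: a configuration is Pareto-optimal if no other configuration is at least as good on both objectives and strictly better on one. The sketch below extracts a Pareto front over two minimized objectives and then picks the front member matching a user's preference weights; it is a simplified stand-in, not the paper's adaptive-preference-indicator method, and the objective values are invented.

```python
def dominates(a, b):
    """a dominates b if a is no worse on every objective and
    strictly better on at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Indices of the non-dominated points."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p)
                       for j, q in enumerate(points) if j != i)]

def pick_by_preference(points, weights):
    """From the Pareto front, choose the point that minimizes the
    preference-weighted sum of objectives."""
    front = pareto_front(points)
    return min(front, key=lambda i: sum(w * x
                                        for w, x in zip(weights, points[i])))

# hypothetical candidates: (model error, privacy leakage), both minimized
solutions = [(0.10, 0.9), (0.20, 0.5), (0.40, 0.2),
             (0.30, 0.6), (0.15, 0.8)]

perf_choice = pick_by_preference(solutions, (0.9, 0.1))   # performance-first
priv_choice = pick_by_preference(solutions, (0.1, 0.9))   # privacy-first
```

Here (0.30, 0.6) is dominated by (0.20, 0.5) and drops out; a performance-heavy preference selects the low-error point (0.10, 0.9), while a privacy-heavy preference selects (0.40, 0.2), illustrating how qualitative preferences can steer the choice among Pareto-optimal trade-offs.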
Source journal
IEEE Transactions on Neural Networks and Learning Systems — Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture
CiteScore: 23.80
Self-citation rate: 9.60%
Articles per year: 2102
Review time: 3-8 weeks
Journal introduction: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.