DSAFL: Decentralized secure aggregation with communication path optimization for cross-silo federated learning

IF 4.6 · CAS Tier 2 (Computer Science) · JCR Q1, COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
Ling Li, Cheng Guo, Xinyu Tang, Yining Liu
{"title":"DSAFL:用于跨竖井联邦学习的具有通信路径优化的分散安全聚合","authors":"Ling Li ,&nbsp;Cheng Guo ,&nbsp;Xinyu Tang ,&nbsp;Yining Liu","doi":"10.1016/j.comnet.2025.111732","DOIUrl":null,"url":null,"abstract":"<div><div>Cross-Silo Federated Learning (CSFL) facilitates collaborative machine learning (ML) across organizations by locally training models and centrally aggregating model updates. Currently, this approach is shifting to decentralized aggregation due to the limitations of centralized aggregation such as single-point failures and network congestion. However, existing decentralized aggregation methods often suffer from privacy leakage and high communication cost. To address these issues, we propose DSAFL, a decentralized secure aggregation scheme for CSFL. In DSAFL, we present a staged secure aggregation method based on multi-key homomorphic encryption, which enables load-balanced collaborative aggregation computation across clients while preserving model update confidentiality and providing verifiability of the aggregation result. DSAFL optimizes communication paths by jointly considering communication cost and reliability, enabling cost-efficient and robust secure aggregation across diverse network topologies, and further reduces communication cost through non-interactive decryption. The security analysis proves that DSAFL is semi-honestly secure and resistant to client collusion attacks. The experimental results confirm the practicality and applicability of DSAFL, and show significant advantages in both accuracy and privacy. With a combination of computational balancing, low communication cost, and privacy preservation, DSAFL provides a solution for enabling sustainable ML collaboration across organizations.</div></div>","PeriodicalId":50637,"journal":{"name":"Computer Networks","volume":"272 ","pages":"Article 111732"},"PeriodicalIF":4.6000,"publicationDate":"2025-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DSAFL:Decentralized secure aggregation with communication path optimization for cross-silo federated learning\",\"authors\":\"Ling Li ,&nbsp;Cheng Guo ,&nbsp;Xinyu Tang ,&nbsp;Yining Liu\",\"doi\":\"10.1016/j.comnet.2025.111732\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Cross-Silo Federated Learning (CSFL) facilitates collaborative machine learning (ML) across organizations by locally training models and centrally aggregating model updates. Currently, this approach is shifting to decentralized aggregation due to the limitations of centralized aggregation such as single-point failures and network congestion. However, existing decentralized aggregation methods often suffer from privacy leakage and high communication cost. To address these issues, we propose DSAFL, a decentralized secure aggregation scheme for CSFL. In DSAFL, we present a staged secure aggregation method based on multi-key homomorphic encryption, which enables load-balanced collaborative aggregation computation across clients while preserving model update confidentiality and providing verifiability of the aggregation result. DSAFL optimizes communication paths by jointly considering communication cost and reliability, enabling cost-efficient and robust secure aggregation across diverse network topologies, and further reduces communication cost through non-interactive decryption. The security analysis proves that DSAFL is semi-honestly secure and resistant to client collusion attacks. 
The experimental results confirm the practicality and applicability of DSAFL, and show significant advantages in both accuracy and privacy. With a combination of computational balancing, low communication cost, and privacy preservation, DSAFL provides a solution for enabling sustainable ML collaboration across organizations.</div></div>\",\"PeriodicalId\":50637,\"journal\":{\"name\":\"Computer Networks\",\"volume\":\"272 \",\"pages\":\"Article 111732\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2025-09-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S138912862500698X\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S138912862500698X","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Citations: 0

Abstract

Cross-Silo Federated Learning (CSFL) facilitates collaborative machine learning (ML) across organizations by training models locally and aggregating model updates centrally. This approach is now shifting toward decentralized aggregation because centralized aggregation suffers from limitations such as single points of failure and network congestion. However, existing decentralized aggregation methods often suffer from privacy leakage and high communication costs. To address these issues, we propose DSAFL, a decentralized secure aggregation scheme for CSFL. In DSAFL, we present a staged secure aggregation method based on multi-key homomorphic encryption, which enables load-balanced collaborative aggregation across clients while preserving model update confidentiality and providing verifiability of the aggregation result. DSAFL optimizes communication paths by jointly considering communication cost and reliability, enabling cost-efficient and robust secure aggregation across diverse network topologies, and further reduces communication cost through non-interactive decryption. The security analysis proves that DSAFL is secure in the semi-honest model and resistant to client collusion attacks. The experimental results confirm the practicality and applicability of DSAFL and show significant advantages in both accuracy and privacy. By combining computational load balancing, low communication cost, and privacy preservation, DSAFL provides a solution for sustainable ML collaboration across organizations.
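The sketch below illustrates the core idea behind encrypted aggregation of model updates: with an additively homomorphic cryptosystem, clients' ciphertexts can be combined into an encrypted sum without exposing any individual update. DSAFL itself uses multi-key homomorphic encryption with staged, load-balanced aggregation and non-interactive decryption; this toy single-key Paillier code (insecurely small demo primes, fixed-point quantization, a single decryptor) is only a simplified illustration of the homomorphic-sum idea, not the paper's construction.

```python
# Illustrative sketch (NOT DSAFL's actual scheme): single-key Paillier-style
# additive homomorphic encryption used to sum quantized model updates under
# encryption. Demo primes are far too small for real security.
import secrets
from math import gcd

def _lcm(a, b):
    return a * b // gcd(a, b)

def keygen(p=10007, q=10009):
    """Toy Paillier keypair from two small demo primes (insecure key size)."""
    n = p * q
    lam = _lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)           # valid here because gcd(lam, n) == 1
    return n, (n, lam, mu)         # public key n, private key (n, lam, mu)

def encrypt(n, m):
    """Encrypt integer m (taken mod n); g = n + 1 is the standard simple generator."""
    n2 = n * n
    r = secrets.randbelow(n - 2) + 2
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    return (pow(n + 1, m % n, n2) * pow(r, n, n2)) % n2

def add(n, c1, c2):
    """Homomorphic addition: E(m1) * E(m2) mod n^2 decrypts to m1 + m2."""
    return (c1 * c2) % (n * n)

def decrypt(priv, c):
    n, lam, mu = priv
    x = pow(c, lam, n * n)
    m = ((x - 1) // n) * mu % n
    return m - n if m > n // 2 else m   # map back to a signed range

# Each client encrypts a fixed-point-quantized scalar update; ciphertexts are
# multiplied (i.e., homomorphically added) and only the sum is ever decrypted.
SCALE = 10_000
updates = [0.12, -0.05, 0.33]                       # toy per-client updates
pub, priv = keygen()
ciphertexts = [encrypt(pub, round(u * SCALE)) for u in updates]
agg = ciphertexts[0]
for c in ciphertexts[1:]:
    agg = add(pub, agg, c)
print(decrypt(priv, agg) / SCALE)                   # 0.4, the aggregated update
```

The abstract also says communication paths are chosen by jointly considering communication cost and reliability. The exact objective used in DSAFL is not given here, so the following sketch assumes one common formulation: a path's reliability is the product of its link reliabilities, which becomes additive under a negative logarithm, so a single Dijkstra run over the combined edge weight cost + lam * (-log reliability) picks a cheap yet dependable route between two clients. The graph format, the weight lam, and the linear combination are all assumptions for illustration.

```python
# Hypothetical path-selection sketch: jointly weigh link cost and reliability.
# The objective and data layout are assumptions, not DSAFL's published method.
import heapq
import math

def best_path(edges, src, dst, lam=1.0):
    """edges: {node: [(neighbor, cost, reliability in (0, 1]), ...]}."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, math.inf):
            continue                                  # stale queue entry
        if u == dst:
            break
        for v, cost, rel in edges.get(u, []):
            nd = d + cost + lam * (-math.log(rel))    # combined weight
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None, math.inf                         # no route found
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

# Toy topology: A can reach C either over a cheap but flaky direct link or via
# B over costlier yet more reliable links; with lam = 3.0 the two-hop route wins.
topology = {
    "A": [("C", 1.0, 0.50), ("B", 1.2, 0.99)],
    "B": [("C", 1.2, 0.99)],
    "C": [],
}
print(best_path(topology, "A", "C", lam=3.0))         # (['A', 'B', 'C'], ~2.46)
```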
Source journal: Computer Networks (Engineering – Telecommunications)
CiteScore: 10.80 · Self-citation rate: 3.60% · Articles per year: 434 · Review time: 8.6 months
Journal description: Computer Networks is an international, archival journal providing a publication vehicle for complete coverage of all topics of interest to those involved in the computer communications networking area. The audience includes researchers, managers and operators of networks as well as designers and implementors. The Editorial Board will consider any material for publication that is of interest to those groups.