In-Network Aggregation for Privacy-Preserving Federated Learning

Fahao Chen, Peng Li, T. Miyazaki
{"title":"In-Network Aggregation for Privacy-Preserving Federated Learning","authors":"Fahao Chen, Peng Li, T. Miyazaki","doi":"10.1109/ict-dm52643.2021.9664035","DOIUrl":null,"url":null,"abstract":"Cross-silo federated learning becomes popular in various fields due to its great promises in protecting training data. By carefully examining the interaction among distributed training nodes, we find that existing federated learning still suffers from security weakness and network bottleneck during model synchronization. It has no protection on training models, which also contain significant private information. In addition, many evidences have shown that model synchronization over wide-area network is slow, bottlenecking the whole learning process. To fill this research gap, we propose a novel cross-silo federated learning architecture that can protect both training data and model by using homomorphic encryption (HE). Instead of sharing the model parameters in plaintexts, we encrypt them using the HE, so that they can be aggregated in ciphertexts. In order to handle the inflated network traffic incurred by HE, we apply the in-network aggregation by exploiting the strong capability of programmable switches. 
A fast algorithm that jointly considers in-network aggregator placement and traffic engineering has been proposed and evaluated by extensive simulations.","PeriodicalId":337000,"journal":{"name":"2021 International Conference on Information and Communication Technologies for Disaster Management (ICT-DM)","volume":"366 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Information and Communication Technologies for Disaster Management (ICT-DM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ict-dm52643.2021.9664035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Cross-silo federated learning has become popular in many fields due to its great promise in protecting training data. By carefully examining the interaction among distributed training nodes, we find that existing federated learning still suffers from security weaknesses and network bottlenecks during model synchronization. It offers no protection for the training models themselves, which also contain significant private information. In addition, much evidence has shown that model synchronization over wide-area networks is slow, bottlenecking the whole learning process. To fill this research gap, we propose a novel cross-silo federated learning architecture that protects both the training data and the model using homomorphic encryption (HE). Instead of sharing model parameters in plaintext, we encrypt them with HE so that they can be aggregated as ciphertexts. To handle the inflated network traffic incurred by HE, we apply in-network aggregation by exploiting the strong capabilities of programmable switches. A fast algorithm that jointly considers in-network aggregator placement and traffic engineering is proposed and evaluated through extensive simulations.
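The core idea of aggregating encrypted parameters can be illustrated with an additively homomorphic scheme such as Paillier: ciphertexts are multiplied, and the product decrypts to the sum of the plaintexts, so an aggregator never sees any individual update. The sketch below is a minimal textbook Paillier demo with toy parameters (insecure key sizes, integer-quantized updates); it is not the paper's implementation, and all names are illustrative.

```python
import random
from math import gcd

# Minimal textbook Paillier (toy primes; NOT secure) illustrating why an
# aggregator can sum encrypted model updates without seeing plaintexts.

p, q = 293, 433          # demo primes; real deployments use >= 2048-bit keys
n = p * q
n2 = n * n
g = n + 1                # standard generator choice g = n + 1
lam = (p - 1) * (q - 1)  # phi(n); valid here since gcd(phi(n), n) == 1
mu = pow(lam, -1, n)     # modular inverse of lambda, used in decryption

def encrypt(m):
    # c = g^m * r^n mod n^2, with random r coprime to n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Each silo encrypts its (integer-quantized) parameter update...
updates = [7, 11, 5]
cts = [encrypt(u) for u in updates]

# ...and the aggregator multiplies ciphertexts, which adds plaintexts:
agg = 1
for c in cts:
    agg = (agg * c) % n2

assert decrypt(agg) == sum(updates)
```

Note how aggregation touches only ciphertexts; this is also what makes the traffic inflation the abstract mentions unavoidable, since each encrypted value is far larger than the plaintext parameter it protects.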