SHE-SFL: An efficient and privacy-preserving heterogeneous federated split learning architecture based on homomorphic encryption

IF 6.2 · CAS Tier 2 (Computer Science) · Q1 COMPUTER SCIENCE, THEORY & METHODS
Jiaqi Xia, Meng Wu, Pengyong Li
Journal: Future Generation Computer Systems: The International Journal of eScience, Volume 175, Article 108101
DOI: 10.1016/j.future.2025.108101
Published: 2025-08-26 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0167739X25003954
Cited by: 0

Abstract

Federated Learning (FL) and Split Learning (SL) are distributed methods that enable collaborative model training without sharing raw data. Combining FL and SL leverages the benefits of computational offloading and model privacy while enhancing efficiency through parallel processing. However, both methods risk privacy leaks: FL is vulnerable to model inversion attacks and gradient leakage, whereas SL's data transmission during training can reveal sensitive information, potentially allowing attackers to reconstruct the original dataset. Current privacy protections often fall short of fully securing these systems and impose substantial computational and communication costs. In this work, we introduce SHE-SFL, an efficient, privacy-preserving federated split learning architecture based on fully homomorphic encryption. Specifically, we employ the CKKS scheme to encrypt activation values during forward propagation, gradients during backpropagation, and the model parameters shared at the aggregation stage. This ensures that all data leaving the client domain is encrypted. The architecture includes two key modules: SHE-SL encrypts and transmits ciphertext based on batch packing and adopts a sparsification strategy, reducing system overhead and enabling training of heterogeneous models; SHE-Aggr improves the efficiency of encrypting model parameters during the aggregation phase and fully supports encrypted weighted aggregation. Extensive experimental results demonstrate that the proposed SHE-SFL provides comprehensive protection for the federated split learning architecture with minimal impact on model performance, effectively safeguarding client privacy while significantly reducing system overhead.
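To make the data flow concrete, the sketch below mimics the two ciphertext operations the architecture relies on: a client sparsifies its update, batch-packs it into one "ciphertext", and the server computes a weighted aggregate entirely on ciphertexts. This is an illustrative mock, not the paper's implementation: `MockCKKSVector` only imitates the slot-wise addition and plaintext-scalar multiplication that real CKKS provides (no actual encryption), and top-k magnitude pruning is one possible sparsification rule, not necessarily the paper's.

```python
class MockCKKSVector:
    """Stand-in for a batch-packed CKKS ciphertext (hypothetical, NOT real
    encryption). It mimics only the homomorphic operations used here:
    slot-wise addition and multiplication by a plaintext scalar."""

    def __init__(self, slots):
        self.slots = list(slots)  # one plaintext value per ciphertext slot

    def __add__(self, other):
        # Homomorphic addition: Enc(a) + Enc(b) -> Enc(a + b), slot-wise.
        return MockCKKSVector(a + b for a, b in zip(self.slots, other.slots))

    def __mul__(self, scalar):
        # Plaintext-scalar multiplication: Enc(a) * w -> Enc(w * a).
        return MockCKKSVector(v * scalar for v in self.slots)

    def decrypt(self):
        return self.slots


def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries; the rest become zero
    (one possible sparsification strategy to cut ciphertext payload)."""
    keep = set(sorted(range(len(grad)), key=lambda i: abs(grad[i]))[-k:])
    return [g if i in keep else 0.0 for i, g in enumerate(grad)]


# Two clients sparsify their local updates, then "encrypt" them batch-packed.
g1 = topk_sparsify([0.9, -0.1, 0.05, -1.2], k=2)
g2 = topk_sparsify([0.3, 0.8, -0.02, 0.4], k=2)
c1, c2 = MockCKKSVector(g1), MockCKKSVector(g2)

# Server-side encrypted weighted aggregation: sum of (weight * ciphertext),
# computed without ever decrypting the client updates.
w1, w2 = 0.25, 0.75  # e.g. proportional to client dataset sizes
agg = c1 * w1 + c2 * w2
print(agg.decrypt())
```

Decrypting `agg` yields the same result as the plaintext weighted average `w1*g1 + w2*g2`, which is the property that lets the aggregation server remain oblivious to individual client parameters.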
Source journal metrics: CiteScore 19.90 · Self-citation rate 2.70% · Articles per year: 376 · Review time: 10.6 months
Journal description: Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications. Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration. Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.