Privacy-Enhanced Knowledge Transfer with Collaborative Split Learning over Teacher Ensembles

Ziyao Liu, Jiale Guo, Mengmeng Yang, Wenzhuo Yang, Jiani Fan, Kwok-Yan Lam
{"title":"Privacy-Enhanced Knowledge Transfer with Collaborative Split Learning over Teacher Ensembles","authors":"Ziyao Liu, Jiale Guo, Mengmeng Yang, Wenzhuo Yang, Jiani Fan, Kwok-Yan Lam","doi":"10.1145/3591197.3591303","DOIUrl":null,"url":null,"abstract":"Knowledge Transfer has received much attention for its ability to transfer knowledge, rather than data, from one application task to another. In order to comply with the stringent data privacy regulations, privacy-preserving knowledge transfer is highly desirable. The Private Aggregation of Teacher Ensembles (PATE) scheme is one promising approach to address this privacy concern while supporting knowledge transfer from an ensemble of \"teacher\" models to a \"student\" model under the coordination of an aggregator. To further protect the data privacy of the student node, the privacy-enhanced version of PATE makes use of cryptographic techniques at the expense of heavy computation overheads at the teacher nodes. However, this inevitably hinders the adoption of knowledge transfer due to the highly disparate computational capability of teachers. Besides, in real-life systems, participating teachers may drop out of the system at any time, which causes new security risks for adopted cryptographic building blocks. Thus, it is desirable to devise privacy-enhanced knowledge transfer that can run on teacher nodes with relatively fewer computational resources and can preserve privacy with dropped teacher nodes. In this connection, we propose a dropout-resilient and privacy-enhanced knowledge transfer scheme, Collaborative Split learning over Teacher Ensembles (CSTE), that supports the participating teacher nodes to train and infer their local models using split learning. CSTE not only allows the compute-intensive processing to be performed at a split learning server, but also protects the data privacy of teacher nodes from collusion between the student node and aggregator. Experimental results showed that CSTE achieves significant efficiency improvement from existing schemes.","PeriodicalId":128846,"journal":{"name":"Proceedings of the 2023 Secure and Trustworthy Deep Learning Systems Workshop","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 Secure and Trustworthy Deep Learning Systems Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3591197.3591303","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Knowledge transfer has received much attention for its ability to transfer knowledge, rather than data, from one application task to another. To comply with stringent data privacy regulations, privacy-preserving knowledge transfer is highly desirable. The Private Aggregation of Teacher Ensembles (PATE) scheme is one promising approach to this privacy concern: it supports knowledge transfer from an ensemble of "teacher" models to a "student" model under the coordination of an aggregator. To further protect the data privacy of the student node, the privacy-enhanced version of PATE makes use of cryptographic techniques at the expense of heavy computational overhead at the teacher nodes. This inevitably hinders adoption, because the computational capabilities of teachers are highly disparate. Moreover, in real-life systems, participating teachers may drop out at any time, which introduces new security risks for the adopted cryptographic building blocks. It is therefore desirable to devise a privacy-enhanced knowledge transfer scheme that can run on teacher nodes with limited computational resources and that preserves privacy even when teacher nodes drop out. To this end, we propose a dropout-resilient and privacy-enhanced knowledge transfer scheme, Collaborative Split learning over Teacher Ensembles (CSTE), which allows participating teacher nodes to train and run inference on their local models using split learning. CSTE not only offloads the compute-intensive processing to a split learning server, but also protects the data privacy of teacher nodes against collusion between the student node and the aggregator. Experimental results show that CSTE achieves significant efficiency improvements over existing schemes.
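For readers unfamiliar with the two building blocks the abstract combines, the sketch below illustrates them in simplified form; it is not the authors' implementation, and the layer sizes, noise scale, class count, and number of teachers are illustrative assumptions. Each teacher keeps only a shallow "client" part of its model locally and offloads the deep layers to a split-learning server, and the teachers' predicted labels for a student query are then combined with PATE-style noisy-argmax aggregation. The cryptographic protection that CSTE adds against collusion between the student node and the aggregator is omitted here.

```python
# Minimal sketch (illustrative, not the CSTE implementation) of:
#   (1) split inference: a teacher node runs shallow layers locally and the
#       split-learning server runs the compute-intensive layers, and
#   (2) PATE-style noisy-argmax aggregation of teacher votes for a student query.
import numpy as np
import torch
import torch.nn as nn

NUM_CLASSES = 10  # assumed number of labels

class TeacherClientPart(nn.Module):
    """Shallow layers kept on the (resource-limited) teacher node."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(784, 64), nn.ReLU())

    def forward(self, x):
        # Produces the cut-layer ("smashed") activation sent to the server.
        return self.layers(x)

class TeacherServerPart(nn.Module):
    """Deep, compute-intensive layers offloaded to the split-learning server."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(64, 256), nn.ReLU(),
                                    nn.Linear(256, NUM_CLASSES))

    def forward(self, h):
        return self.layers(h)

def teacher_predict(client, server, x):
    """Split inference: the teacher computes the cut-layer activation locally,
    the server finishes the forward pass and returns a predicted label."""
    with torch.no_grad():
        h = client(x)        # runs on the teacher node
        logits = server(h)   # runs on the split-learning server
    return int(logits.argmax(dim=-1))

def noisy_aggregate(votes, eps=1.0, rng=None):
    """PATE-style noisy argmax over teacher votes (Laplace mechanism).
    Teachers that have dropped out simply contribute no vote."""
    rng = rng or np.random.default_rng()
    counts = np.bincount(votes, minlength=NUM_CLASSES).astype(float)
    counts += rng.laplace(scale=1.0 / eps, size=NUM_CLASSES)
    return int(np.argmax(counts))

if __name__ == "__main__":
    teachers = [(TeacherClientPart(), TeacherServerPart()) for _ in range(5)]
    query = torch.randn(1, 784)  # a single unlabeled student query
    votes = [teacher_predict(c, s, query) for c, s in teachers]
    print("student receives label:", noisy_aggregate(votes))
```

The split keeps only a small first block on each teacher node, which is what lets low-resource teachers participate; the noisy aggregation is what the privacy-enhanced variants additionally protect with cryptography so that neither the student nor the aggregator learns individual teacher votes.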