When Federated Learning Meets Knowledge Distillation to Secure Consumer Edge Network

IF 10.9 · JCR Q1 (Engineering, Electrical & Electronic) · CAS Region 2 (Computer Science)
Zakaria Abou El Houda;Hajar Moudoud;Bouziane Brik
DOI: 10.1109/TCE.2025.3559004
IEEE Transactions on Consumer Electronics, vol. 71, no. 2, pp. 7192-7200
Published: 2025-04-08 (Journal Article; not open access)
Citations: 0

Abstract

When Federated Learning Meets Knowledge Distillation to Secure Consumer Edge Network
Consumer networks face several security challenges due to the distributed nature of edge devices and the sensitive data they handle. Federated Learning (FL) presents a promising paradigm for collaborative model training in distributed environments. However, its implementation in edge consumer networks raises concerns about model heterogeneity, communication efficiency, and reverse engineering attacks. To address these issues, in this paper, we introduce SKDFL, a novel framework that leverages Knowledge Distillation (KD) and Secure Multi-Party Computation (SMPC) techniques to enhance communication efficiency while preserving data privacy in edge consumer networks. Through the use of KD, the distilled knowledge is transmitted between devices, significantly reducing communication overhead. Additionally, we incorporate lightweight encryption mechanisms to protect soft-labels from reverse engineering attacks using SMPC. We evaluate our proposed framework using two public datasets and demonstrate its efficiency in reducing communication costs, achieving up to a 92.4% reduction compared to conventional FL methods. Moreover, SKDFL achieves high performances in terms of accuracy and F1-score in both binary and multi-class classification while preserving the privacy of clients. Our obtained results show the potential of SKDFL to address the challenges of communication efficiency and data privacy in FL for edge consumer networks, paving the way for secure and efficient collaborative learning in consumer networks.
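The abstract combines two mechanisms: clients exchange distilled soft labels rather than full model weights, and a lightweight SMPC scheme keeps those soft labels hidden from any single observer. The paper's actual construction is not given in the abstract; the sketch below is purely illustrative, pairing temperature-scaled softmax (the standard way distillation produces soft labels) with pairwise zero-sum masking, a common secure-aggregation primitive. All names and values here are hypothetical, not SKDFL's implementation.

```python
import math
import random

def soft_labels(logits, T=3.0):
    """Temperature-scaled softmax: higher T spreads probability mass,
    exposing the inter-class similarities that distillation transfers."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def zero_sum_masks(n_clients, dim, rng):
    """Pairwise masks that cancel in aggregate: for each pair (i, j),
    client i adds a random vector r and client j subtracts it, so the
    per-client uploads look random but their sum is the true total."""
    masks = [[0.0] * dim for _ in range(n_clients)]
    for i in range(n_clients):
        for j in range(i + 1, n_clients):
            r = [rng.uniform(-10, 10) for _ in range(dim)]
            for k in range(dim):
                masks[i][k] += r[k]
                masks[j][k] -= r[k]
    return masks

# Toy round: 3 clients, 4 classes (logits are made up for illustration).
rng = random.Random(0)
client_logits = [[2.0, 1.0, 0.2, -1.0],
                 [1.5, 1.4, 0.0, -0.5],
                 [2.2, 0.3, 0.1, -1.2]]
soft = [soft_labels(z) for z in client_logits]

masks = zero_sum_masks(3, 4, rng)
uploads = [[s + m for s, m in zip(soft[i], masks[i])] for i in range(3)]

# The aggregator sees only masked vectors, yet their mean equals the
# true mean of the soft labels because the masks cancel.
agg = [sum(u[k] for u in uploads) / 3 for k in range(4)]
true_avg = [sum(s[k] for s in soft) / 3 for k in range(4)]
assert all(abs(a - t) < 1e-9 for a, t in zip(agg, true_avg))
print([round(x, 3) for x in agg])
```

Transmitting a 4-element probability vector per round, instead of a full model's weights, is what drives the communication savings the abstract reports; the masking step addresses the reverse-engineering concern by ensuring no party observes an individual client's soft labels in the clear.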
Source journal
CiteScore: 7.70
Self-citation rate: 9.30%
Articles per year: 59
Review time: 3.3 months
Journal scope: The main focus for the IEEE Transactions on Consumer Electronics is the engineering and research aspects of the theory, design, construction, manufacture or end use of mass market electronics, systems, software and services for consumers.