{"title":"当联邦学习与知识蒸馏相结合,保护消费者边缘网络","authors":"Zakaria Abou El Houda;Hajar Moudoud;Bouziane Brik","doi":"10.1109/TCE.2025.3559004","DOIUrl":null,"url":null,"abstract":"Consumer networks face several security challenges due to the distributed nature of edge devices and the sensitive data they handle. Federated Learning (FL) presents a promising paradigm for collaborative model training in distributed environments. However, its implementation in edge consumer networks raises concerns about model heterogeneity, communication efficiency, and reverse engineering attacks. To address these issues, in this paper, we introduce SKDFL, a novel framework that leverages Knowledge Distillation (KD) and Secure Multi-Party Computation (SMPC) techniques to enhance communication efficiency while preserving data privacy in edge consumer networks. Through the use of KD, the distilled knowledge is transmitted between devices, significantly reducing communication overhead. Additionally, we incorporate lightweight encryption mechanisms to protect soft-labels from reverse engineering attacks using SMPC. We evaluate our proposed framework using two public datasets and demonstrate its efficiency in reducing communication costs, achieving up to a 92.4% reduction compared to conventional FL methods. Moreover, SKDFL achieves high performances in terms of accuracy and F1-score in both binary and multi-class classification while preserving the privacy of clients. Our obtained results show the potential of SKDFL to address the challenges of communication efficiency and data privacy in FL for edge consumer networks, paving the way for secure and efficient collaborative learning in consumer networks.","PeriodicalId":13208,"journal":{"name":"IEEE Transactions on Consumer Electronics","volume":"71 2","pages":"7192-7200"},"PeriodicalIF":10.9000,"publicationDate":"2025-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"When Federated Learning Meets Knowledge Distillation to Secure Consumer Edge Network\",\"authors\":\"Zakaria Abou El Houda;Hajar Moudoud;Bouziane Brik\",\"doi\":\"10.1109/TCE.2025.3559004\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Consumer networks face several security challenges due to the distributed nature of edge devices and the sensitive data they handle. Federated Learning (FL) presents a promising paradigm for collaborative model training in distributed environments. However, its implementation in edge consumer networks raises concerns about model heterogeneity, communication efficiency, and reverse engineering attacks. To address these issues, in this paper, we introduce SKDFL, a novel framework that leverages Knowledge Distillation (KD) and Secure Multi-Party Computation (SMPC) techniques to enhance communication efficiency while preserving data privacy in edge consumer networks. Through the use of KD, the distilled knowledge is transmitted between devices, significantly reducing communication overhead. Additionally, we incorporate lightweight encryption mechanisms to protect soft-labels from reverse engineering attacks using SMPC. We evaluate our proposed framework using two public datasets and demonstrate its efficiency in reducing communication costs, achieving up to a 92.4% reduction compared to conventional FL methods. Moreover, SKDFL achieves high performances in terms of accuracy and F1-score in both binary and multi-class classification while preserving the privacy of clients. 
Our obtained results show the potential of SKDFL to address the challenges of communication efficiency and data privacy in FL for edge consumer networks, paving the way for secure and efficient collaborative learning in consumer networks.\",\"PeriodicalId\":13208,\"journal\":{\"name\":\"IEEE Transactions on Consumer Electronics\",\"volume\":\"71 2\",\"pages\":\"7192-7200\"},\"PeriodicalIF\":10.9000,\"publicationDate\":\"2025-04-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Consumer Electronics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10955720/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Consumer Electronics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10955720/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
When Federated Learning Meets Knowledge Distillation to Secure Consumer Edge Network
Consumer networks face several security challenges due to the distributed nature of edge devices and the sensitive data they handle. Federated Learning (FL) presents a promising paradigm for collaborative model training in distributed environments. However, its implementation in edge consumer networks raises concerns about model heterogeneity, communication efficiency, and reverse-engineering attacks. To address these issues, in this paper we introduce SKDFL, a novel framework that leverages Knowledge Distillation (KD) and Secure Multi-Party Computation (SMPC) to enhance communication efficiency while preserving data privacy in edge consumer networks. Through KD, only the distilled knowledge is transmitted between devices, significantly reducing communication overhead. Additionally, we incorporate lightweight SMPC-based encryption mechanisms to protect soft labels from reverse-engineering attacks. We evaluate the proposed framework on two public datasets and demonstrate its efficiency in reducing communication costs, achieving up to a 92.4% reduction compared to conventional FL methods. Moreover, SKDFL achieves high accuracy and F1-scores in both binary and multi-class classification while preserving client privacy. Our results show the potential of SKDFL to address the challenges of communication efficiency and data privacy in FL for edge consumer networks, paving the way for secure and efficient collaborative learning in consumer networks.
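To make the soft-label exchange idea concrete, the sketch below illustrates one common way federated distillation can be paired with a toy secure-aggregation step: each client quantizes the soft labels its local model produces on a shared public sample set, adds pairwise-cancelling random masks as a stand-in for SMPC, and the server recovers only the averaged soft labels. This is a minimal, assumption-laden illustration, not the paper's actual SKDFL implementation; all names (Client, aggregate_soft_labels), the quantization scale, and the mask modulus are hypothetical choices.

```python
# Minimal sketch of soft-label federated distillation with additive masking.
# NOT the SKDFL algorithm from the paper: the masking below is a toy stand-in
# for SMPC, and all names/constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 3
NUM_PUBLIC_SAMPLES = 5   # shared public samples used for distillation
NUM_CLASSES = 4
QUANT_SCALE = 1000       # soft labels are quantized before masking
MASK_MODULUS = 2**16     # modulus for the additive masks


def softmax(logits: np.ndarray) -> np.ndarray:
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


class Client:
    """A client that shares only quantized, masked soft labels (no raw data or weights)."""

    def __init__(self, client_id: int):
        self.client_id = client_id
        # Stand-in for a locally trained model: random logits on the public samples.
        self._local_logits = rng.normal(size=(NUM_PUBLIC_SAMPLES, NUM_CLASSES))

    def soft_labels(self) -> np.ndarray:
        return softmax(self._local_logits)

    def masked_update(self, masks: np.ndarray) -> np.ndarray:
        """Quantize soft labels and add masks that cancel when the server sums updates."""
        quantized = np.round(self.soft_labels() * QUANT_SCALE).astype(np.int64)
        return (quantized + masks) % MASK_MODULUS


def pairwise_cancelling_masks(num_clients: int, shape) -> list[np.ndarray]:
    """Toy secure aggregation: per-pair random masks whose sum is zero mod MASK_MODULUS."""
    masks = [np.zeros(shape, dtype=np.int64) for _ in range(num_clients)]
    for i in range(num_clients):
        for j in range(i + 1, num_clients):
            m = rng.integers(0, MASK_MODULUS, size=shape)
            masks[i] = (masks[i] + m) % MASK_MODULUS
            masks[j] = (masks[j] - m) % MASK_MODULUS
    return masks


def aggregate_soft_labels(masked_updates: list[np.ndarray]) -> np.ndarray:
    """Server side: sum masked updates (masks cancel), dequantize, and average."""
    total = np.zeros_like(masked_updates[0])
    for update in masked_updates:
        total = (total + update) % MASK_MODULUS
    return (total.astype(np.float64) / QUANT_SCALE) / len(masked_updates)


if __name__ == "__main__":
    clients = [Client(i) for i in range(NUM_CLIENTS)]
    shape = (NUM_PUBLIC_SAMPLES, NUM_CLASSES)
    masks = pairwise_cancelling_masks(NUM_CLIENTS, shape)
    updates = [c.masked_update(m) for c, m in zip(clients, masks)]
    consensus = aggregate_soft_labels(updates)
    print("Aggregated soft labels (one row per public sample):")
    print(np.round(consensus, 3))
```

The design intuition behind such schemes is that a per-sample probability vector over the classes is far smaller than a full set of model weights, and it does not fix the client model architecture, which is why soft-label exchange can cut communication costs and tolerate heterogeneous local models; the masking step ensures the server only learns the aggregate, not any individual client's soft labels.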
Journal overview:
The main focus for the IEEE Transactions on Consumer Electronics is the engineering and research aspects of the theory, design, construction, manufacture or end use of mass market electronics, systems, software and services for consumers.