{"title":"SoFL:基于双重聚类的异构数据聚类联合学习","authors":"Jianfei Zhang, Zhiming Qiao","doi":"10.3390/electronics13183682","DOIUrl":null,"url":null,"abstract":"Federated Learning (FL) is an emerging privacy-preserving technology that enables training a global model beneficial to all participants without sharing their data. However, differences in data distributions among participants may undermine the stability and accuracy of the global model. To address this challenge, recent research proposes client clustering based on data distribution similarity, generating independent models for each cluster in order to enhance FL performance. Nevertheless, due to the uncertainty of participant identities, FL struggles to rapidly and accurately determine the clusters. Most of the existing algorithms distinguish clients by iterative clustering, which not only increases the computing cost of the server but also affects the convergence speed of the federation model. To address these shortcomings, in this paper, we propose a novel clustering-based FL method, SoFL. SoFL introduces SOM networks, improves the quality of cluster data, and eliminates redundant categories through secondary clustering, encouraging more similar clients to train together. Through this mechanism, SoFL completes the clustering task in one round of communication and speeds up the convergence of federated model training. Simulation results demonstrate that SoFL accurately and swiftly adapts to determine the clusters. In different non-IID settings, SoFL’s model accuracy improvements ranged from 9 to 18% compared to FedAvg and FedProx.","PeriodicalId":11646,"journal":{"name":"Electronics","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SoFL: Clustered Federated Learning Based on Dual Clustering for Heterogeneous Data\",\"authors\":\"Jianfei Zhang, Zhiming Qiao\",\"doi\":\"10.3390/electronics13183682\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated Learning (FL) is an emerging privacy-preserving technology that enables training a global model beneficial to all participants without sharing their data. However, differences in data distributions among participants may undermine the stability and accuracy of the global model. To address this challenge, recent research proposes client clustering based on data distribution similarity, generating independent models for each cluster in order to enhance FL performance. Nevertheless, due to the uncertainty of participant identities, FL struggles to rapidly and accurately determine the clusters. Most of the existing algorithms distinguish clients by iterative clustering, which not only increases the computing cost of the server but also affects the convergence speed of the federation model. To address these shortcomings, in this paper, we propose a novel clustering-based FL method, SoFL. SoFL introduces SOM networks, improves the quality of cluster data, and eliminates redundant categories through secondary clustering, encouraging more similar clients to train together. Through this mechanism, SoFL completes the clustering task in one round of communication and speeds up the convergence of federated model training. Simulation results demonstrate that SoFL accurately and swiftly adapts to determine the clusters. 
In different non-IID settings, SoFL’s model accuracy improvements ranged from 9 to 18% compared to FedAvg and FedProx.\",\"PeriodicalId\":11646,\"journal\":{\"name\":\"Electronics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-09-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.3390/electronics13183682\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/electronics13183682","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
SoFL: Clustered Federated Learning Based on Dual Clustering for Heterogeneous Data
Federated Learning (FL) is an emerging privacy-preserving technology that enables training a global model beneficial to all participants without sharing their data. However, differences in data distributions among participants may undermine the stability and accuracy of the global model. To address this challenge, recent research proposes clustering clients by the similarity of their data distributions and generating an independent model for each cluster to enhance FL performance. Nevertheless, because participant identities are uncertain, FL struggles to determine the clusters rapidly and accurately. Most existing algorithms distinguish clients through iterative clustering, which not only increases the computing cost on the server but also slows the convergence of the federated model. To address these shortcomings, this paper proposes a novel clustering-based FL method, SoFL. SoFL introduces Self-Organizing Map (SOM) networks, improves the quality of the resulting clusters, and eliminates redundant categories through a secondary clustering step, encouraging more similar clients to train together. Through this mechanism, SoFL completes the clustering task in a single round of communication and speeds up the convergence of federated model training. Simulation results demonstrate that SoFL determines the clusters accurately and swiftly. Across different non-IID settings, SoFL improved model accuracy by 9% to 18% compared to FedAvg and FedProx.
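The abstract describes a dual-clustering pipeline: clients are first mapped onto a Self-Organizing Map according to the similarity of their updates, and a secondary clustering pass then merges redundant SOM nodes so that similar clients train together. The sketch below only illustrates that general idea; it is not the paper's algorithm. The client representation (a flattened model-update vector), the SOM grid size and training schedule, and the use of k-means for the secondary pass are all assumptions made for illustration.

```python
# Minimal sketch of dual clustering for federated clients, assuming each
# client is represented by a flattened model-update vector. The SOM
# implementation, grid size, and the k-means secondary pass are
# illustrative assumptions, not the authors' exact algorithm.
import numpy as np
from sklearn.cluster import KMeans


def train_som(data, grid=(4, 4), epochs=50, lr=0.5, sigma=1.0, seed=0):
    """Train a tiny Self-Organizing Map and return its prototype vectors."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.normal(size=(rows, cols, data.shape[1]))
    # Grid coordinates used by the neighborhood function.
    coords = np.stack(
        np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1
    )
    for epoch in range(epochs):
        decay = 1.0 - epoch / epochs
        for x in rng.permutation(data):
            # Best-matching unit (BMU): the prototype closest to the sample.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighborhood around the BMU on the grid.
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
            h = np.exp(-(grid_dist ** 2) / (2 * (sigma * decay + 1e-8) ** 2))
            weights += (lr * decay) * h[..., None] * (x - weights)
    return weights.reshape(-1, data.shape[1])


def dual_cluster(client_vectors, grid=(4, 4), n_final=3, seed=0):
    """First map clients onto SOM prototypes, then merge redundant
    prototypes with a secondary k-means pass (an assumed stand-in for
    the paper's secondary clustering)."""
    prototypes = train_som(client_vectors, grid=grid, seed=seed)
    km = KMeans(n_clusters=n_final, n_init=10, random_state=seed).fit(prototypes)
    # Assign each client the merged label of its best-matching prototype.
    bmu_idx = np.argmin(
        np.linalg.norm(client_vectors[:, None, :] - prototypes[None, :, :], axis=-1),
        axis=1,
    )
    return km.labels_[bmu_idx]


if __name__ == "__main__":
    # Toy example: 30 clients whose update vectors fall into 3 latent groups.
    rng = np.random.default_rng(1)
    clients = np.vstack(
        [rng.normal(loc=c, scale=0.1, size=(10, 8)) for c in (-2, 0, 2)]
    )
    print(dual_cluster(clients, n_final=3))
```

In the setting described by the abstract, labels like these would determine which cluster model each client subsequently trains and aggregates with, so the grouping can be fixed after a single round of communication.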
Electronics (Computer Science - Computer Networks and Communications)
CiteScore: 1.10
Self-citation rate: 10.30%
Articles published: 3515
Average review time: 16.71 days
Journal introduction:
Electronics (ISSN 2079-9292; CODEN: ELECGJ) is an international, open access journal on the science of electronics and its applications, published semimonthly online by MDPI.