Title: Heterogeneity-Aware Clustering and Intra-Cluster Uniform Data Sampling for Federated Learning
Authors: Jian Chen; Peifeng Zhang; Jiahui Chen; Terry Shue Chien Lau
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 9, no. 3, pp. 2545-2556
DOI: 10.1109/TETCI.2024.3515007
Publication date: 2024-12-25
Citations: 0
Abstract
Federated learning (FL) is an innovative privacy-preserving machine learning paradigm that enables clients to train a global model without sharing their local data. However, category distribution heterogeneity and quantity imbalance frequently coexist in real-world FL scenarios. On the one hand, due to category distribution heterogeneity, local models are optimized toward distinct local objectives, resulting in divergent optimization directions. On the other hand, under the widely used uniform client sampling in FL, quantity imbalance may hinder the active participation of clients with larger datasets in model training and can lead to suboptimal model performance. To tackle these issues, we propose a framework that combines heterogeneity-aware clustering with intra-cluster uniform data sampling. More precisely, we first perform heterogeneity-aware clustering, which groups clients according to their category distribution vectors. We then apply intra-cluster uniform data sampling, in which local data from each client within a cluster are randomly selected with a predetermined probability. Furthermore, to address privacy concerns, we incorporate homomorphic encryption to protect clients' category distribution vectors and sample sizes. Finally, experimental results on multiple benchmark datasets validate the superiority of the proposed framework.
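The abstract does not include code; the following minimal Python sketch only illustrates the two sampling-side ideas under stated assumptions: synthetic label counts stand in for real client data, KMeans is used as an example clustering routine for the category distribution vectors, and `sample_prob` is an illustrative predetermined sampling probability. The homomorphic-encryption step and the actual FL training loop are omitted, and this is not the authors' implementation.

```python
# Sketch of heterogeneity-aware clustering + intra-cluster uniform data sampling.
# Assumptions: synthetic data, KMeans as the clustering routine, a single
# predetermined sampling probability `sample_prob`.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy setup: 20 clients, 10 classes; each row is a client's normalized
# category distribution vector (per-class sample counts / total samples).
label_counts = rng.integers(1, 100, size=(20, 10)).astype(float)
category_dists = label_counts / label_counts.sum(axis=1, keepdims=True)

# Heterogeneity-aware clustering: group clients whose label distributions are similar.
num_clusters = 4
cluster_ids = KMeans(n_clusters=num_clusters, n_init=10, random_state=0).fit_predict(category_dists)

# Intra-cluster uniform data sampling: within each cluster, every local sample is
# selected independently with the same predetermined probability, so clients with
# larger datasets contribute proportionally more selected data.
sample_prob = 0.2
client_datasets = [rng.normal(size=(int(label_counts[c].sum()), 32)) for c in range(20)]

selected = {}
for cid in range(num_clusters):
    for c in np.where(cluster_ids == cid)[0]:
        mask = rng.random(len(client_datasets[c])) < sample_prob
        selected[c] = client_datasets[c][mask]

for c in sorted(selected):
    print(f"client {c}: cluster {cluster_ids[c]}, "
          f"{len(selected[c])}/{len(client_datasets[c])} samples selected")
```

In this sketch the selected subsets would then feed local training within each cluster; in the paper's setting the category distribution vectors and sample sizes exchanged for clustering would additionally be protected with homomorphic encryption.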
Journal description:
The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys.
TETCI is an electronic-only publication. TETCI publishes six issues per year.
Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few such illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for the IoT and Smart-X technologies.