Federated Fuzzy Learning with Imbalanced Data
Lukas Johannes Dust, Marina López Murcia, Andreas Mäkilä, Petter Nordin, N. Xiong, Francisco Herrera
2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 1130-1137, December 2021
DOI: 10.1109/ICMLA52953.2021.00185
Citations: 3
Abstract
Federated learning (FL) is an emerging privacy-preserving machine learning technique of growing importance in the digital age. Two challenging issues for FL are: (1) communication overhead between clients and the server, and (2) heterogeneous distribution of training data, such as class imbalance. This paper tackles both challenges by proposing a federated fuzzy learning algorithm (FFLA) for data-based construction of fuzzy classification models in a distributed setting. The proposed algorithm is fast and communication-efficient, requiring only two rounds of interaction between the server and clients. Moreover, FFLA is equipped with an imbalance adaptation mechanism that keeps it robust against heterogeneous data distributions and class imbalance. The efficacy of the proposed method has been verified by simulation tests on a set of balanced and imbalanced benchmark data sets.
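The abstract only outlines the two-round exchange between server and clients. As a rough illustration of how such a protocol could look, the following Python sketch has clients report class counts in a first round (from which the server derives imbalance-compensating weights) and weighted fuzzy-rule supports in a second round, which the server aggregates into rule consequents. This is not the paper's FFLA: the grid-partition rule base, triangular membership functions, and inverse-frequency class weights are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: the grid-partition rule base, triangular membership
# functions, and inverse-frequency class weights are assumptions, not the
# FFLA procedure described in the paper.

def triangular_memberships(x, n_parts=3):
    """Membership of values in [0, 1] to n_parts evenly spaced triangular fuzzy sets."""
    centers = np.linspace(0.0, 1.0, n_parts)
    width = 1.0 / (n_parts - 1)
    return np.maximum(0.0, 1.0 - np.abs(x[..., None] - centers) / width)

class Client:
    def __init__(self, X, y, n_parts=3, n_classes=2):
        self.X, self.y = X, y
        self.n_parts, self.n_classes = n_parts, n_classes

    def class_counts(self):
        # Round 1: report local class counts so the server can gauge imbalance.
        return np.bincount(self.y, minlength=self.n_classes)

    def rule_supports(self, class_weights):
        # Round 2: accumulate class support for every grid-cell rule,
        # weighted by the server-supplied class weights.
        n_features = self.X.shape[1]
        support = np.zeros((self.n_parts ** n_features, self.n_classes))
        mems = triangular_memberships(self.X, self.n_parts)  # (samples, features, parts)
        for xi_mem, yi in zip(mems, self.y):
            strengths = xi_mem[0]
            for f in range(1, n_features):
                strengths = np.outer(strengths, xi_mem[f]).ravel()
            support[:, yi] += class_weights[yi] * strengths
        return support

def federated_fuzzy_fit(clients, n_classes=2):
    # Round 1: aggregate class counts and derive imbalance-compensating weights.
    counts = sum(c.class_counts() for c in clients)
    class_weights = counts.sum() / (n_classes * np.maximum(counts, 1))
    # Round 2: aggregate weighted rule supports and pick each rule's consequent class.
    total_support = sum(c.rule_supports(class_weights) for c in clients)
    rule_consequents = total_support.argmax(axis=1)
    return rule_consequents, class_weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three clients with features scaled to [0, 1] and imbalanced binary labels.
    clients = [Client(rng.random((50, 2)), (rng.random(50) < 0.2).astype(int))
               for _ in range(3)]
    consequents, weights = federated_fuzzy_fit(clients)
    print(consequents, weights)
```

Only summary statistics (class counts and rule supports) cross the network in this sketch, which is consistent with the communication-light, privacy-preserving setting the abstract describes; the actual aggregation and imbalance-adaptation mechanism of FFLA is given in the paper itself.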