Juncheng Pu;Xiaodong Fu;Hai Dong;Pengcheng Zhang;Li Liu
{"title":"本地长尾数据的动态自适应联合学习","authors":"Juncheng Pu;Xiaodong Fu;Hai Dong;Pengcheng Zhang;Li Liu","doi":"10.1109/TSC.2024.3478796","DOIUrl":null,"url":null,"abstract":"Federated learning provides privacy protection to the collaborative training of global model based on distributed private data. The local private data is often in the presence of long-tailed distribution in reality, which downgrades the performance and causes biased results. In this paper, we propose a dynamic adaptive federated learning optimization algorithm with the Grey Wolf Optimizer and Markov Chain, named FedWolf, to solve the problems of performance degradation and result bias caused by the local long-tailed data. FedWolf is launched with a set of randomly initialized parameters instead of a shared parameter employed by existing methods. Then multi-level participants are elected based on the F1 scores calculated from the uploaded parameters. A dynamic weighting strategy based on the participant level is used to adaptively update parameters without artificial control. The above parameter updating is modelled as a Markov Process. After all communication rounds are completed, the future performance (including the probability of each participant is elected as different participant level) of participants is predicted through the historical Markov states. Finally, the probability of each participant is elected as the level 1 is used as the contribution weight and the global model is obtained through dynamic contribution weight aggregating. We introduce the Gini index to evaluate the bias of classification results. 
Extensive experiments are conducted to validate the effectiveness of FedWolf in solving the problems of performance cracks and categorization result bias as well as the robustness of adaptive parameter updating in resisting outliers and malicious users.","PeriodicalId":13255,"journal":{"name":"IEEE Transactions on Services Computing","volume":"17 6","pages":"3485-3498"},"PeriodicalIF":5.5000,"publicationDate":"2024-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dynamic Adaptive Federated Learning on Local Long-Tailed Data\",\"authors\":\"Juncheng Pu;Xiaodong Fu;Hai Dong;Pengcheng Zhang;Li Liu\",\"doi\":\"10.1109/TSC.2024.3478796\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning provides privacy protection to the collaborative training of global model based on distributed private data. The local private data is often in the presence of long-tailed distribution in reality, which downgrades the performance and causes biased results. In this paper, we propose a dynamic adaptive federated learning optimization algorithm with the Grey Wolf Optimizer and Markov Chain, named FedWolf, to solve the problems of performance degradation and result bias caused by the local long-tailed data. FedWolf is launched with a set of randomly initialized parameters instead of a shared parameter employed by existing methods. Then multi-level participants are elected based on the F1 scores calculated from the uploaded parameters. A dynamic weighting strategy based on the participant level is used to adaptively update parameters without artificial control. The above parameter updating is modelled as a Markov Process. After all communication rounds are completed, the future performance (including the probability of each participant is elected as different participant level) of participants is predicted through the historical Markov states. 
Finally, the probability of each participant is elected as the level 1 is used as the contribution weight and the global model is obtained through dynamic contribution weight aggregating. We introduce the Gini index to evaluate the bias of classification results. Extensive experiments are conducted to validate the effectiveness of FedWolf in solving the problems of performance cracks and categorization result bias as well as the robustness of adaptive parameter updating in resisting outliers and malicious users.\",\"PeriodicalId\":13255,\"journal\":{\"name\":\"IEEE Transactions on Services Computing\",\"volume\":\"17 6\",\"pages\":\"3485-3498\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-10-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Services Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10713999/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Services Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10713999/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Dynamic Adaptive Federated Learning on Local Long-Tailed Data
Federated learning provides privacy protection for the collaborative training of a global model on distributed private data. In practice, local private data often follows a long-tailed distribution, which degrades model performance and biases the results. In this paper, we propose FedWolf, a dynamic adaptive federated learning optimization algorithm built on the Grey Wolf Optimizer and Markov chains, to address the performance degradation and result bias caused by local long-tailed data. FedWolf starts from a set of randomly initialized parameters rather than the single shared parameter set employed by existing methods. Multi-level participants are then elected according to F1 scores computed from the uploaded parameters, and a dynamic weighting strategy based on participant level updates the parameters adaptively, without manual control. This parameter-updating process is modeled as a Markov process. After all communication rounds are completed, the future performance of each participant (including the probability of being elected to each participant level) is predicted from the historical Markov states. Finally, the probability of each participant being elected to level 1 serves as its contribution weight, and the global model is obtained by aggregating with these dynamic contribution weights. We introduce the Gini index to evaluate the bias of classification results. Extensive experiments validate the effectiveness of FedWolf in mitigating performance degradation and classification result bias, as well as the robustness of its adaptive parameter updating against outliers and malicious users.
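The Markov-chain step described above — predicting each participant's future level probabilities from its history of elected levels — can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation: it assumes levels are observed per round, estimates a transition matrix by counting level-to-level transitions, and uses a matrix power to project forward; the function names and the 3-level setup are hypothetical.

```python
import numpy as np

def estimate_transition_matrix(level_history, num_levels):
    """Estimate P[i, j] = Pr(level j at round t+1 | level i at round t)
    by counting observed transitions in the participant's level history."""
    counts = np.zeros((num_levels, num_levels))
    for prev, nxt in zip(level_history[:-1], level_history[1:]):
        counts[prev, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # avoid division by zero for unvisited levels
    return counts / row_sums

def predict_level_probs(level_history, num_levels, steps=10):
    """Probability of the participant occupying each level after `steps`
    future rounds, starting from its last observed level."""
    P = estimate_transition_matrix(level_history, num_levels)
    state = np.zeros(num_levels)
    state[level_history[-1]] = 1.0
    return state @ np.linalg.matrix_power(P, steps)

# Example: a participant that is mostly elected to the top level
# (index 0 here, corresponding to "level 1" in the paper's terminology).
history = [0, 0, 1, 0, 0, 2, 0, 0, 0, 1, 0]
probs = predict_level_probs(history, num_levels=3)
# probs[0] would then serve as this participant's contribution weight
# when aggregating the global model.
```

In this sketch the contribution weights of all participants would still need to be normalized before aggregation; the original text leaves that detail to the full paper.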
Journal Introduction:
IEEE Transactions on Services Computing encompasses the computing and software aspects of the science and technology of services innovation research and development. It places emphasis on algorithmic, mathematical, statistical, and computational methods central to services computing. Topics covered include Service Oriented Architecture, Web Services, Business Process Integration, Solution Performance Management, and Services Operations and Management. The transactions address mathematical foundations, security, privacy, agreement, contract, discovery, negotiation, collaboration, and quality of service for web services. It also covers areas like composite web service creation, business and scientific applications, standards, utility models, business process modeling, integration, collaboration, and more in the realm of Services Computing.