FedGPA: Federated Learning with Global Personalized Aggregation

Zongfu Han, Yu Feng, Yifan Zhu, Zhen Tian, Fangyu Hao, Meina Song

AI Open, Volume 6 (2025), Pages 82-92. DOI: 10.1016/j.aiopen.2025.03.001
Citations: 0
Abstract
A significant challenge in Federated Learning (FL) is the heterogeneity of local data distributions across clients. Personalized Federated Learning (PFL), an emerging approach to overcoming data heterogeneity, can either integrate personalized components into the global model or train multiple models to achieve personalization, but little research has considered both directions simultaneously. Within the latter direction, one approach adopts weighted aggregation to generate personalized models, where the weights are determined by solving an optimization problem across clients. In brief, previous works either neglect global information during local representation learning or simply treat the personalized model as a set of individually learned weights. In this work, we first decouple the model into a feature extractor, associated with generalization, and a classifier, linked to personalization. We then perform prototype-based local–global alignment to leverage global information for learning better representations. Moreover, we use these representations to compute distances between clients and develop individual aggregation strategies for feature extractors and classifiers, respectively. Finally, extensive experiments on five benchmark datasets under three different heterogeneous data scenarios demonstrate the effectiveness of the proposed FedGPA.
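The aggregation idea described above — summarizing each client by class prototypes, measuring inter-client distance from those prototypes, and building per-client aggregation weights — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the function names, the cosine-similarity distance, and the softmax weighting are all assumptions made for the example.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """Local prototype per class: the mean feature vector of that class's samples."""
    return np.stack([features[labels == c].mean(axis=0) for c in range(num_classes)])

def client_similarity(protos_i, protos_j):
    """Average per-class cosine similarity between two clients' prototype sets
    (an assumed stand-in for the paper's client-distance computation)."""
    num = (protos_i * protos_j).sum(axis=1)
    den = np.linalg.norm(protos_i, axis=1) * np.linalg.norm(protos_j, axis=1)
    return float((num / den).mean())

def personalized_aggregate(client_params, sims_row):
    """Build one client's personalized parameters as a similarity-weighted
    combination of all clients' parameters (softmax over similarities)."""
    w = np.exp(sims_row) / np.exp(sims_row).sum()
    return sum(wi * p for wi, p in zip(w, client_params))
```

In a full system, a server would run `client_similarity` on uploaded prototypes to fill a similarity matrix, then call `personalized_aggregate` once per client — separately for feature-extractor and classifier parameters, since the abstract states these use individual aggregation strategies.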