Authors: Chenrui Wu; Zexi Li; Fangxin Wang; Hongyang Chen; Jiajun Bu; Haishuai Wang
DOI: 10.1109/TMC.2025.3564880
Journal: IEEE Transactions on Mobile Computing, vol. 24, no. 10, pp. 9695-9708 (published 2025-04-28)
URL: https://ieeexplore.ieee.org/document/10978084/
Toward Universal Personalization in Federated Learning via Collaborative Foundation Generative Models
Personalized federated learning (PFL) enhances the performance of customized client models through collaborative training without compromising data privacy and ownership. Some previous PFL methods rely on rich prior knowledge about the type of data heterogeneity (such as class imbalance or feature skew), which greatly limits their range of application. In this paper, we study Universal Personalization in Federated Learning (UniPFL), a setting in which no prior knowledge about the type of data heterogeneity is available. UniPFL matters in real-world PFL scenarios because client data distributions are usually heterogeneous and unknown to the server, where quantity imbalance, class imbalance, feature skew, or hybrid heterogeneity may all arise. To address UniPFL, we propose FedFD, a novel framework with local data augmentation and global concept fusion, built on recent advances in foundation generative models (e.g., diffusion models, BLIP-2). On the client side, FedFD utilizes a diffusion model to assist local training by generating augmented data samples; the client model is then efficiently fine-tuned for personalization. On the server side, we customize the aggregation strategies based on model similarities to learn both personalized models and diverse feature concepts. Extensive experiments show that FedFD achieves state-of-the-art results on (1) CIFAR-10 and CIFAR-100 for class imbalance; (2) DomainNet and Office-10 for feature skew; and (3) hybrid heterogeneity with both class and feature shifts.
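The server-side "aggregation strategies based on model similarities" described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: each client receives a personalized aggregate in which every other client's parameters are weighted by the cosine similarity of the flattened parameter vectors (softmax-normalized). All function and variable names here are hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened parameter vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def personalized_aggregate(client_params: list) -> list:
    """For each client, produce a personalized model by averaging all
    clients' parameters, weighted by softmax over pairwise cosine
    similarities, so that similar clients contribute more."""
    n = len(client_params)
    personalized = []
    for i in range(n):
        sims = np.array([cosine_similarity(client_params[i], client_params[j])
                         for j in range(n)])
        weights = np.exp(sims) / np.exp(sims).sum()  # softmax normalization
        personalized.append(sum(w * p for w, p in zip(weights, client_params)))
    return personalized
```

With identical client parameters the weights are uniform, so each personalized aggregate reduces to the common parameters; as clients diverge, each aggregate drifts toward its nearest neighbors, which is the intuition behind similarity-based personalization.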
Journal Introduction:
IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.