Zediao Liu, Yayu Luo, Tongzhijun Zhu, Ziyi Chen, Tenglong Mao, Huan Pi, Ying Lin
Computer Communications, vol. 239, Article 108190. DOI: 10.1016/j.comcom.2025.108190
Published 2025-05-03. https://www.sciencedirect.com/science/article/pii/S0140366425001471
FedBH: Efficient federated learning through shared base and adaptive hyperparameters in heterogeneous systems
Federated learning is a distributed learning framework that protects client data privacy. However, in real-world applications, due to the significant differences in data distribution among different clients, a single global model often fails to meet personalized needs. Moreover, existing personalized federated learning methods often struggle to balance computational efficiency and model personalization while inadequately addressing the heterogeneous computational capabilities of clients. To address these challenges, we propose FedBH—an innovative personalized federated learning method that combines the feature extraction capabilities of deep learning with an adaptive dynamic training mechanism. FedBH decomposes each client’s model into a global base layer and client-specific local heads, enhancing both computational efficiency and personalized training. To further optimize learning, FedBH incorporates a dynamic adjustment mechanism that adapts training parameters based on each client’s specific conditions. In each local training round, the algorithm samples a subset of the dataset and explores different configurations of local head and global base training rounds. The optimal configuration is determined based on validation loss and then applied for full training. This adaptive mechanism enables FedBH to dynamically adjust to varying client demands, ensuring efficient utilization of computational resources while maintaining robust personalization. Experimental results across multiple benchmark datasets and diverse data distribution scenarios show that FedBH achieves faster convergence and higher accuracy in most cases. These findings demonstrate that FedBH not only effectively adapts to heterogeneous data environments but also significantly improves training efficiency, showing great potential in addressing issues related to diverse and non-IID data distributions.
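The adaptive mechanism described above can be illustrated with a toy sketch. The paper does not publish reference code, so everything below is a hypothetical illustration: a one-dimensional "model" stands in for the global base layer and client-specific head, and `adaptive_local_round` probes a few (head-epochs, base-epochs) configurations on a data sample, selects the one with the lowest validation loss, and only then runs the full local update.

```python
import random

# Toy 1-D model: prediction = head_w * (base_w * x). This is a stand-in
# for "global base layer + client-specific local head"; FedBH itself
# uses deep networks -- this sketch only mirrors the control flow.

def predict(base_w, head_w, x):
    return head_w * (base_w * x)

def mse(base_w, head_w, data):
    return sum((predict(base_w, head_w, x) - y) ** 2 for x, y in data) / len(data)

def train(base_w, head_w, data, head_epochs, base_epochs, lr=0.01):
    # Gradient descent: train the local head first, then the shared base,
    # each for its own number of epochs.
    for _ in range(head_epochs):
        grad = sum(2 * (predict(base_w, head_w, x) - y) * base_w * x
                   for x, y in data) / len(data)
        head_w -= lr * grad
    for _ in range(base_epochs):
        grad = sum(2 * (predict(base_w, head_w, x) - y) * head_w * x
                   for x, y in data) / len(data)
        base_w -= lr * grad
    return base_w, head_w

def adaptive_local_round(base_w, head_w, train_data, val_data,
                         configs=((1, 3), (2, 2), (3, 1))):
    # Probe each (head_epochs, base_epochs) configuration on a small
    # sample of the local dataset, keep the one with the lowest
    # validation loss, then apply it for the full local update.
    sample = random.sample(train_data, max(1, len(train_data) // 4))
    best = min(configs,
               key=lambda c: mse(*train(base_w, head_w, sample, *c), val_data))
    return train(base_w, head_w, train_data, *best), best
```

Note that the probe runs start from the client's current weights and are then discarded; only the winning epoch configuration is reused for full training, which matches the selection-by-validation-loss step in the abstract. The specific configurations and learning rate here are invented for illustration.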
Journal description:
Computer and communications networks are key infrastructures of the information society, with high socio-economic value, as they support the correct operation of many critical services (from healthcare to finance and transportation). The Internet is the core of today's computer-communication infrastructure. It has been transformed from a robust network for data transfer between computers into a global, content-rich communication and information system in which content is increasingly generated by users and distributed according to human social relations. Next-generation network technologies, architectures, and protocols are therefore required to overcome the limitations of the legacy Internet and add new capabilities and services. The future Internet should be ubiquitous, secure, resilient, and closer to human communication paradigms.
Computer Communications is a peer-reviewed international journal that publishes high-quality scientific articles (both theory and practice) and survey papers covering all aspects of future computer communication networks (on all layers except the physical layer), with special attention to the evolution of the Internet architecture, protocols, services, and applications.