{"title":"Def-Ag: An energy-efficient decentralized federated learning framework via aggregator clients","authors":"Junyoung Park , Sungpil Woo , Joohyung Lee","doi":"10.1016/j.future.2025.108114","DOIUrl":null,"url":null,"abstract":"<div><div>Federated Learning (FL) has revolutionized Artificial Intelligence (AI) by enabling decentralized model training across diverse datasets, thereby addressing privacy concerns. However, traditional FL relies on a centralized server, leading to latency, single-point failures, and trust issues. Decentralized Federated Learning (DFL) emerges as a promising solution, but it faces challenges in achieving optimal accuracy and convergence due to limited client interactions, requiring energy inefficiency. Moreover, balancing the personalization and generalization of the AI model in DFL remains a complex issue. To address those challenging problems, this paper presents Def-Ag, an innovative energy-efficient DFL framework utilizing aggregator clients within similarity-based clusters. To reduce this signaling overhead, a partial model information exchange is proposed in intra-cluster training. In addition, the knowledge distillation method is applied for inter-cluster training to carefully incorporate the knowledge between clusters. Finally, by integrating clustering-based hierarchical DFL and optimizing client selection, Def-Ag reduces energy consumption and communication overhead while balancing personalization and generalization. Extensive experiments on CIFAR-10 and FMNIST datasets confirm Def-Ag’s superior performance in reducing energy usage and maintaining learning accuracy compared to baseline methods. The results demonstrate that Def-Ag effectively balances personalization and generalization, providing a robust solution for energy-efficient decentralized federated learning systems.</div></div>","PeriodicalId":55132,"journal":{"name":"Future Generation Computer Systems-The International Journal of Escience","volume":"175 ","pages":"Article 108114"},"PeriodicalIF":6.2000,"publicationDate":"2025-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future Generation Computer Systems-The International Journal of Escience","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167739X2500408X","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Abstract
Federated Learning (FL) has revolutionized Artificial Intelligence (AI) by enabling decentralized model training across diverse datasets, thereby addressing privacy concerns. However, traditional FL relies on a centralized server, leading to latency, single points of failure, and trust issues. Decentralized Federated Learning (DFL) has emerged as a promising alternative, but limited client interactions make it difficult to achieve optimal accuracy and convergence, resulting in energy inefficiency. Moreover, balancing the personalization and generalization of the AI model in DFL remains a complex issue. To address these challenges, this paper presents Def-Ag, an innovative energy-efficient DFL framework utilizing aggregator clients within similarity-based clusters. To reduce signaling overhead, partial model information exchange is proposed for intra-cluster training. In addition, knowledge distillation is applied during inter-cluster training to carefully incorporate knowledge across clusters. Finally, by integrating clustering-based hierarchical DFL with optimized client selection, Def-Ag reduces energy consumption and communication overhead while balancing personalization and generalization. Extensive experiments on the CIFAR-10 and FMNIST datasets confirm Def-Ag’s superior performance in reducing energy usage while maintaining learning accuracy compared to baseline methods. The results demonstrate that Def-Ag effectively balances personalization and generalization, providing a robust solution for energy-efficient decentralized federated learning systems.
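Def-Ag groups clients into similarity-based clusters, each served by an aggregator client, but the abstract gives no implementation details. The following is a minimal NumPy sketch under two assumptions: clients are grouped by cosine similarity of their flattened model updates, and the client whose update is closest to the cluster mean serves as aggregator. The function names (cluster_clients, pick_aggregators), the greedy thresholding, and the centrality rule are all illustrative, not the paper's algorithm.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def cluster_clients(updates, threshold=0.5):
    """Greedy similarity-based clustering: each client joins the first
    cluster whose representative update is sufficiently similar,
    otherwise it starts a new cluster."""
    clusters = []  # list of lists of client indices
    for i, u in enumerate(updates):
        for members in clusters:
            if cosine_similarity(u, updates[members[0]]) >= threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def pick_aggregators(updates, clusters):
    """Choose as aggregator the client whose update is most similar
    to its cluster's mean update."""
    aggregators = []
    for members in clusters:
        mean = np.mean([updates[i] for i in members], axis=0)
        best = max(members, key=lambda i: cosine_similarity(updates[i], mean))
        aggregators.append(best)
    return aggregators

# Toy example: 6 clients drawn from 2 latent data distributions
rng = np.random.default_rng(0)
base_a, base_b = rng.normal(size=8), rng.normal(size=8)
updates = [base_a + 0.1 * rng.normal(size=8) for _ in range(3)] + \
          [base_b + 0.1 * rng.normal(size=8) for _ in range(3)]
clusters = cluster_clients(updates)
print(clusters, pick_aggregators(updates, clusters))
```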
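The partial model information exchange used for intra-cluster training is described only at a high level; the abstract does not say which portion of the model is shared. One common realization is top-k magnitude sparsification of the update vector, sketched below purely as an illustrative stand-in for the paper's mechanism.

```python
import numpy as np

def partial_update(update, frac=0.1):
    """Select the largest-magnitude entries of a flattened model update.

    Transmitting only these (index, value) pairs instead of the full
    vector is one way to realize a partial model information exchange."""
    k = max(1, int(frac * update.size))
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def apply_partial(params, idx, values):
    """Apply a received partial update to a local parameter vector."""
    params = params.copy()
    params[idx] += values
    return params

rng = np.random.default_rng(1)
params = np.zeros(20)
update = rng.normal(size=20)
idx, vals = partial_update(update, frac=0.2)  # send only ~20% of entries
params = apply_partial(params, idx, vals)
print(idx, np.round(vals, 3))
```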
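For inter-cluster training, the abstract applies knowledge distillation to incorporate knowledge across clusters, but the exact loss is not given. Below is a minimal sketch of the standard temperature-scaled distillation loss, treating one cluster's model as the student and a neighboring cluster's aggregator model as the teacher; that pairing is an assumption made here for illustration.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=3.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 as in standard knowledge distillation."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# Example: a cluster model (student) distilling from a neighboring
# cluster's aggregator model (teacher) on a shared batch of inputs
teacher_logits = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]])
student_logits = np.array([[1.5, 0.7, -0.5], [0.0, 1.0, 0.5]])
print(distillation_loss(student_logits, teacher_logits))
```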
Journal Introduction
Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To keep pace with these advances, there is a growing need for collaborative tools that can effectively map, control, and execute these applications.
Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration.
Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.