Yifan Liu , Hongmei Ma , Donglin Pan , Yi Liu , Wenlei Chai , Zhenpeng Liu
DOI: 10.1016/j.eswa.2025.130022
Journal: Expert Systems with Applications, Volume 299, Article 130022 (Q1, Computer Science, Artificial Intelligence; Impact Factor 7.5)
Publication date: 2025-10-16
Article URL: https://www.sciencedirect.com/science/article/pii/S0957417425036383
HierFLMC: Efficient hierarchical federated learning based on soft clustering model compression
Federated Learning (FL) is an advanced distributed machine learning paradigm that enables clients to collaboratively train a shared neural network model using local datasets, transmitting model parameters instead of raw data. However, in many FL systems, the frequent exchange of model parameters between clients and remote cloud servers leads to significant communication overhead. As the model size increases, existing FL methods incur substantial communication costs. To address this bottleneck, this paper proposes a novel hierarchical federated learning model compression scheme (HierFLMC). This scheme integrates model compression techniques within a hierarchical framework, significantly enhancing communication efficiency between edge devices and cloud servers. In addition, an innovative preliminary soft clustering model update compression algorithm (SCMUC) is proposed. The SCMUC algorithm utilizes the K-means method for initial clustering, effectively reducing the computational complexity associated with traditional soft clustering methods. We validate the proposed scheme using the CIFAR-10 and FEMNIST datasets, demonstrating a 2% improvement in model accuracy and an 11% reduction in communication time compared to MUCSC. Experimental results indicate that this approach not only achieves a favorable compression rate but also substantially improves communication efficiency.
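The core idea behind clustering-based model update compression can be illustrated with a minimal sketch. The abstract does not give SCMUC's exact formulation, so the following is an assumption-laden illustration, not the paper's algorithm: it quantizes a flattened model update with plain 1-D K-means (the initialization step SCMUC is said to use), so that only the k float centroids plus one small integer index per weight need to be transmitted instead of full-precision parameters. All function names (`kmeans_1d`, `compress_update`, `decompress_update`) are hypothetical.

```python
import numpy as np

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain 1-D K-means over flattened model-update weights.

    SCMUC reportedly uses a K-means pass like this as the cheap
    initial clustering before its soft-clustering refinement
    (the refinement itself is omitted here)."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # Assign each weight to its nearest centroid.
        assign = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        # Move each centroid to the mean of its assigned weights.
        for j in range(k):
            members = values[assign == j]
            if members.size:
                centroids[j] = members.mean()
    return centroids, assign

def compress_update(update, k=16):
    """Compress a model update: transmit k centroids (floats)
    plus a uint8 cluster index per weight (requires k <= 256)."""
    flat = update.ravel()
    centroids, assign = kmeans_1d(flat, k)
    return centroids, assign.astype(np.uint8), update.shape

def decompress_update(centroids, assign, shape):
    """Server side: reconstruct the update from centroids + indices."""
    return centroids[assign].reshape(shape)

# Toy model update standing in for one client's parameter delta.
rng = np.random.default_rng(1)
update = rng.normal(size=(64, 32))
cents, idx, shape = compress_update(update, k=16)
recon = decompress_update(cents, idx, shape)
err = np.abs(update - recon).mean()
```

With 16 clusters, each 64-bit weight is replaced by an 8-bit index, roughly an 8x reduction in payload (plus a negligible 16-float codebook), at the cost of a quantization error `err` that the soft-clustering refinement in the paper is designed to reduce further.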
Journal introduction:
Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.