HierFLMC: Efficient hierarchical federated learning based on soft clustering model compression

IF 7.5 · CAS Tier 1 (Computer Science) · JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yifan Liu, Hongmei Ma, Donglin Pan, Yi Liu, Wenlei Chai, Zhenpeng Liu
{"title":"基于软聚类模型压缩的高效分层联邦学习","authors":"Yifan Liu ,&nbsp;Hongmei Ma ,&nbsp;Donglin Pan ,&nbsp;Yi Liu ,&nbsp;Wenlei Chai ,&nbsp;Zhenpeng Liu","doi":"10.1016/j.eswa.2025.130022","DOIUrl":null,"url":null,"abstract":"<div><div>Federated Learning (FL) is an advanced distributed machine learning paradigm that enables clients to collaboratively train a shared neural network model using local datasets, transmitting model parameters instead of raw data. However, in many FL systems, the frequent exchange of model parameters between clients and remote cloud servers leads to significant communication overhead. As the model size increases, existing FL methods incur substantial communication costs. To address this bottleneck, this paper proposes a novel hierarchical federated learning model compression scheme (HierFLMC). This scheme integrates model compression techniques within a hierarchical framework, significantly enhancing communication efficiency between edge devices and cloud servers. In addition, an innovative preliminary soft clustering model update compression algorithm (SCMUC) is proposed. The SCMUC algorithm utilizes the K-means method for initial clustering, effectively reducing the computational complexity associated with traditional soft clustering methods. We validate the proposed scheme using the CIFAR-10 and FEMNIST datasets, demonstrating a 2 % improvement in model accuracy and an 11 % reduction in communication time compared to MUCSC. Experimental results indicate that this approach not only achieves a favorable compression rate but also substantially improves communication efficiency.</div></div>","PeriodicalId":50461,"journal":{"name":"Expert Systems with Applications","volume":"299 ","pages":"Article 130022"},"PeriodicalIF":7.5000,"publicationDate":"2025-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"HierFLMC: Efficient hierarchical federated learning based on soft clustering model compression\",\"authors\":\"Yifan Liu ,&nbsp;Hongmei Ma ,&nbsp;Donglin Pan ,&nbsp;Yi Liu ,&nbsp;Wenlei Chai ,&nbsp;Zhenpeng Liu\",\"doi\":\"10.1016/j.eswa.2025.130022\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Federated Learning (FL) is an advanced distributed machine learning paradigm that enables clients to collaboratively train a shared neural network model using local datasets, transmitting model parameters instead of raw data. However, in many FL systems, the frequent exchange of model parameters between clients and remote cloud servers leads to significant communication overhead. As the model size increases, existing FL methods incur substantial communication costs. To address this bottleneck, this paper proposes a novel hierarchical federated learning model compression scheme (HierFLMC). This scheme integrates model compression techniques within a hierarchical framework, significantly enhancing communication efficiency between edge devices and cloud servers. In addition, an innovative preliminary soft clustering model update compression algorithm (SCMUC) is proposed. The SCMUC algorithm utilizes the K-means method for initial clustering, effectively reducing the computational complexity associated with traditional soft clustering methods. We validate the proposed scheme using the CIFAR-10 and FEMNIST datasets, demonstrating a 2 % improvement in model accuracy and an 11 % reduction in communication time compared to MUCSC. 
Experimental results indicate that this approach not only achieves a favorable compression rate but also substantially improves communication efficiency.</div></div>\",\"PeriodicalId\":50461,\"journal\":{\"name\":\"Expert Systems with Applications\",\"volume\":\"299 \",\"pages\":\"Article 130022\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-10-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Expert Systems with Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0957417425036383\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Expert Systems with Applications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0957417425036383","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Federated Learning (FL) is an advanced distributed machine learning paradigm that enables clients to collaboratively train a shared neural network model using local datasets, transmitting model parameters instead of raw data. However, in many FL systems, the frequent exchange of model parameters between clients and remote cloud servers leads to significant communication overhead. As the model size increases, existing FL methods incur substantial communication costs. To address this bottleneck, this paper proposes a novel hierarchical federated learning model compression scheme (HierFLMC). This scheme integrates model compression techniques within a hierarchical framework, significantly enhancing communication efficiency between edge devices and cloud servers. In addition, an innovative preliminary soft clustering model update compression algorithm (SCMUC) is proposed. The SCMUC algorithm utilizes the K-means method for initial clustering, effectively reducing the computational complexity associated with traditional soft clustering methods. We validate the proposed scheme using the CIFAR-10 and FEMNIST datasets, demonstrating a 2% improvement in model accuracy and an 11% reduction in communication time compared to MUCSC. Experimental results indicate that this approach not only achieves a favorable compression rate but also substantially improves communication efficiency.
Source journal
Expert Systems with Applications (Engineering Technology - Engineering: Electrical & Electronic)
CiteScore: 13.80
Self-citation rate: 10.60%
Annual articles: 2045
Review time: 8.7 months
About the journal: Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.