Distributed Center-Based Clustering: A Unified Framework

Impact factor: 5.8 · JCR Q1, Engineering, Electrical & Electronic
Aleksandar Armacki; Dragana Bajović; Dušan Jakovetić; Soummya Kar
IEEE Transactions on Signal Processing, vol. 73, pp. 903–918. Published 2025-01-20. DOI: 10.1109/TSP.2025.3531292
Citations: 0

Abstract

We develop a family of distributed center-based clustering algorithms that work over connected networks of users. In the proposed scenario, users contain a local dataset and communicate only with their immediate neighbours, with the aim of finding a clustering of the full, joint data. The proposed family, termed Distributed Gradient Clustering (DGC-$\mathcal{F}_{\rho}$), is parametrized by $\rho\geq 1$, controlling the proximity of users’ center estimates, with $\mathcal{F}$ determining the clustering loss. Our framework allows for a broad class of smooth convex loss functions, including popular clustering losses like $K$-means and Huber loss. Specialized to $K$-means and Huber loss, DGC-$\mathcal{F}_{\rho}$ gives rise to novel distributed clustering algorithms DGC-KM${}_{\rho}$ and DGC-HL${}_{\rho}$, while novel clustering losses based on the logistic and fair loss lead to DGC-LL${}_{\rho}$ and DGC-FL${}_{\rho}$. We provide a unified analysis and establish several strong results, under mild assumptions. First, the sequence of centers generated by the methods converges to a well-defined notion of fixed point, under any center initialization and value of $\rho$. Second, as $\rho$ increases, the family of fixed points produced by DGC-$\mathcal{F}_{\rho}$ converges to a notion of consensus fixed points. We show that consensus fixed points of DGC-$\mathcal{F}_{\rho}$ are equivalent to fixed points of gradient clustering over the full data, guaranteeing a clustering of the full data is produced. For the special case of Bregman losses, we show that our fixed points converge to the set of Lloyd points. Numerical experiments on real data confirm our theoretical findings and demonstrate strong performance of the methods.
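To make the scheme concrete, the following is a minimal illustrative sketch of one synchronous iteration of a DGC-style update specialized to the K-means loss. The exact update rule, mixing weights, and the role of ρ in the paper may differ; here `dgc_step`, `local_kmeans_grad`, the doubly stochastic weight matrix `W`, and the interpolation `(c + ρ·mixed)/(1 + ρ)` (so that larger ρ pulls users' center estimates closer together, as in the abstract) are all assumptions made for illustration, not the authors' method.

```python
import numpy as np

def local_kmeans_grad(centers, X):
    """Gradient of the local K-means loss (1/2) * sum_i ||x_i - c_{a(i)}||^2
    w.r.t. the centers, where a(i) assigns each point to its nearest center."""
    grad = np.zeros_like(centers)
    # nearest-center assignment for every local point
    assign = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    for k in range(centers.shape[0]):
        pts = X[assign == k]
        if len(pts):
            grad[k] = centers[k] * len(pts) - pts.sum(axis=0)
    return grad

def dgc_step(centers, datasets, W, rho, step=0.05):
    """One synchronous iteration of a hypothetical DGC-KM-style update:
    each user mixes its center estimates with its neighbours' (weights W),
    with the pull toward the neighbourhood average growing in rho, then
    takes a gradient step on its local K-means loss."""
    n = len(centers)
    new = []
    for u in range(n):
        # consensus mixing: convex combination of neighbours' estimates
        mixed = sum(W[u][v] * centers[v] for v in range(n))
        # larger rho => estimates pulled closer to the neighbourhood average
        pulled = (centers[u] + rho * mixed) / (1.0 + rho)
        new.append(pulled - step * local_kmeans_grad(pulled, datasets[u]))
    return new

# tiny demo: two users, each holding 1-D points around 0 and around 10
data = [np.array([[0.0], [0.5], [-0.5], [10.0], [9.5], [10.5]])] * 2
W = [[0.5, 0.5], [0.5, 0.5]]          # symmetric mixing weights
centers = [np.array([[1.0], [9.0]]) for _ in range(2)]
for _ in range(200):
    centers = dgc_step(centers, data, W, rho=10.0)
```

In this toy run both users' center estimates agree (consensus) and settle near the two cluster means, mirroring the abstract's claim that, for large ρ, fixed points approach consensus fixed points of clustering over the full data.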
Source journal

IEEE Transactions on Signal Processing (Engineering: Electrical & Electronic)

CiteScore: 11.20 · Self-citation rate: 9.30% · Articles per year: 310 · Review time: 3.0 months
Journal description: The IEEE Transactions on Signal Processing covers novel theory, algorithms, performance analyses and applications of techniques for the processing, understanding, learning, retrieval, mining, and extraction of information from signals. The term “signal” includes, among others, audio, video, speech, image, communication, geophysical, sonar, radar, medical and musical signals. Examples of topics of interest include, but are not limited to, information processing and the theory and application of filtering, coding, transmitting, estimating, detecting, analyzing, recognizing, synthesizing, recording, and reproducing signals.