{"title":"Universal Algorithms for Clustering Problems","authors":"Arun Ganesh, Bruce M. Maggs, Debmalya Panigrahi","doi":"https://dl.acm.org/doi/10.1145/3572840","DOIUrl":null,"url":null,"abstract":"<p>This article presents <i>universal</i> algorithms for clustering problems, including the widely studied <i>k</i>-median, <i>k</i>-means, and <i>k</i>-center objectives. The input is a metric space containing all <i>potential</i> client locations. The algorithm must select <i>k</i> cluster centers such that they are a good solution for <i>any</i> subset of clients that actually realize. Specifically, we aim for low <i>regret</i>, defined as the maximum over all subsets of the difference between the cost of the algorithm’s solution and that of an optimal solution. A universal algorithm’s solution <span>Sol</span> for a clustering problem is said to be an α , β-approximation if for all subsets of clients <i>C<sup>′</sup></i>, it satisfies <span>sol</span> (<i>C</i><sup>′</sup>) ≤ α ċ <span>opt</span> (<i>C</i>′) + β ċ <span>mr</span>, where <span>opt</span> (<i>C</i>′ is the cost of the optimal solution for clients (<i>C</i>′) and <span>mr</span> is the minimum regret achievable by any solution.</p><p>Our main results are universal algorithms for the standard clustering objectives of <i>k</i>-median, <i>k</i>-means, and <i>k</i>-center that achieve (<i>O</i>(1), <i>O</i>(1))-approximations. These results are obtained via a novel framework for universal algorithms using linear programming (LP) relaxations. These results generalize to other ℓ<i><sub>p</sub></i>-objectives and the setting where some subset of the clients are <i>fixed</i>. We also give hardness results showing that (α, β)-approximation is NP-hard if α or β is at most a certain constant, even for the widely studied special case of Euclidean metric spaces. This shows that in some sense, (<i>O</i>(1), <i>O</i>(1))-approximation is the strongest type of guarantee obtainable for universal clustering.</p>","PeriodicalId":50922,"journal":{"name":"ACM Transactions on Algorithms","volume":"8 7","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2023-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Algorithms","FirstCategoryId":"94","ListUrlMain":"https://doi.org/https://dl.acm.org/doi/10.1145/3572840","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Abstract
This article presents universal algorithms for clustering problems, including the widely studied k-median, k-means, and k-center objectives. The input is a metric space containing all potential client locations. The algorithm must select k cluster centers such that they are a good solution for any subset of clients that actually realize. Specifically, we aim for low regret, defined as the maximum over all subsets of the difference between the cost of the algorithm’s solution and that of an optimal solution. A universal algorithm’s solution sol for a clustering problem is said to be an (α, β)-approximation if, for all subsets of clients C′, it satisfies sol(C′) ≤ α · opt(C′) + β · mr, where opt(C′) is the cost of the optimal solution for clients C′ and mr is the minimum regret achievable by any solution.
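To make these definitions concrete, the following is a minimal brute-force sketch (not from the paper) that evaluates them on a toy k-median instance: it computes opt(C′) for every client subset C′, the regret of a candidate set of centers, the minimum regret mr, and whether a given solution is an (α, β)-approximation. The point set, the line metric, and the restriction of centers to input points are illustrative assumptions.

```python
from itertools import combinations

# Toy metric: points on a line, distance = absolute difference (an assumption).
points = [0, 2, 3, 7, 11]
k = 2

def dist(a, b):
    return abs(a - b)

def cost(centers, clients):
    # k-median cost: each client pays its distance to the nearest center.
    return sum(min(dist(c, f) for f in centers) for c in clients)

def all_subsets():
    # Every non-empty subset of potential clients.
    return [s for r in range(1, len(points) + 1)
            for s in combinations(points, r)]

def opt(clients):
    # Cost of the best k centers for this particular realized subset.
    return min(cost(centers, clients) for centers in combinations(points, k))

def regret(centers):
    # Maximum over client subsets C' of cost(centers, C') - opt(C').
    return max(cost(centers, s) - opt(s) for s in all_subsets())

# mr: the minimum regret achievable by any choice of k centers.
mr = min(regret(centers) for centers in combinations(points, k))

def is_alpha_beta_approx(centers, alpha, beta):
    # Check sol(C') <= alpha * opt(C') + beta * mr for every subset C'.
    return all(cost(centers, s) <= alpha * opt(s) + beta * mr
               for s in all_subsets())

sol = min(combinations(points, k), key=regret)   # minimum-regret center set
print(mr, sol, is_alpha_beta_approx(sol, 1, 1))  # sol satisfies (1, 1) by definition
```

This exhaustive check is exponential in the number of potential clients and is meant only to illustrate the definitions; the paper's algorithms avoid such enumeration.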
Our main results are universal algorithms for the standard clustering objectives of k-median, k-means, and k-center that achieve (O(1), O(1))-approximations. These results are obtained via a novel framework for universal algorithms using linear programming (LP) relaxations. These results generalize to other ℓp-objectives and to the setting where some subset of the clients is fixed. We also give hardness results showing that (α, β)-approximation is NP-hard if α or β is at most a certain constant, even for the widely studied special case of Euclidean metric spaces. This shows that, in some sense, (O(1), O(1))-approximation is the strongest type of guarantee obtainable for universal clustering.
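As an aside on the ℓp-objectives mentioned above, a common convention takes the clustering cost to be the ℓp norm of the vector of client-to-nearest-center distances, so that p = 1 recovers k-median, p = ∞ recovers k-center, and the usual k-means objective is the square of the ℓ2 cost. The sketch below (with an assumed toy point set, not taken from the paper) evaluates one center set under all three.

```python
import math

# Toy instance: clients in the plane, Euclidean distance (an assumption).
points = [(0.0, 0.0), (1.0, 0.0), (4.0, 3.0), (5.0, 5.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def lp_cost(centers, clients, p):
    # l_p objective: the l_p norm of the vector of client-to-nearest-center
    # distances; p = math.inf gives the maximum, i.e., the k-center cost.
    nearest = [min(dist(c, f) for f in centers) for c in clients]
    if p == math.inf:
        return max(nearest)
    return sum(d ** p for d in nearest) ** (1.0 / p)

centers = [points[0], points[3]]
print(lp_cost(centers, points, 1))           # k-median: sum of distances
print(lp_cost(centers, points, 2) ** 2)      # k-means: sum of squared distances
print(lp_cost(centers, points, math.inf))    # k-center: maximum distance
```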
Journal introduction:
ACM Transactions on Algorithms welcomes submissions of original research of the highest quality dealing with algorithms that are inherently discrete and finite, and having mathematical content in a natural way, either in the objective or in the analysis. Most welcome are new algorithms and data structures, new and improved analyses, and complexity results. Specific areas of computation covered by the journal include:
combinatorial searches and objects;
counting;
discrete optimization and approximation;
randomization and quantum computation;
parallel and distributed computation;
algorithms for graphs, geometry, arithmetic, number theory, and strings;
on-line analysis;
cryptography;
coding;
data compression;
learning algorithms;
methods of algorithmic analysis;
discrete algorithms for application areas such as biology, economics, game theory, communication, computer systems and architecture, hardware design, and scientific computing.