{"title":"用四种距离度量和一种直接解释规则对四种简洁聚类方法进行实证检验","authors":"T. A. Alvandyan, S. Shalileh","doi":"10.1134/S1064562424602002","DOIUrl":null,"url":null,"abstract":"<p>Clustering has always been in great demand by scientific and industrial communities. However, due to the lack of ground truth, interpreting its obtained results can be debatable. The current research provides an empirical benchmark on the efficiency of three popular and one recently proposed crisp clustering methods. To this end, we extensively analyzed these (four) methods by applying them to nine real-world and 420 synthetic datasets using four different values of <i>p</i> in Minkowski distance. Furthermore, we validated a previously proposed yet not well-known straightforward rule to interpret the recovered clusters. Our computations showed (i) Nesterov gradient descent clustering is the most effective clustering method using our real-world data, while K-Means had edge over it using our synthetic data; (ii) Minkowski distance with <i>p</i> = 1 is the most effective distance function, (iii) the investigated cluster interpretation rule is intuitive and valid.</p>","PeriodicalId":531,"journal":{"name":"Doklady Mathematics","volume":"110 1 supplement","pages":"S236 - S250"},"PeriodicalIF":0.5000,"publicationDate":"2025-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1134/S1064562424602002.pdf","citationCount":"0","resultStr":"{\"title\":\"An Empirical Scrutinization of Four Crisp Clustering Methods with Four Distance Metrics and One Straightforward Interpretation Rule\",\"authors\":\"T. A. Alvandyan, S. Shalileh\",\"doi\":\"10.1134/S1064562424602002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Clustering has always been in great demand by scientific and industrial communities. However, due to the lack of ground truth, interpreting its obtained results can be debatable. The current research provides an empirical benchmark on the efficiency of three popular and one recently proposed crisp clustering methods. To this end, we extensively analyzed these (four) methods by applying them to nine real-world and 420 synthetic datasets using four different values of <i>p</i> in Minkowski distance. Furthermore, we validated a previously proposed yet not well-known straightforward rule to interpret the recovered clusters. 
Our computations showed (i) Nesterov gradient descent clustering is the most effective clustering method using our real-world data, while K-Means had edge over it using our synthetic data; (ii) Minkowski distance with <i>p</i> = 1 is the most effective distance function, (iii) the investigated cluster interpretation rule is intuitive and valid.</p>\",\"PeriodicalId\":531,\"journal\":{\"name\":\"Doklady Mathematics\",\"volume\":\"110 1 supplement\",\"pages\":\"S236 - S250\"},\"PeriodicalIF\":0.5000,\"publicationDate\":\"2025-03-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://link.springer.com/content/pdf/10.1134/S1064562424602002.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Doklady Mathematics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://link.springer.com/article/10.1134/S1064562424602002\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Doklady Mathematics","FirstCategoryId":"100","ListUrlMain":"https://link.springer.com/article/10.1134/S1064562424602002","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0
Abstract
Clustering has always been in great demand in the scientific and industrial communities. However, due to the lack of ground truth, interpreting the obtained results can be debatable. The current research provides an empirical benchmark of the efficiency of three popular and one recently proposed crisp clustering methods. To this end, we extensively analyzed these four methods by applying them to nine real-world and 420 synthetic datasets using four different values of p in the Minkowski distance. Furthermore, we validated a previously proposed yet not well-known straightforward rule for interpreting the recovered clusters. Our computations showed that (i) Nesterov gradient descent clustering is the most effective clustering method on our real-world data, while K-Means had an edge over it on our synthetic data; (ii) the Minkowski distance with p = 1 is the most effective distance function; and (iii) the investigated cluster interpretation rule is intuitive and valid.
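For concreteness, the benchmark varies the parameter p of the Minkowski distance d_p(x, y) = (Σ_i |x_i − y_i|^p)^(1/p), which reduces to the Manhattan distance at p = 1 and the Euclidean distance at p = 2. The sketch below is a minimal illustration of this distance and of a crisp (hard) assignment step with p = 1; it is not the authors' implementation, and the toy data, the cluster count k = 3, and the random initialization are assumptions made only for the example.

# Minimal sketch, not the paper's code: toy data, k = 3, and the
# initialization are illustrative assumptions.
import numpy as np

def minkowski(u, v, p):
    # d_p(u, v) = (sum_i |u_i - v_i|^p)^(1/p); p = 1 is Manhattan, p = 2 is Euclidean
    return np.sum(np.abs(u - v) ** p) ** (1.0 / p)

def crisp_assign(X, centers, p):
    # crisp clustering: each point is assigned to exactly one (nearest) center under d_p
    d = np.array([[minkowski(x, c, p) for c in centers] for x in X])
    return d.argmin(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                        # toy stand-in for a dataset
centers = X[rng.choice(len(X), size=3, replace=False)]
for _ in range(10):                                  # plain Lloyd-style iterations
    labels = crisp_assign(X, centers, p=1)
    # under p = 1 the coordinate-wise median minimizes the within-cluster distance;
    # keep the old center if a cluster happens to become empty
    centers = np.array([np.median(X[labels == k], axis=0) if np.any(labels == k)
                        else centers[k] for k in range(3)])
print(np.bincount(labels, minlength=3))              # crisp cluster sizes

With p = 2 and means instead of medians, the update step above reduces to the usual K-Means iteration; the Nesterov gradient descent clustering method studied in the paper is not reproduced here.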
Journal description:
Doklady Mathematics is a journal of the Presidium of the Russian Academy of Sciences. It contains English translations of papers published in Doklady Akademii Nauk (Proceedings of the Russian Academy of Sciences), which was founded in 1933 and is published 36 times a year. Doklady Mathematics covers the following areas: mathematics, mathematical physics, computer science, control theory, and computers. It publishes brief scientific reports on previously unpublished significant new research in mathematics and its applications. The main contributors to the journal are Members of the RAS, Corresponding Members of the RAS, and scientists from the former Soviet Union and other foreign countries. Among the contributors are outstanding Russian mathematicians.