Guangtao Wang, Jiayu Zhou, Jingjie Ni, Tingjin Luo, Wei Long, Hai Zhen, G. Cong, Jieping Ye
Title: Robust Self-Tuning Sparse Subspace Clustering
DOI: 10.1109/ICDMW.2017.117 (https://doi.org/10.1109/ICDMW.2017.117)
Published in: 2017 IEEE International Conference on Data Mining Workshops (ICDMW), November 2017
Citation count: 0
Abstract
Sparse subspace clustering (SSC) is an effective approach to clustering high-dimensional data. However, adaptively selecting the number of clusters/eigenvectors for different data sets, especially when the data are corrupted by noise, is a major challenge in SSC and remains an open problem in the field of data mining. In this paper, exploiting the fact that the eigenvectors are robust to noise, we develop a self-adaptive search method that selects the cluster number for SSC using the cluster-separation information carried by the eigenvectors. Our method solves the problem by identifying the cluster centers over the eigenvectors. We first design a new density-based metric, called the centrality coefficient gap, to measure this separation information, and we estimate the cluster centers by maximizing the gap. Once the cluster centers are found, the remaining points are grouped straightforwardly by assigning each point to the cluster of its nearest neighbor of higher density. This yields a new clustering algorithm that eliminates the randomly initialized k-means stage used at the end of traditional SSC. We theoretically verify the correctness of the proposed method on noise-free data. Extensive experiments on synthetic and real-world data corrupted by noise demonstrate the robustness and effectiveness of the proposed method compared to well-established competitors.
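The assignment step described above (grouping each remaining point with its nearest neighbor of higher density) follows the familiar density-peaks idea. A minimal NumPy sketch of that step is shown below, assuming precomputed density values and already-chosen center indices; the paper's centrality coefficient gap metric for selecting the centers is not reproduced here, and the function name and signature are illustrative, not from the paper.

```python
import numpy as np

def assign_by_density(X, density, center_idx):
    """Assign each non-center point to the cluster of its nearest
    neighbor with strictly higher density (density-peaks-style
    assignment). Assumes the centers include the densest points,
    so every non-center point has a higher-density neighbor."""
    n = len(X)
    # Pairwise Euclidean distances between all points.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    labels = np.full(n, -1)
    for k, c in enumerate(center_idx):
        labels[c] = k
    # Visit points in decreasing density order so that each point's
    # nearest higher-density neighbor is already labeled when reached.
    for i in np.argsort(-density):
        if labels[i] != -1:
            continue
        higher = np.where(density > density[i])[0]
        nearest = higher[np.argmin(d[i, higher])]
        labels[i] = labels[nearest]
    return labels
```

For example, with two well-separated groups of points and the densest point of each group passed as a center, the remaining points inherit the label of their group's center through their nearest higher-density neighbor.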