{"title":"通过受限拉普拉斯秩进行亲和自适应稀疏子空间聚类","authors":"Ting Yang, Shuisheng Zhou, Zhuan Zhang","doi":"10.1007/s10489-024-05812-4","DOIUrl":null,"url":null,"abstract":"<div><p>Subspace clustering typically clusters data by performing spectral clustering to an affinity matrix constructed in some deterministic ways of self-representation coefficient matrix. Therefore, the quality of the affinity matrix is vital to their performance. However, traditional deterministic ways only provide a feasible affinity matrix but not the most suitable one for showing data structures. Besides, post-processing commonly on the coefficient matrix also affects the affinity matrix’s quality. Furthermore, constructing the affinity matrix is separate from optimizing the coefficient matrix and performing spectral clustering, which can not guarantee the optimal overall result. To this end, we propose a new method, affinity adaptive sparse subspace clustering (AASSC), by adding Laplacian rank constraint into a subspace sparse-representation model to adaptively learn a high-quality affinity matrix having accurate <i>p</i>-connected components from a sparse coefficient matrix without post-processing, where <i>p</i> represents categories. In addition, by relaxing the Laplacian rank constraint into a trace minimization, AASSC naturally combines the operations of the coefficient matrix, affinity matrix, and spectral clustering into a unified optimization, guaranteeing the overall optimal result. Extensive experimental results verify the proposed method to be effective and superior.</p></div>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"54 23","pages":"12378 - 12390"},"PeriodicalIF":3.4000,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Affinity adaptive sparse subspace clustering via constrained Laplacian rank\",\"authors\":\"Ting Yang, Shuisheng Zhou, Zhuan Zhang\",\"doi\":\"10.1007/s10489-024-05812-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Subspace clustering typically clusters data by performing spectral clustering to an affinity matrix constructed in some deterministic ways of self-representation coefficient matrix. Therefore, the quality of the affinity matrix is vital to their performance. However, traditional deterministic ways only provide a feasible affinity matrix but not the most suitable one for showing data structures. Besides, post-processing commonly on the coefficient matrix also affects the affinity matrix’s quality. Furthermore, constructing the affinity matrix is separate from optimizing the coefficient matrix and performing spectral clustering, which can not guarantee the optimal overall result. To this end, we propose a new method, affinity adaptive sparse subspace clustering (AASSC), by adding Laplacian rank constraint into a subspace sparse-representation model to adaptively learn a high-quality affinity matrix having accurate <i>p</i>-connected components from a sparse coefficient matrix without post-processing, where <i>p</i> represents categories. In addition, by relaxing the Laplacian rank constraint into a trace minimization, AASSC naturally combines the operations of the coefficient matrix, affinity matrix, and spectral clustering into a unified optimization, guaranteeing the overall optimal result. 
Extensive experimental results verify the proposed method to be effective and superior.</p></div>\",\"PeriodicalId\":8041,\"journal\":{\"name\":\"Applied Intelligence\",\"volume\":\"54 23\",\"pages\":\"12378 - 12390\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2024-09-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10489-024-05812-4\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-024-05812-4","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Subspace clustering methods typically cluster data by applying spectral clustering to an affinity matrix constructed, in some deterministic way, from a self-representation coefficient matrix. The quality of the affinity matrix is therefore vital to their performance. However, traditional deterministic constructions only yield a feasible affinity matrix, not necessarily the one best suited to revealing the data structure. Moreover, the post-processing commonly applied to the coefficient matrix also degrades the affinity matrix's quality, and constructing the affinity matrix is decoupled from optimizing the coefficient matrix and performing spectral clustering, so the overall result cannot be guaranteed to be optimal. To this end, we propose a new method, affinity adaptive sparse subspace clustering (AASSC), which adds a Laplacian rank constraint to a subspace sparse-representation model so that a high-quality affinity matrix with exactly p connected components, where p is the number of categories, is learned adaptively from the sparse coefficient matrix without post-processing. In addition, by relaxing the Laplacian rank constraint into a trace minimization, AASSC naturally combines the updates of the coefficient matrix, the affinity matrix, and the spectral clustering step into a unified optimization, guaranteeing an overall optimal result. Extensive experimental results verify that the proposed method is effective and superior.
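The central device named in the abstract is the constrained Laplacian rank: for a nonnegative symmetric affinity matrix A with graph Laplacian L_A, rank(L_A) = n - p holds exactly when the graph of A has p connected components, and the standard Ky Fan argument relaxes this constraint into minimizing Tr(F^T L_A F) over orthonormal F in R^{n x p}. The NumPy sketch below illustrates only this relaxation under simple assumptions; the symmetrization A = (|C| + |C|^T)/2 of a sparse coefficient matrix C and all variable names are illustrative choices, not the paper's full alternating optimization.

import numpy as np

def graph_laplacian(A):
    # Unnormalized Laplacian L = D - A of a symmetric, nonnegative affinity matrix.
    return np.diag(A.sum(axis=1)) - A

def laplacian_rank_relaxation(A, p):
    # rank(L) = n - p holds exactly when the affinity graph has p connected
    # components, i.e. the p smallest Laplacian eigenvalues are all zero.
    # By Ky Fan's theorem, min_{F^T F = I, F in R^{n x p}} Tr(F^T L F) equals
    # the sum of those p eigenvalues, attained by the corresponding eigenvectors.
    L = graph_laplacian(A)
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    F = eigvecs[:, :p]                     # optimal spectral embedding for fixed A
    return np.trace(F.T @ L @ F), F

if __name__ == "__main__":
    # Toy affinity from a hypothetical sparse coefficient matrix C, symmetrized
    # as A = (|C| + |C|^T) / 2 (an illustrative convention, not the paper's model).
    rng = np.random.default_rng(0)
    C = np.abs(rng.standard_normal((6, 6))) * (rng.random((6, 6)) < 0.3)
    np.fill_diagonal(C, 0.0)
    A = (np.abs(C) + np.abs(C).T) / 2
    value, F = laplacian_rank_relaxation(A, p=2)
    print("Tr(F^T L F) =", value)  # near zero only if A already has 2 components

Driving this trace toward zero while the coefficient matrix is also being updated is what would let an AASSC-style model read the p clusters directly off the learned affinity graph, which is why no post-processing of the coefficient matrix is needed.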
Journal introduction:
With a focus on research in artificial intelligence and neural networks, this journal addresses issues involving solutions of real-life manufacturing, defense, management, government and industrial problems which are too complex to be solved through conventional approaches and require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.