Affinity adaptive sparse subspace clustering via constrained Laplacian rank

IF 3.4 · CAS Zone 2 (Computer Science) · Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Ting Yang, Shuisheng Zhou, Zhuan Zhang
Journal: Applied Intelligence, vol. 54, no. 23, pp. 12378–12390
DOI: 10.1007/s10489-024-05812-4
Published: 2024-09-13 (Journal Article)
Full text: https://link.springer.com/article/10.1007/s10489-024-05812-4
Citations: 0

Abstract


Subspace clustering typically clusters data by applying spectral clustering to an affinity matrix constructed, in some deterministic way, from a self-representation coefficient matrix. The quality of the affinity matrix is therefore vital to performance. However, traditional deterministic constructions provide only a feasible affinity matrix, not the one best suited to revealing the data structure. Moreover, the post-processing commonly applied to the coefficient matrix also affects the affinity matrix's quality. Furthermore, constructing the affinity matrix is decoupled from optimizing the coefficient matrix and performing spectral clustering, so the overall result cannot be guaranteed to be optimal. To this end, we propose a new method, affinity adaptive sparse subspace clustering (AASSC), which adds a Laplacian rank constraint to a subspace sparse-representation model to adaptively learn, without post-processing, a high-quality affinity matrix with exactly p connected components from a sparse coefficient matrix, where p is the number of categories. In addition, by relaxing the Laplacian rank constraint into a trace minimization, AASSC naturally combines the coefficient matrix, the affinity matrix, and spectral clustering into a unified optimization, guaranteeing an overall optimal result. Extensive experimental results verify that the proposed method is effective and superior.
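The Laplacian rank constraint in the abstract rests on a standard spectral-graph fact: an affinity graph on n nodes has exactly p connected components if and only if its Laplacian L has rank n − p, i.e. the eigenvalue 0 has multiplicity p; by the Ky Fan theorem, the sum of the p smallest eigenvalues equals min over orthonormal F of trace(FᵀLF), which is the trace relaxation mentioned above. The NumPy sketch below (not the authors' code; the toy two-clique affinity matrix is an illustrative assumption) checks this fact numerically:

```python
import numpy as np

# Toy affinity matrix: two disjoint 3-node cliques, so the graph has p = 2 components.
A = np.zeros((6, 6))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
np.fill_diagonal(A, 0.0)

# Unnormalized graph Laplacian L = D - A, with D the diagonal degree matrix.
L = np.diag(A.sum(axis=1)) - A

# Multiplicity of the zero eigenvalue = number of connected components,
# i.e. rank(L) = n - p.
eigvals = np.linalg.eigvalsh(L)
num_zero = int(np.sum(eigvals < 1e-10))
print(num_zero)  # prints 2

# Ky Fan: sum of the p smallest eigenvalues = min_{F^T F = I} trace(F^T L F).
# Driving this trace to 0 is what enforces exactly p components.
p = 2
print(eigvals[:p].sum())  # ≈ 0 for a graph with p components
```

Constraining rank(L) directly is combinatorial, which is why the paper relaxes it to minimizing this trace term inside the sparse self-representation objective.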

Source journal: Applied Intelligence (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 6.60
Self-citation rate: 20.80%
Articles published per year: 1361
Average review time: 5.9 months
Journal description: With a focus on research in artificial intelligence and neural networks, this journal addresses issues involving solutions to real-life manufacturing, defense, management, government, and industrial problems which are too complex to be solved through conventional approaches and which require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance. The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.