Robust sparse orthogonal basis clustering for unsupervised feature selection

IF 7.5, CAS Tier 1 (Computer Science), Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Jianyu Miao , Jingjing Zhao , Tiejun Yang , Yingjie Tian , Yong Shi , Mingliang Xu
{"title":"鲁棒稀疏正交基聚类的无监督特征选择","authors":"Jianyu Miao ,&nbsp;Jingjing Zhao ,&nbsp;Tiejun Yang ,&nbsp;Yingjie Tian ,&nbsp;Yong Shi ,&nbsp;Mingliang Xu","doi":"10.1016/j.eswa.2025.126890","DOIUrl":null,"url":null,"abstract":"<div><div>Unsupervised Feature Selection (UFS), which identifies the optimal-related feature subset from the original feature set to lower the dimensionality of data without label information, has had a high profile in recent years. Given the absence of label information, the existing UFS approaches usually utilize graph and manifold learning techniques to retain the intrinsic structure of the data. The inclusion of irrelevant and redundant features and noise, would inevitably lower the quality of the structure. For this purpose, in this paper, we come up with Robust Sparse Orthogonal Basis Clustering (RSOBC), a novel method for UFS that integrates feature selection process with clustering task into a unified framework. Instead of explicitly utilizing the pre-computed local information, such a strategy focuses on exploring the inherent clustering structures of data. RSOBC leverages the log-based function as the loss to lessen the effect of noise and outliers, thereby enhancing its robustness. To select the more useful and discriminative features, the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn><mo>,</mo><mn>1</mn></mrow></msub></math></span> norm is employed as the sparse regularization to encourage sparsity of the projection matrix. Meanwhile, we adopt the low redundancy regularization to make the weights of the correlated features small. In this way, the correlated features cannot be selected simultaneously. Consequently, the projection matrix, centroid matrix and cluster label matrix are learned simultaneously, such that the intrinsic structure is constructed in a more accurate way. The resulting optimization can be readily tackled by multi-block Alternating Direction Method of Multipliers (ADMM) based algorithm. Comprehensive experiments have been carried out on nine diverse real-world datasets. The results demonstrate that RSOBC surpasses many state-of-the-art UFS approaches, which indicates its effectiveness and superiority.</div></div>","PeriodicalId":50461,"journal":{"name":"Expert Systems with Applications","volume":"274 ","pages":"Article 126890"},"PeriodicalIF":7.5000,"publicationDate":"2025-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robust sparse orthogonal basis clustering for unsupervised feature selection\",\"authors\":\"Jianyu Miao ,&nbsp;Jingjing Zhao ,&nbsp;Tiejun Yang ,&nbsp;Yingjie Tian ,&nbsp;Yong Shi ,&nbsp;Mingliang Xu\",\"doi\":\"10.1016/j.eswa.2025.126890\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Unsupervised Feature Selection (UFS), which identifies the optimal-related feature subset from the original feature set to lower the dimensionality of data without label information, has had a high profile in recent years. Given the absence of label information, the existing UFS approaches usually utilize graph and manifold learning techniques to retain the intrinsic structure of the data. The inclusion of irrelevant and redundant features and noise, would inevitably lower the quality of the structure. For this purpose, in this paper, we come up with Robust Sparse Orthogonal Basis Clustering (RSOBC), a novel method for UFS that integrates feature selection process with clustering task into a unified framework. 
Instead of explicitly utilizing the pre-computed local information, such a strategy focuses on exploring the inherent clustering structures of data. RSOBC leverages the log-based function as the loss to lessen the effect of noise and outliers, thereby enhancing its robustness. To select the more useful and discriminative features, the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn><mo>,</mo><mn>1</mn></mrow></msub></math></span> norm is employed as the sparse regularization to encourage sparsity of the projection matrix. Meanwhile, we adopt the low redundancy regularization to make the weights of the correlated features small. In this way, the correlated features cannot be selected simultaneously. Consequently, the projection matrix, centroid matrix and cluster label matrix are learned simultaneously, such that the intrinsic structure is constructed in a more accurate way. The resulting optimization can be readily tackled by multi-block Alternating Direction Method of Multipliers (ADMM) based algorithm. Comprehensive experiments have been carried out on nine diverse real-world datasets. The results demonstrate that RSOBC surpasses many state-of-the-art UFS approaches, which indicates its effectiveness and superiority.</div></div>\",\"PeriodicalId\":50461,\"journal\":{\"name\":\"Expert Systems with Applications\",\"volume\":\"274 \",\"pages\":\"Article 126890\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-02-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Expert Systems with Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0957417425005123\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Expert Systems with Applications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0957417425005123","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Unsupervised Feature Selection (UFS), which identifies the most relevant feature subset from the original feature set to lower the dimensionality of data without label information, has attracted considerable attention in recent years. Given the absence of label information, existing UFS approaches usually rely on graph and manifold learning techniques to preserve the intrinsic structure of the data. However, irrelevant and redundant features, as well as noise, inevitably degrade the quality of this structure. To address this, we propose Robust Sparse Orthogonal Basis Clustering (RSOBC), a novel UFS method that integrates the feature selection process and the clustering task into a unified framework. Instead of explicitly relying on pre-computed local information, this strategy focuses on exploring the inherent clustering structure of the data. RSOBC adopts a log-based function as the loss to lessen the effect of noise and outliers, thereby enhancing robustness. To select more useful and discriminative features, the ℓ2,1 norm is employed as a sparse regularizer to encourage sparsity of the projection matrix. Meanwhile, a low-redundancy regularizer keeps the weights of correlated features small, so that correlated features are not selected simultaneously. Consequently, the projection matrix, centroid matrix, and cluster label matrix are learned jointly, so that the intrinsic structure is constructed more accurately. The resulting optimization problem can be readily solved by a multi-block Alternating Direction Method of Multipliers (ADMM) based algorithm. Comprehensive experiments on nine diverse real-world datasets demonstrate that RSOBC surpasses many state-of-the-art UFS approaches, which indicates its effectiveness and superiority.
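To make the role of the ℓ2,1 penalty concrete, the following Python sketch shows the ℓ2,1 norm and the standard post-hoc feature-ranking step that ℓ2,1-regularized UFS methods typically use: each original feature is scored by the ℓ2 norm of the corresponding row of the learned projection matrix W, and the top-scoring features are kept. This is only an illustrative sketch, not the authors' implementation; the function names, toy data, and randomly generated W are assumptions, and in RSOBC the matrix W would come from the ADMM optimization described above.

import numpy as np

def l21_norm(W):
    # ||W||_{2,1} = sum over rows i of ||w_i||_2
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def select_features(X, W, num_features):
    # Score each original feature by the l2 norm of the corresponding
    # row of the projection matrix W (shape d x k); keep the top scorers.
    row_scores = np.linalg.norm(W, axis=1)
    top_idx = np.argsort(row_scores)[::-1][:num_features]
    return X[:, top_idx], top_idx

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))   # toy data: 100 samples, 20 features
    W = rng.standard_normal((20, 5))     # stand-in projection matrix; RSOBC would learn this
    W[5:] *= 0.05                        # pretend only the first 5 rows carry weight
    X_selected, idx = select_features(X, W, num_features=5)
    print("||W||_{2,1} =", round(l21_norm(W), 3))
    print("selected feature indices:", idx)

Ranking by row norms is precisely why the ℓ2,1 penalty is attractive for feature selection: it drives entire rows of the projection matrix toward zero, so low-scoring rows correspond to features that can be discarded.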
Source journal
Expert Systems with Applications (Engineering Technology - Engineering: Electrical & Electronic)
CiteScore: 13.80
Self-citation rate: 10.60%
Publication volume: 2045
Review time: 8.7 months
Journal description: Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.