{"title":"基于聚类注入的低秩子空间学习的鲁棒多标签分类","authors":"Ziyue Zhu, Conghua Zhou, Shijie Sun, Emmanuel Ntaye, Xiang-Jun Shen, Zhifeng Liu","doi":"10.1007/s10489-025-06837-z","DOIUrl":null,"url":null,"abstract":"<div><p>Multi-label learning in high-dimensional spaces Suffers from the curse of dimensionality, noisy labels, and complex feature-label dependencies. Traditional deep learning solutions for multi-label classification employ multi-layer networks but overfit and generalize poorly owing to ineffective high-order data dependencies. In this paper, we introduce a cluster-infused low-rank subspace learning framework that integrates low-rank subspace learning with cluster infusion to solve these issues. Our model resolves sensitivity to noise, overfitting and poor generalization in high-dimensional data by using low-rank subspace representation decomposition of the classifier for dimension reduction and low-rank classifier for discriminative classification. To enhance robustness, we reconstruct each data sample as a Linear combination of its neighbours, infusing clustering-derived features into the model. These facilitate feature robustness via local correlations, thereby improving noise resilience and discriminative power. Extensive experiments on benchmark high-dimensional datasets, compared against state-of-the-art approaches, indicate that our approach significantly improves classification accuracy and robustness, making it a good solution for noisy, high-dimensional multi-label classification tasks. This effectiveness is evidenced across datasets of various scales, including a 3.04% improvement in Example-F1 over CNN-RNN on the smaller 20NG dataset and a significant 9.9% gain in Micro-F1 against RethinkNet on the large-scale NUS-WIDE dataset, highlighting DL-CS’s superiority for diverse multi-label classification tasks.</p></div>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 14","pages":""},"PeriodicalIF":3.5000,"publicationDate":"2025-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Cluster-infused low-rank subspace learning for robust multi-label classification\",\"authors\":\"Ziyue Zhu, Conghua Zhou, Shijie Sun, Emmanuel Ntaye, Xiang-Jun Shen, Zhifeng Liu\",\"doi\":\"10.1007/s10489-025-06837-z\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Multi-label learning in high-dimensional spaces Suffers from the curse of dimensionality, noisy labels, and complex feature-label dependencies. Traditional deep learning solutions for multi-label classification employ multi-layer networks but overfit and generalize poorly owing to ineffective high-order data dependencies. In this paper, we introduce a cluster-infused low-rank subspace learning framework that integrates low-rank subspace learning with cluster infusion to solve these issues. Our model resolves sensitivity to noise, overfitting and poor generalization in high-dimensional data by using low-rank subspace representation decomposition of the classifier for dimension reduction and low-rank classifier for discriminative classification. To enhance robustness, we reconstruct each data sample as a Linear combination of its neighbours, infusing clustering-derived features into the model. These facilitate feature robustness via local correlations, thereby improving noise resilience and discriminative power. 
Extensive experiments on benchmark high-dimensional datasets, compared against state-of-the-art approaches, indicate that our approach significantly improves classification accuracy and robustness, making it a good solution for noisy, high-dimensional multi-label classification tasks. This effectiveness is evidenced across datasets of various scales, including a 3.04% improvement in Example-F1 over CNN-RNN on the smaller 20NG dataset and a significant 9.9% gain in Micro-F1 against RethinkNet on the large-scale NUS-WIDE dataset, highlighting DL-CS’s superiority for diverse multi-label classification tasks.</p></div>\",\"PeriodicalId\":8041,\"journal\":{\"name\":\"Applied Intelligence\",\"volume\":\"55 14\",\"pages\":\"\"},\"PeriodicalIF\":3.5000,\"publicationDate\":\"2025-09-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10489-025-06837-z\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-025-06837-z","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Cluster-infused low-rank subspace learning for robust multi-label classification
Multi-label learning in high-dimensional spaces suffers from the curse of dimensionality, noisy labels, and complex feature-label dependencies. Traditional deep learning solutions for multi-label classification employ multi-layer networks, but they overfit and generalize poorly because they fail to capture high-order data dependencies effectively. In this paper, we introduce a cluster-infused low-rank subspace learning framework that integrates low-rank subspace learning with cluster infusion to address these issues. Our model counters noise sensitivity, overfitting, and poor generalization in high-dimensional data by decomposing the classifier into a low-rank subspace representation for dimensionality reduction and a low-rank classifier for discriminative classification. To enhance robustness, we reconstruct each data sample as a linear combination of its neighbours and infuse the resulting clustering-derived features into the model. These features exploit local correlations, thereby improving noise resilience and discriminative power. Extensive experiments on benchmark high-dimensional datasets, compared against state-of-the-art approaches, show that our approach significantly improves classification accuracy and robustness, making it well suited to noisy, high-dimensional multi-label classification tasks. This effectiveness is evident across datasets of various scales, including a 3.04% improvement in Example-F1 over CNN-RNN on the smaller 20NG dataset and a 9.9% gain in Micro-F1 over RethinkNet on the large-scale NUS-WIDE dataset, highlighting DL-CS's superiority across diverse multi-label classification tasks.
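The abstract describes two mechanisms: decomposing the classifier into a low-rank subspace projection followed by a small classifier, and reconstructing each sample as a linear combination of its neighbours to obtain clustering-derived features. The paper's exact objective and solver are not reproduced here; the NumPy sketch below is only an illustration under our own assumptions — the function names, the least-squares surrogate loss, the gradient-descent solver, and the parameters k, rank, and lam are all hypothetical and not taken from the paper.

```python
# Illustrative sketch only: NOT the authors' formulation. It shows (1) a
# neighbour-based reconstruction step and (2) a classifier factorised as a
# low-rank subspace projection P times a small classifier Q.
import numpy as np

def neighbour_reconstruction_features(X, k=10):
    """Reconstruct each sample as a linear combination of its k nearest
    neighbours (in the spirit of locally linear reconstruction) and return
    the reconstructed matrix as locality-infused features."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    X_rec = np.zeros_like(X)
    # pairwise squared Euclidean distances; exclude self-matches
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    for i in range(n):
        idx = np.argsort(d2[i])[:min(k, n - 1)]   # k nearest neighbours of sample i
        Z = X[idx]                                 # (k, d)
        # ridge-regularised least-squares weights minimising ||x_i - w @ Z||^2
        G = Z @ Z.T + 1e-3 * np.eye(len(idx))
        w = np.linalg.solve(G, Z @ X[i])
        w /= w.sum() + 1e-12                       # normalise weights to sum to 1
        X_rec[i] = w @ Z
    return X_rec

def fit_low_rank_classifier(X, Y, rank=20, lam=1e-2, iters=200, lr=1e-2):
    """Learn a classifier W factorised as W = P @ Q with rank << min(d, c):
    P projects features into a low-rank subspace, Q classifies inside it.
    Uses plain gradient descent on a least-squares surrogate loss."""
    d, c = X.shape[1], Y.shape[1]
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.01, size=(d, rank))     # subspace projection (dimensionality reduction)
    Q = rng.normal(scale=0.01, size=(rank, c))     # classifier in the low-rank subspace
    for _ in range(iters):
        R = X @ P @ Q - Y                          # residual of the surrogate loss
        gP = X.T @ R @ Q.T / len(X) + lam * P      # gradients with ridge regularisation
        gQ = P.T @ X.T @ R / len(X) + lam * Q
        P -= lr * gP
        Q -= lr * gQ
    return P, Q

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 50))                        # 200 samples, 50 features
    Y = (rng.random(size=(200, 5)) > 0.7).astype(float)   # 5 binary labels
    # blend raw features with neighbour-reconstructed ones (blending weight is arbitrary)
    X_aug = 0.5 * X + 0.5 * neighbour_reconstruction_features(X, k=10)
    P, Q = fit_low_rank_classifier(X_aug, Y, rank=8)
    scores = X_aug @ P @ Q                                # per-label scores, to be thresholded
```

Factorising W as P @ Q is only one common way to impose a rank constraint; the paper may instead use a nuclear-norm penalty or another decomposition, and its cluster-infusion term may enter the objective differently than the simple feature blending shown here.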
About the journal:
With a focus on research in artificial intelligence and neural networks, this journal addresses issues involving solutions to real-life manufacturing, defense, management, government, and industrial problems that are too complex to be solved through conventional approaches and that require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.