Robust low-rank representation with structured similarity learning for multi-label classification

Emmanuel Ntaye, Conghua Zhou, Zhifeng Liu, Heping Song, Fadilul-lah Yassaanah Issahaku, Xiang-Jun Shen

Applied Intelligence, volume 55, issue 15. Published: 2025-09-22.
DOI: 10.1007/s10489-025-06879-3
URL: https://link.springer.com/article/10.1007/s10489-025-06879-3
JCR: Q2 (Computer Science, Artificial Intelligence); Impact Factor 3.5
Citations: 0
Abstract
Handling high-dimensional, noisy data in multi-label classification is challenging, as feature abundance and noise obscure actual data-label relationships. Traditional approaches often model labels and features independently, limiting dependency modeling and noise reduction. To address this, we propose a unified framework combining low-rank representation using nuclear norm regularization with structured similarity learning. This simultaneously projects features and labels into low-rank spaces while preserving key inter-sample and inter-label relationships through structural constraints, further capturing fine-grained correlations via a learned similarity matrix. Extensive experiments on five benchmark datasets show our model outperforms state-of-the-art methods, achieving a 16% reduction in Hamming Loss and a 14% improvement in Micro-F1 on high-dimensional, noisy datasets such as CAL500 and Corel16k7, with consistent gains in Macro-F1 and Example-F1. These results demonstrate the model's strong capability for noisy, high-dimensional multi-label classification.
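The abstract's core building block, nuclear norm regularization, is typically optimized via singular value thresholding (SVT), the proximal operator of the nuclear norm. The sketch below is a minimal, generic illustration of that standard operation in NumPy; it is not the paper's actual algorithm, and the function name, threshold value, and synthetic data are assumptions for demonstration only.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * nuclear norm.
    Shrinks each singular value by tau (clipping at zero),
    which drives small singular values to zero and yields
    a low-rank, noise-suppressed approximation of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

# Illustrative (hypothetical) data: a rank-1 signal plus small noise.
rng = np.random.default_rng(0)
u = rng.standard_normal((50, 1))
v = rng.standard_normal((1, 30))
X = u @ v + 0.05 * rng.standard_normal((50, 30))

X_lr = svt(X, tau=1.0)  # noise singular values fall below tau and vanish
print(np.linalg.matrix_rank(X_lr, tol=1e-6))  # low rank recovered
```

Here the noise component's singular values are small relative to the threshold, so thresholding removes them while the dominant signal direction survives; this is the mechanism by which nuclear norm regularization suppresses noise in low-rank projections.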
About the journal:
Focusing on research in artificial intelligence and neural networks, this journal addresses real-life manufacturing, defense, management, government, and industrial problems that are too complex for conventional approaches and require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments addressing real, complex issues in difficult problems, and provides a medium for exchanging scientific research and technological achievements accomplished by the international community.