Sparse and robust alternating direction method of multipliers for large-scale classification learning
Huajun Wang, Wenqian Li, Yuanhai Shao, Hongwei Zhang
Neurocomputing, Volume 652, Article 130893 (published 2025-07-12)
DOI: 10.1016/j.neucom.2025.130893
Citations: 0
Abstract
The support vector machine (SVM) is a highly effective method for classification learning. Nonetheless, when faced with large-scale classification problems, its high computational complexity can pose a significant obstacle. To tackle this problem, we establish a new trimmed squared loss SVM model, called TSVM, designed to achieve sparsity and robustness at the same time. A novel optimality theory is developed for the nonsmooth and nonconvex TSVM. Building on this theory, a fast alternating direction method of multipliers with low computational complexity and a working-set strategy is proposed to solve TSVM. Numerical tests show the effectiveness of the new method in computational speed, number of support vectors, and classification accuracy, outperforming eight alternative top solvers. As an illustration, when tackling a real dataset with more than 10⁷ instances, our algorithm exhibited a 34-fold improvement in computation time over seven other algorithms, alongside a 6.5% improvement in accuracy and a 25-fold decrease in support vector rates.
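To make the "trimmed squared loss" idea concrete, the sketch below shows one common way such a loss can be formed: compute a squared hinge loss per sample, then discard the largest per-sample losses so that outliers or mislabeled points cannot dominate the objective (samples with zero loss also contribute nothing, which encourages sparsity in the support set). This is a minimal illustration of the general trimming concept, not the paper's actual TSVM formulation or its ADMM solver; the function name and the `trim_frac` parameter are hypothetical.

```python
import numpy as np

def trimmed_squared_hinge_loss(w, X, y, trim_frac=0.1):
    """Illustrative trimmed squared hinge loss (hypothetical, not the paper's TSVM).

    w         : weight vector, shape (d,)
    X, y      : data matrix (n, d) and labels in {-1, +1}, shape (n,)
    trim_frac : fraction of the largest per-sample losses to discard
    """
    margins = 1.0 - y * (X @ w)                 # per-sample margin violations
    losses = np.maximum(margins, 0.0) ** 2      # squared hinge loss per sample
    k = int(np.ceil((1.0 - trim_frac) * len(losses)))
    # Keep only the k smallest losses; samples with the largest losses
    # (typically outliers) are trimmed away, giving robustness.
    return np.sort(losses)[:k].sum()
```

Note that the resulting objective is nonsmooth (from the hinge) and nonconvex (from the trimming), which is exactly why the abstract emphasizes a dedicated optimality theory for the nonsmooth, nonconvex model.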
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering neurocomputing theory, practice, and applications as its essential topics.