{"title":"灵活高效的高维支持向量回归算法","authors":"Menglei Yang , Hao Liang , Xiaofei Wu , Zhimin Zhang","doi":"10.1016/j.neucom.2024.128671","DOIUrl":null,"url":null,"abstract":"<div><div>In high dimensional statistical learning, variable selection and handling highly correlated phenomena are two crucial topics. Elastic-net regularization can automatically perform variable selection and tends to either simultaneously select or remove highly correlated variables. Consequently, it has been widely applied in machine learning. In this paper, we incorporate elastic-net regularization into the support vector regression model, introducing the Elastic-net Support Vector Regression (En-SVR) model. Due to the inclusion of elastic-net regularization, the En-SVR model possesses the capability of variable selection, addressing high dimensional and highly correlated statistical learning problems. However, the optimization problem for the En-SVR model is rather complex, and common methods for solving the En-SVR model are challenging. Nevertheless, we observe that the optimization problem for the En-SVR model can be reformulated as a convex optimization problem where the objective function is separable into multiple blocks and connected by an inequality constraint. Therefore, we employ a novel and efficient Alternating Direction Method of Multipliers (ADMM) algorithm to solve the En-SVR model, and provide a complexity analysis as well as convergence analysis for the algorithm. 
Furthermore, extensive numerical experiments validate the outstanding performance of the En-SVR model in high dimensional statistical learning and the efficiency of this novel ADMM algorithm.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":null,"pages":null},"PeriodicalIF":5.5000,"publicationDate":"2024-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A flexible and efficient algorithm for high dimensional support vector regression\",\"authors\":\"Menglei Yang , Hao Liang , Xiaofei Wu , Zhimin Zhang\",\"doi\":\"10.1016/j.neucom.2024.128671\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>In high dimensional statistical learning, variable selection and handling highly correlated phenomena are two crucial topics. Elastic-net regularization can automatically perform variable selection and tends to either simultaneously select or remove highly correlated variables. Consequently, it has been widely applied in machine learning. In this paper, we incorporate elastic-net regularization into the support vector regression model, introducing the Elastic-net Support Vector Regression (En-SVR) model. Due to the inclusion of elastic-net regularization, the En-SVR model possesses the capability of variable selection, addressing high dimensional and highly correlated statistical learning problems. However, the optimization problem for the En-SVR model is rather complex, and common methods for solving the En-SVR model are challenging. Nevertheless, we observe that the optimization problem for the En-SVR model can be reformulated as a convex optimization problem where the objective function is separable into multiple blocks and connected by an inequality constraint. 
Therefore, we employ a novel and efficient Alternating Direction Method of Multipliers (ADMM) algorithm to solve the En-SVR model, and provide a complexity analysis as well as convergence analysis for the algorithm. Furthermore, extensive numerical experiments validate the outstanding performance of the En-SVR model in high dimensional statistical learning and the efficiency of this novel ADMM algorithm.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231224014425\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224014425","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A flexible and efficient algorithm for high dimensional support vector regression
In high dimensional statistical learning, variable selection and the handling of highly correlated variables are two crucial topics. Elastic-net regularization performs variable selection automatically and tends to select or discard groups of highly correlated variables together; consequently, it has been widely applied in machine learning. In this paper, we incorporate elastic-net regularization into the support vector regression model, introducing the Elastic-net Support Vector Regression (En-SVR) model. Owing to the elastic-net penalty, the En-SVR model possesses the capability of variable selection and addresses high dimensional, highly correlated statistical learning problems. The resulting optimization problem, however, is rather complex, and common solution methods handle it with difficulty. Nevertheless, we observe that the optimization problem for the En-SVR model can be reformulated as a convex optimization problem whose objective function separates into multiple blocks connected by an inequality constraint. We therefore employ a novel and efficient Alternating Direction Method of Multipliers (ADMM) algorithm to solve the En-SVR model, and provide a complexity analysis as well as a convergence analysis for the algorithm. Furthermore, extensive numerical experiments validate the outstanding performance of the En-SVR model in high dimensional statistical learning and the efficiency of this novel ADMM algorithm.
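The paper's precise formulation and ADMM updates are in the full text; purely as an illustration of the general idea, the sketch below implements a generic consensus ADMM for an elastic-net-penalized, epsilon-insensitive regression, assuming the objective (1/n) * sum_i max(0, |y_i - x_i^T beta| - eps) + lam1*||beta||_1 + (lam2/2)*||beta||_2^2. The splitting (auxiliary copies z of the residual and w of the coefficients), the parameter names, and the update rules here are all assumptions for this sketch, not the authors' algorithm.

```python
import numpy as np

def soft_threshold(a, k):
    # Elementwise prox of k * ||.||_1 (soft-thresholding).
    return np.sign(a) * np.maximum(np.abs(a) - k, 0.0)

def prox_eps_insensitive(a, eps, c):
    # Elementwise prox of c * max(0, |z| - eps):
    # values inside the eps-tube are left alone; outside it,
    # they are pulled toward the tube boundary by at most c.
    z = a.copy()
    hi = a > eps
    lo = a < -eps
    z[hi] = np.maximum(a[hi] - c, eps)
    z[lo] = np.minimum(a[lo] + c, -eps)
    return z

def en_svr_admm(X, y, lam1=0.01, lam2=0.01, eps=0.1, rho=1.0, n_iter=300):
    """Sketch of consensus ADMM for elastic-net eps-insensitive regression.

    Splitting: z = X @ beta - y (residual copy), w = beta (coefficient copy),
    with scaled duals u and v. Returns the soft-thresholded copy w, which
    contains exact zeros (variable selection).
    """
    n, p = X.shape
    beta = np.zeros(p)
    z = np.zeros(n)   # copy of the residual X beta - y
    w = np.zeros(p)   # copy of beta, carries the l1 penalty
    u = np.zeros(n)   # scaled dual for the z-constraint
    v = np.zeros(p)   # scaled dual for the w-constraint
    # beta-update solves a ridge-like linear system; invert once up front.
    A_inv = np.linalg.inv(rho * X.T @ X + (lam2 + rho) * np.eye(p))
    for _ in range(n_iter):
        # beta-update: quadratic minimization (l2 penalty + both couplings)
        beta = A_inv @ (rho * X.T @ (y + z - u) + rho * (w - v))
        r = X @ beta - y
        # z-update: prox of the (1/n)-weighted eps-insensitive loss
        z = prox_eps_insensitive(r + u, eps, 1.0 / (n * rho))
        # w-update: prox of the l1 penalty -> sparsity
        w = soft_threshold(beta + v, lam1 / rho)
        # scaled dual ascent on both constraints
        u += r - z
        v += beta - w
    return w
```

On sparse data with modest noise, the returned w recovers the active coefficients and zeros out the rest, which is the variable-selection behavior the abstract attributes to the elastic-net penalty; the single matrix factorization outside the loop is what keeps each iteration cheap.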
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.