An Ordered Search for Subset Selection in Support Vector Orthogonal Regression
Paulo Vitor Freitas da Silva, R. F. Neto, Saulo Moraes Villela
Brazilian Conference on Intelligent Systems (BRACIS), October 2019. DOI: 10.1109/BRACIS.2019.00042
Citations: 0
Abstract
Subset selection is an important task in many problems, especially high-dimensional ones such as classification and regression. This work proposes an ordered search to select variables in orthogonal regression problems based on support vectors. The admissible search relies on a monotone property of the radius parameter: the radius of the SV regression is used as the evaluation measure for the search, which allows it to find the subset with the smallest radius at each dimension of the problem without exhaustively exploring all possibilities. Orthogonal regression is chosen mainly because this model also accounts for error in the dependent variables. The obtained results, measured by test error and compared against LASSO and a recursive feature elimination technique, demonstrate the efficiency of the method.
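The abstract describes an ordered (best-first, admissible) search over feature subsets guided by a monotone evaluation measure. The following is a minimal sketch of that general idea, not the paper's actual algorithm: it assumes a hypothetical `radius(subset)` callable that returns the SV orthogonal regression radius of a candidate subset, and that this radius is monotone under feature removal, so the first subset of the target dimension popped from the priority queue is the best one at that dimension.

```python
import heapq
from itertools import count

def ordered_subset_search(features, radius, target_dim):
    """Best-first search over feature subsets ordered by a monotone radius.

    `radius(subset)` is a hypothetical evaluation function (e.g., the radius
    of an SV orthogonal regression fitted on that subset). Under the assumed
    monotone property, the first subset of size `target_dim` removed from the
    priority queue has the smallest radius at that dimension, so the search
    avoids exhaustively enumerating all subsets.
    """
    tie = count()                      # tie-breaker so heap never compares sets
    full = frozenset(features)
    heap = [(radius(full), next(tie), full)]
    visited = {full}
    while heap:
        r, _, subset = heapq.heappop(heap)
        if len(subset) == target_dim:
            return subset, r           # first popped at this size = smallest radius
        for f in subset:               # expand by removing one feature at a time
            child = subset - {f}
            if child and child not in visited:
                visited.add(child)
                heapq.heappush(heap, (radius(child), next(tie), child))
    return None, float("inf")
```

As a usage sketch, `ordered_subset_search(range(X.shape[1]), radius, k)` would return the k-feature subset with the smallest radius, provided the supplied `radius` really satisfies the monotone property the search relies on for admissibility.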