Learning ℓ1-penalized logistic regressions with smooth approximation
J. Klimaszewski, M. Sklyar, M. Korzeń
2017 IEEE International Conference on INnovations in Intelligent SysTems and Applications (INISTA), July 2017
DOI: 10.1109/INISTA.2017.8001144
Citations: 2
Abstract
The paper presents a comparison of logistic regression models learned with different penalty terms. The main part of the paper concerns sparse regression, whose penalty involves the absolute value function. This function is not differentiable at zero, so common gradient-based optimizers cannot be used directly. In the paper we show that in those cases a smooth approximation of the absolute value function can be used effectively, both for lasso regression and for fused-lasso-like models. One of the examples focuses on a two-dimensional analogue of the fused-lasso model. The experimental results compare our implementations (in C++ and Python) on three benchmark datasets.
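The core idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: it replaces the non-smooth ℓ1 penalty λ‖w‖₁ with the smooth surrogate λ·Σ√(wᵢ² + ε), a common smoothing choice (the specific approximation, the function name `fit_l1_logreg_smooth`, and the parameter values are assumptions), and then applies an off-the-shelf smooth optimizer.

```python
# Sketch: l1-penalized logistic regression via a smooth approximation
# of |w| ~ sqrt(w^2 + eps), so that a standard gradient-based optimizer
# (here L-BFGS-B from SciPy) can be used directly.
import numpy as np
from scipy.optimize import minimize


def fit_l1_logreg_smooth(X, y, lam=0.1, eps=1e-6):
    """Fit logistic regression with a smoothed l1 penalty.

    X: (n, d) feature matrix; y: labels in {0, 1}.
    lam: penalty strength; eps: smoothing parameter (assumed values).
    """
    n, d = X.shape

    def objective(w):
        z = X @ w
        # Numerically stable logistic loss: mean of log(1 + exp(z)) - y*z.
        loss = np.mean(np.logaddexp(0.0, z) - y * z)
        # Smooth surrogate for lam * ||w||_1.
        penalty = lam * np.sum(np.sqrt(w ** 2 + eps))
        return loss + penalty

    def grad(w):
        z = X @ w
        p = 1.0 / (1.0 + np.exp(-z))            # sigmoid
        g_loss = X.T @ (p - y) / n
        g_pen = lam * w / np.sqrt(w ** 2 + eps)  # derivative of smooth |w|
        return g_loss + g_pen

    res = minimize(objective, np.zeros(d), jac=grad, method="L-BFGS-B")
    return res.x


# Toy usage: only the first two features are informative, so the smoothed
# l1 penalty should shrink the remaining coefficients toward zero.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
true_w = np.array([2.0, -1.5, 0.0, 0.0, 0.0])
y = (X @ true_w + 0.1 * rng.standard_normal(200) > 0).astype(float)
w = fit_l1_logreg_smooth(X, y, lam=0.1)
```

The smoothing parameter ε controls the trade-off: smaller ε tracks the true ℓ1 penalty more closely (and yields coefficients closer to exact zero), at the cost of a less well-conditioned gradient near the origin.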