{"title":"Convergence rate analysis of smoothed LASSO","authors":"Subhadip Mukherjee, C. Seelamantula","doi":"10.1109/NCC.2016.7561090","DOIUrl":null,"url":null,"abstract":"The LASSO regression has been studied extensively in the statistics and signal processing community, especially in the realm of sparse parameter estimation from linear measurements. We analyze the convergence rate of a first-order method applied on a smooth, strictly convex, and parametric upper bound on the LASSO objective function. The upper bound approaches the true non-smooth objective as the parameter tends to infinity. We show that a gradient-based algorithm, applied to minimize the smooth upper bound, yields a convergence rate of O (1/K), where K denotes the number of iterations performed. The analysis also reveals the optimum value of the parameter that achieves a desired prediction accuracy, provided that the total number of iterations is decided a priori. The convergence rate of the proposed algorithm and the amount of computation required in each iteration are same as that of the iterative soft thresholding technique. However, the proposed algorithm does not involve any thresholding operation. The performance of the proposed technique, referred to as smoothed LASSO, is validated on synthesized signals. We also deploy smoothed LASSO for estimating an image from its blurred and noisy measurement, and compare the performance with the fast iterative shrinkage thresholding algorithm for a fixed run-time budget, in terms of the reconstruction peak signal-to-noise ratio and structural similarity index.","PeriodicalId":279637,"journal":{"name":"2016 Twenty Second National Conference on Communication (NCC)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 Twenty Second National Conference on Communication (NCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NCC.2016.7561090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5
Abstract
LASSO regression has been studied extensively in the statistics and signal processing communities, especially for sparse parameter estimation from linear measurements. We analyze the convergence rate of a first-order method applied to a smooth, strictly convex, parametric upper bound on the LASSO objective function. The upper bound approaches the true non-smooth objective as the parameter tends to infinity. We show that a gradient-based algorithm, applied to minimize the smooth upper bound, attains a convergence rate of O(1/K), where K denotes the number of iterations performed. The analysis also reveals the optimum value of the parameter for achieving a desired prediction accuracy when the total number of iterations is fixed a priori. The convergence rate of the proposed algorithm and the computation required per iteration are the same as those of iterative soft thresholding; however, the proposed algorithm involves no thresholding operation. The performance of the proposed technique, referred to as smoothed LASSO, is validated on synthetic signals. We also deploy smoothed LASSO to estimate an image from its blurred and noisy measurements, and compare its performance with the fast iterative shrinkage-thresholding algorithm (FISTA) under a fixed run-time budget, in terms of reconstruction peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) index.
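To make the idea concrete, the following is a minimal sketch of the general approach the abstract describes: replace the non-smooth penalty λ‖x‖₁ in the LASSO objective ½‖y − Ax‖² + λ‖x‖₁ with a smooth, strictly convex upper bound that tightens as a parameter β → ∞, then run plain gradient descent. The abstract does not specify the paper's exact surrogate; this sketch assumes the log-cosh bound (1/β)·log(2 cosh(βx)) ≥ |x|, which satisfies the stated properties. All names (smooth_abs, smoothed_lasso, beta, lam) are illustrative, not from the paper.

```python
import numpy as np

def smooth_abs(x, beta):
    # Smooth, strictly convex upper bound on |x|:
    #   (1/beta) * log(2*cosh(beta*x)) >= |x|,
    # approaching |x| as beta -> infinity. Computed via
    # logaddexp for numerical stability at large beta*|x|.
    return np.logaddexp(beta * x, -beta * x) / beta

def smooth_abs_grad(x, beta):
    # Derivative of the bound above: tanh(beta * x).
    return np.tanh(beta * x)

def smoothed_lasso(A, y, lam, beta=100.0, K=500):
    """Gradient descent on the smoothed LASSO surrogate
       f_beta(x) = 0.5*||y - A x||^2 + lam * sum_i smooth_abs(x_i, beta).
    No thresholding step is involved, matching the abstract's claim."""
    n = A.shape[1]
    x = np.zeros(n)
    # Constant 1/L step size; the surrogate's Hessian is bounded by
    # ||A||_2^2 + lam*beta, since d^2/dx^2 smooth_abs <= beta.
    L = np.linalg.norm(A, 2) ** 2 + lam * beta
    step = 1.0 / L
    for _ in range(K):
        grad = A.T @ (A @ x - y) + lam * smooth_abs_grad(x, beta)
        x = x - step * grad
    return x

# Toy usage: sparse recovery from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[3, 27, 64]] = [1.5, -2.0, 0.8]
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = smoothed_lasso(A, y, lam=0.05)
```

Note the trade-off the abstract points to: larger beta makes the surrogate a tighter bound on the true LASSO objective but also increases the Lipschitz constant L, shrinking the step size. The paper's analysis of the optimum parameter for a fixed iteration budget K addresses exactly this tension; the fixed beta used above is an arbitrary placeholder, not the paper's optimal choice.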