Subgradient Regularized Multivariate Convex Regression at Scale
Wenyu Chen, Rahul Mazumder
SIAM Journal on Optimization, Volume 34, Issue 3, Pages 2350-2377, September 2024
DOI: 10.1137/21m1413134 (https://doi.org/10.1137/21m1413134)
Citations: 0
Abstract
We present new large-scale algorithms for fitting a subgradient regularized multivariate convex regression function to [math] samples in [math] dimensions, a key problem in shape-constrained nonparametric regression with applications in statistics, engineering, and the applied sciences. The infinite-dimensional learning task can be expressed as a convex quadratic program (QP) with [math] decision variables and [math] constraints. While instances with [math] in the low thousands can be addressed by current algorithms within reasonable runtimes, solving larger problems (e.g., [math] or [math]) is computationally challenging. To this end, we present an active-set-type algorithm on the dual QP. For computational scalability, we allow approximate optimization of the reduced subproblems and propose randomized augmentation rules for expanding the active set. We derive novel computational guarantees for our algorithms. We demonstrate that our framework can approximately solve instances of the subgradient regularized convex regression problem with [math] and [math] within minutes, and that it shows strong computational performance compared to earlier approaches.
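To make the QP structure mentioned in the abstract concrete, the following is a minimal sketch (not the authors' algorithm) of the standard finite-dimensional reformulation of convex regression: one fitted value f_j and one subgradient g_j per sample, linked by pairwise convexity constraints. The helper names `convexity_constraints` and `max_affine` are illustrative, not from the paper; with n samples in d dimensions this gives n(d+1) decision variables and n(n-1) constraints, matching the variable and constraint counts the abstract describes.

```python
import numpy as np

def convexity_constraints(X):
    """List the pairwise constraints f_j + g_j . (x_i - x_j) <= f_i, i != j.

    Each pair (i, j) says the supporting hyperplane at x_j must lie below
    the fitted value at x_i; there are n*(n-1) such constraints.
    """
    n = X.shape[0]
    return [(i, j) for i in range(n) for j in range(n) if i != j]

def max_affine(X_train, f, G, x):
    """Evaluate the fitted convex estimator at a new point x.

    The fit is piecewise linear: max_j [ f_j + g_j . (x - x_j) ].
    """
    return np.max(f + G @ x - np.einsum('jd,jd->j', G, X_train))

# Toy 1-D check: values and subgradients of the convex function x^2
# satisfy every convexity constraint exactly.
X = np.array([[0.0], [1.0], [2.0]])
f = np.array([0.0, 1.0, 4.0])      # f_j = x_j^2
G = np.array([[0.0], [2.0], [4.0]])  # g_j = 2 x_j

pairs = convexity_constraints(X)
feasible = all(f[j] + G[j] @ (X[i] - X[j]) <= f[i] + 1e-9 for i, j in pairs)
```

Solving for (f, G) that minimize the least-squares misfit under these constraints (plus the subgradient regularization term) is exactly the QP whose scale, per the abstract, motivates the dual active-set approach.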
Journal description:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.