Subgradient Regularized Multivariate Convex Regression at Scale

Impact Factor: 2.6 · JCR Q1, Mathematics, Applied
Wenyu Chen, Rahul Mazumder
Citations: 0

Abstract

SIAM Journal on Optimization, Volume 34, Issue 3, Page 2350-2377, September 2024.
We present new large-scale algorithms for fitting a subgradient regularized multivariate convex regression function to [math] samples in [math] dimensions—a key problem in shape constrained nonparametric regression with applications in statistics, engineering, and the applied sciences. The infinite-dimensional learning task can be expressed via a convex quadratic program (QP) with [math] decision variables and [math] constraints. While instances with [math] in the lower thousands can be addressed with current algorithms within reasonable runtimes, solving larger problems (e.g., [math] or [math]) is computationally challenging. To this end, we present an active set type algorithm on the dual QP. For computational scalability, we allow for approximate optimization of the reduced subproblems and propose randomized augmentation rules for expanding the active set. We derive novel computational guarantees for our algorithms. We demonstrate that our framework can approximately solve instances of the subgradient regularized convex regression problem with [math] and [math] within minutes and shows strong computational performance compared to earlier approaches.
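To make the problem concrete, below is a minimal sketch of the convex regression QP described in the abstract, solved on a toy instance with a generic solver (SciPy's SLSQP). This is *not* the paper's algorithm — the paper's contribution is an active-set method on the dual QP that scales to much larger sample sizes — and the names `obj`, `unpack`, and the regularization weight `lam` are illustrative choices, not from the paper. The sketch only shows the primal formulation: fitted values `theta_i` and subgradients `g_i` subject to the pairwise convexity constraints, with a squared-norm penalty on the subgradients as the regularizer.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)
n, d, lam = 20, 2, 1e-3  # tiny instance; the paper targets far larger n
X = rng.normal(size=(n, d))
y = np.sum(X**2, axis=1) + 0.1 * rng.normal(size=n)  # noisy convex ground truth

# Decision vector z = (theta_1..theta_n, g_1..g_n flattened): n*(d+1) variables.
def unpack(z):
    return z[:n], z[n:].reshape(n, d)

def obj(z):
    # Least-squares fit plus a subgradient-norm regularizer (illustrative choice).
    theta, G = unpack(z)
    return 0.5 * np.sum((y - theta) ** 2) + 0.5 * lam * np.sum(G**2)

# Convexity constraints: theta_j >= theta_i + g_i . (x_j - x_i) for all i != j.
# This is the O(n^2)-row constraint matrix that makes large n challenging
# and motivates the paper's active-set approach.
rows = []
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        r = np.zeros(n * (d + 1))
        r[j] += 1.0
        r[i] -= 1.0
        r[n + i * d : n + (i + 1) * d] = -(X[j] - X[i])
        rows.append(r)
A = np.array(rows)
cons = LinearConstraint(A, lb=0.0)

res = minimize(obj, np.zeros(n * (d + 1)), constraints=[cons],
               method="SLSQP", options={"maxiter": 200})
theta_hat, G_hat = unpack(res.x)
print("converged:", res.success, "objective:", res.fun)
```

Even at this toy scale the constraint matrix already has n(n-1) = 380 rows; at the sample sizes the paper targets, forming and solving this QP directly is impractical, which is precisely the gap the proposed dual active-set method addresses.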
Source journal: SIAM Journal on Optimization (Mathematics, Applied)
CiteScore: 5.30
Self-citation rate: 9.70%
Articles per year: 101
Review time: 6-12 weeks
Journal description: The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.