"Relative Continuity" for Non-Lipschitz Nonsmooth Convex Optimization Using Stochastic (or Deterministic) Mirror Descent

Haihao Lu
{"title":"基于随机(或确定性)镜像下降的非lipschitz非光滑凸优化的“相对连续性”","authors":"Haihao Lu","doi":"10.1287/IJOO.2018.0008","DOIUrl":null,"url":null,"abstract":"The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\\cdot)$ is not uniformly Lipschitz continuous -- for example (i) the classical support vector machine (SVM) problem, (ii) the problem of minimizing the maximum of convex quadratic functions, and even (iii) the univariate setting with $f(x) := \\max\\{0, x\\} + x^2$. Herein we develop a notion of \"relative continuity\" that is determined relative to a user-specified \"reference function\" $h(\\cdot)$ (that should be computationally tractable for algorithms), and we show that many non-differentiable convex functions are relatively continuous with respect to a correspondingly fairly-simple reference function $h(\\cdot)$. We also similarly develop a notion of \"relative stochastic continuity\" for the stochastic setting. We analysis two standard algorithms -- the (deterministic) mirror descent algorithm and the stochastic mirror descent algorithm -- for solving optimization problems in these two new settings, and we develop for the first time computational guarantees for instances where the objective function is not uniformly Lipschitz continuous. This paper is a companion paper for non-differentiable convex optimization to the recent paper by Lu, Freund, and Nesterov, which developed similar sorts of results for differentiable convex optimization.","PeriodicalId":73382,"journal":{"name":"INFORMS journal on optimization","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2017-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1287/IJOO.2018.0008","citationCount":"56","resultStr":"{\"title\":\"“Relative Continuity” for Non-Lipschitz Nonsmooth Convex Optimization Using Stochastic (or Deterministic) Mirror Descent\",\"authors\":\"Haihao Lu\",\"doi\":\"10.1287/IJOO.2018.0008\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\\\\cdot)$ is not uniformly Lipschitz continuous -- for example (i) the classical support vector machine (SVM) problem, (ii) the problem of minimizing the maximum of convex quadratic functions, and even (iii) the univariate setting with $f(x) := \\\\max\\\\{0, x\\\\} + x^2$. Herein we develop a notion of \\\"relative continuity\\\" that is determined relative to a user-specified \\\"reference function\\\" $h(\\\\cdot)$ (that should be computationally tractable for algorithms), and we show that many non-differentiable convex functions are relatively continuous with respect to a correspondingly fairly-simple reference function $h(\\\\cdot)$. We also similarly develop a notion of \\\"relative stochastic continuity\\\" for the stochastic setting. 
We analysis two standard algorithms -- the (deterministic) mirror descent algorithm and the stochastic mirror descent algorithm -- for solving optimization problems in these two new settings, and we develop for the first time computational guarantees for instances where the objective function is not uniformly Lipschitz continuous. This paper is a companion paper for non-differentiable convex optimization to the recent paper by Lu, Freund, and Nesterov, which developed similar sorts of results for differentiable convex optimization.\",\"PeriodicalId\":73382,\"journal\":{\"name\":\"INFORMS journal on optimization\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1287/IJOO.2018.0008\",\"citationCount\":\"56\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"INFORMS journal on optimization\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1287/IJOO.2018.0008\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"INFORMS journal on optimization","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1287/IJOO.2018.0008","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 56

Abstract

The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\cdot)$ is not uniformly Lipschitz continuous -- for example (i) the classical support vector machine (SVM) problem, (ii) the problem of minimizing the maximum of convex quadratic functions, and even (iii) the univariate setting with $f(x) := \max\{0, x\} + x^2$. Herein we develop a notion of "relative continuity" that is determined relative to a user-specified "reference function" $h(\cdot)$ (which should be computationally tractable for algorithms), and we show that many non-differentiable convex functions are relatively continuous with respect to a correspondingly fairly simple reference function $h(\cdot)$. We also develop an analogous notion of "relative stochastic continuity" for the stochastic setting. We analyze two standard algorithms -- the (deterministic) mirror descent algorithm and the stochastic mirror descent algorithm -- for solving optimization problems in these two new settings, and we develop, for the first time, computational guarantees for instances where the objective function is not uniformly Lipschitz continuous. This paper is a companion, for non-differentiable convex optimization, to the recent paper by Lu, Freund, and Nesterov, which developed similar results for differentiable convex optimization.
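To make the deterministic algorithm concrete, the sketch below implements generic mirror descent, whose update is $x_{k+1} = \arg\min_x \{ t_k \langle g_k, x \rangle + D_h(x, x_k) \}$, where $g_k \in \partial f(x_k)$ and $D_h$ is the Bregman divergence of the reference function $h(\cdot)$. This is a minimal illustrative sketch, not the paper's implementation: the function names, the step-size schedule, and the choice $h(x) = \tfrac{1}{2}x^2$ (under which the update reduces to an ordinary subgradient step) are all assumptions made here for illustration.

```python
# Minimal mirror-descent sketch (illustrative; not taken from the paper).
# Update rule: x_{k+1} = argmin_x { t_k*<g_k, x> + D_h(x, x_k) },
# where D_h is the Bregman divergence of the reference function h.

def mirror_descent(subgrad, bregman_step, x0, step_sizes):
    """Run mirror descent from x0 with the given step-size schedule.

    subgrad(x)            -- returns some g in the subdifferential of f at x
    bregman_step(x, g, t) -- solves the Bregman proximal subproblem for h
    """
    x = x0
    for t in step_sizes:
        x = bregman_step(x, subgrad(x), t)
    return x

def euclidean_step(x, g, t):
    # With h(x) = (1/2)*x**2 we have D_h(y, x) = (1/2)*(y - x)**2, and the
    # subproblem's closed-form solution is an ordinary subgradient step.
    return x - t * g

def subgrad_f(x):
    # A valid subgradient of the abstract's example f(x) = max{0, x} + x^2,
    # which is convex but has no uniform Lipschitz constant on the real line
    # (its subgradients grow without bound as |x| grows).
    return (1.0 if x > 0 else 0.0) + 2.0 * x

x_final = mirror_descent(subgrad_f, euclidean_step, x0=5.0,
                         step_sizes=[0.5 / (k + 1) for k in range(500)])
print(x_final)  # approaches the minimizer x* = 0, since 0 is in the subdifferential of f at 0
```

Swapping `euclidean_step` for a step built from a different reference function $h(\cdot)$, one adapted to the growth of $f$, is precisely where the paper's relative-continuity analysis enters; the Euclidean choice above is only the simplest instance.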