Latest Publications in SIAM Journal on Optimization

Exact Quantization of Multistage Stochastic Linear Problems
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-02-05. DOI: 10.1137/22m1508005
Maël Forcier, Stéphane Gaubert, Vincent Leclère
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 533-562, March 2024.
Abstract. We show that the multistage stochastic linear problem (MSLP) with an arbitrary cost distribution is equivalent to an MSLP on a finite scenario tree. We establish this exact quantization result by analyzing the polyhedral structure of MSLPs. In particular, we show that the expected cost-to-go functions are polyhedral and affine on the cells of a chamber complex, which is independent of the cost distribution. This leads to new complexity results, showing that MSLP becomes polynomial-time solvable when certain parameters are fixed. (A toy sketch of the finite-scenario equivalent follows this entry.)
Citations: 0
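The headline equivalence means that, once a finite scenario tree is available, the MSLP reduces to one ordinary LP (its deterministic equivalent). Below is a minimal sketch of that reduction for a two-stage toy problem with invented costs, demands, and probabilities, solved with scipy; it illustrates the finite-tree form only, not the paper's chamber-complex construction.

```python
import numpy as np
from scipy.optimize import linprog

probs = np.array([0.3, 0.5, 0.2])    # scenario probabilities (invented)
demand = np.array([1.0, 2.0, 3.0])   # scenario demands (invented)
c1, c2 = 1.0, 2.0                    # first-stage and recourse unit costs

# Decision vector z = [x, y_1, y_2, y_3]: order x now, buy y_s in scenario s.
# Objective: c1*x + sum_s probs[s] * c2 * y_s.
c = np.concatenate(([c1], c2 * probs))

# Coupling constraints x + y_s >= d_s, written as -x - y_s <= -d_s.
A_ub = np.zeros((3, 4))
A_ub[:, 0] = -1.0
A_ub[np.arange(3), np.arange(1, 4)] = -1.0
b_ub = -demand

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("first-stage decision x* =", res.x[0], " optimal cost =", res.fun)
```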
Hybrid Algorithms for Finding a D-Stationary Point of a Class of Structured Nonsmooth DC Minimization
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-02-01. DOI: 10.1137/21m1457709
Zhe Sun, Lei Wu
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 485-506, March 2024.
Abstract. In this paper, we consider a class of structured nonsmooth difference-of-convex (DC) minimization problems in which the first convex component is the sum of a smooth and a nonsmooth function, while the second convex component is the supremum of finitely many smooth convex functions. Existing methods for this problem typically have weak convergence guarantees or must solve many subproblems per iteration. We therefore propose hybrid algorithms that first compute approximate critical points and then check whether these points are approximately D-stationary. Under suitable conditions, we prove that there exists a subsequence of iterates every accumulation point of which is a D-stationary point. Preliminary numerical experiments demonstrate the efficiency of the proposed algorithms. (A minimal DCA-style iteration for this problem class is sketched below.)
Citations: 0
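As context for the problem class, here is a minimal classical DCA-style iteration on an invented instance matching the abstract's template: the first convex component is 0.5*||x - a||^2 + lam*||x||_1 (smooth plus nonsmooth), the second is h(x) = max_i b_i^T x (supremum of finitely many smooth convex functions). This baseline only reaches a critical point; the paper's hybrid step of certifying approximate D-stationarity is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, lam = 5, 3, 0.1
a = rng.normal(size=n)
B = rng.normal(size=(m, n))  # rows b_i of the concave part h(x) = max_i b_i^T x

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(n)
for k in range(100):
    v = B[np.argmax(B @ x)]  # subgradient of h at x: gradient of an active piece
    # Convexified subproblem: min_x 0.5*||x - a||^2 + lam*||x||_1 - v^T x,
    # whose closed-form solution is the soft-thresholding of a + v.
    x_new = soft_threshold(a + v, lam)
    if np.linalg.norm(x_new - x) < 1e-10:
        break
    x = x_new

obj = 0.5 * np.sum((x - a) ** 2) + lam * np.abs(x).sum() - np.max(B @ x)
print("critical point:", np.round(x, 4), " objective:", round(obj, 4))
```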
Shortest Paths in Graphs of Convex Sets
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-02-01. DOI: 10.1137/22m1523790
Tobia Marcucci, Jack Umenberger, Pablo Parrilo, Russ Tedrake
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 507-532, March 2024.
Abstract. Given a graph, the shortest-path problem requires finding a sequence of edges with minimum cumulative length that connects a source vertex to a target vertex. We consider a variant of this classical problem in which the position of each vertex in the graph is a continuous decision variable constrained to a convex set, and the length of an edge is a convex function of the positions of its endpoints. Problems of this form arise naturally in many areas, from motion planning of autonomous vehicles to optimal control of hybrid systems. The price for such wide applicability is the complexity of this problem, which is easily seen to be NP-hard. Our main contribution is a strong and lightweight mixed-integer convex formulation, based on perspective operators, that makes it possible to efficiently find globally optimal paths in large graphs and in high-dimensional spaces. (A brute-force baseline for a tiny instance is sketched below.)
Citations: 0
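To make the problem concrete, the following brute-force baseline enumerates the s-t paths of a tiny invented instance (vertices constrained to boxes in R^2, edge length the Euclidean distance between endpoints) and solves one convex program per path with cvxpy. The paper's mixed-integer convex formulation avoids exactly this exponential enumeration.

```python
import cvxpy as cp

# Axis-aligned boxes (lo, hi) in R^2, one per vertex; all data invented.
boxes = {0: ([0, 0], [1, 1]), 1: ([2, 2], [3, 3]),
         2: ([2, -1], [3, 0]), 3: ([4, 0], [5, 1])}
paths = [[0, 1, 3], [0, 2, 3], [0, 1, 2, 3], [0, 2, 1, 3]]  # simple 0 -> 3 paths

best_len, best_path = float("inf"), None
for path in paths:
    pts = {v: cp.Variable(2) for v in path}
    cons = [pts[v] >= boxes[v][0] for v in path]
    cons += [pts[v] <= boxes[v][1] for v in path]
    # Edge length: Euclidean distance between the chosen endpoint positions.
    length = sum(cp.norm(pts[u] - pts[v]) for u, v in zip(path, path[1:]))
    prob = cp.Problem(cp.Minimize(length), cons)
    prob.solve()
    if prob.value < best_len:
        best_len, best_path = prob.value, path

print("shortest path:", best_path, " length:", round(best_len, 3))
```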
Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-31. DOI: 10.1137/22m1510467
David Applegate, Mateo Díaz, Haihao Lu, Miles Lubin
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 459-484, March 2024.
Abstract. We study the problem of detecting infeasibility of large-scale linear programming problems using the primal-dual hybrid gradient (PDHG) method of Chambolle and Pock [J. Math. Imaging Vision, 40 (2011), pp. 120–145]. The literature on PDHG has focused chiefly on problems with at least one optimal solution. We show that when the problem is infeasible or unbounded, the iterates diverge at a controlled rate toward a well-defined ray, and the direction of this ray recovers infeasibility certificates. Based on this fact, we propose a simple way to extract approximate infeasibility certificates from the iterates of PDHG. We study three sequences that converge to certificates: the difference of iterates, the normalized iterates, and the normalized average. All of them are easy to compute and suitable for large-scale problems. We show that the normalized iterates and normalized averages achieve a convergence rate of O(1/k). This rate is general and applies to any fixed-point iteration of a nonexpansive operator, so it is a result of independent interest that goes well beyond our setting. Finally, we show that, under nondegeneracy assumptions, the iterates of PDHG identify the active set of an auxiliary feasible problem in finite time, which ensures that the difference of iterates exhibits eventual linear convergence. These results provide a theoretical justification for infeasibility detection in the newly developed linear programming solver PDLP. (A numpy sketch of the certificate sequences follows this entry.)
Citations: 0
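A small numpy sketch of the certificate-extraction idea: PDHG applied to an invented, clearly infeasible equality-form LP, with the three sequences from the abstract computed at the end. Step sizes follow the usual tau*sigma*||A||^2 < 1 rule; this is an illustration, not the PDLP implementation.

```python
import numpy as np

A = np.array([[1.0, 0.0],    # both rows fix x1, to inconsistent values:
              [1.0, 0.0]])   # {x : Ax = b, x >= 0} is empty
b = np.array([1.0, 2.0])
c = np.array([1.0, 1.0])

step = 0.9 / np.linalg.norm(A, 2)   # tau = sigma with tau*sigma*||A||^2 < 1
x, y = np.zeros(2), np.zeros(2)
y_sum = np.zeros(2)

for k in range(1, 10001):
    x_prev, y_prev = x, y
    x = np.maximum(x - step * (c - A.T @ y), 0.0)    # projected primal step
    y = y + step * (b - A @ (2 * x - x_prev))        # dual step on extrapolation
    y_sum += y

# Three certificate candidates from the abstract; all should align with a dual
# ray v satisfying A^T v <= 0 and b^T v > 0, certifying primal infeasibility.
v_diff = y - y_prev          # difference of iterates
v_norm = y / k               # normalized iterate (y_0 = 0 here)
v_avg = 2.0 * y_sum / k**2   # normalized average
for name, v in [("diff", v_diff), ("norm", v_norm), ("avg", v_avg)]:
    v = v / np.linalg.norm(v)
    print(name, "direction:", np.round(v, 3),
          " A^T v:", np.round(A.T @ v, 3), " b^T v:", round(b @ v, 3))
```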
Distributionally Favorable Optimization: A Framework for Data-Driven Decision-Making with Endogenous Outliers
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-29. DOI: 10.1137/22m1528094
Nan Jiang, Weijun Xie
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 419-458, March 2024.
Abstract. A typical data-driven stochastic program seeks the best decision that minimizes the sum of a deterministic cost function and an expected recourse function under a given distribution. Much success has recently been witnessed in the development of distributionally robust optimization (DRO), which considers the worst-case expected recourse function under the least favorable probability distribution from a distributional family. However, in the presence of endogenous outliers, whose recourse function values can be very large or even infinite, the commonly used DRO framework alone tends to overemphasize these outliers and cause undesirable or even infeasible decisions. By contrast, distributionally favorable optimization (DFO), which concerns the best-case expected recourse function under the most favorable distribution from the distributional family, can serve as a proper measure of the stochastic recourse function and mitigate the effect of endogenous outliers. We show that DFO recovers many robust statistics, suggesting that the DFO framework may be appropriate for the stochastic recourse function in the presence of endogenous outliers. A notion of decision outlier robustness is proposed for selecting a DFO framework for data-driven optimization with outliers. We also provide a unified way to integrate DRO with DFO, where DRO addresses out-of-sample performance and DFO properly handles the stochastic recourse function under endogenous outliers. We further extend the proposed DFO framework to solve two-stage stochastic programs without relatively complete recourse. A numerical study demonstrates that the framework is promising. (A toy DRO-versus-DFO contrast is sketched below.)
Citations: 0
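The DRO-versus-DFO contrast can be seen on a four-scenario toy with one huge outlier recourse value: over a total-variation ball around the empirical weights, the worst-case expectation (DRO) loads mass onto the outlier while the best-case expectation (DFO) removes it. Both inner problems are linear in the probability vector, so scipy's linprog suffices; the radius and data are invented.

```python
import numpy as np
from scipy.optimize import linprog

Q = np.array([1.0, 2.0, 1.5, 1e6])   # recourse values; the last is an outlier
p_hat = np.full(4, 0.25)             # empirical distribution
rho = 0.2                            # TV radius: sum |p - p_hat| <= 2*rho

# Variables z = [p, t] in R^8 with t >= |p - p_hat|, linearized as
#   p - t <= p_hat,  -p - t <= -p_hat,  sum(t) <= 2*rho,  sum(p) = 1,  z >= 0.
A_ub = np.block([[ np.eye(4), -np.eye(4)],
                 [-np.eye(4), -np.eye(4)],
                 [np.zeros((1, 4)), np.ones((1, 4))]])
b_ub = np.concatenate([p_hat, -p_hat, [2 * rho]])
A_eq = np.concatenate([np.ones(4), np.zeros(4)]).reshape(1, -1)

def expected_Q(sign):
    # sign=+1 minimizes E_p[Q] over the ball (DFO); sign=-1 maximizes it (DRO).
    c = np.concatenate([sign * Q, np.zeros(4)])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * 8)
    return sign * res.fun

print("empirical E[Q]:", p_hat @ Q)
print("DFO (favorable):", expected_Q(+1), "  DRO (robust):", expected_Q(-1))
```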
Bayesian Stochastic Gradient Descent for Stochastic Optimization with Streaming Input Data
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-25. DOI: 10.1137/22m1478951
Tianyi Liu, Yifan Lin, Enlu Zhou
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 389-418, March 2024.
Abstract. We consider stochastic optimization under distributional uncertainty, where the unknown distributional parameter is estimated from streaming data that arrive sequentially over time. Moreover, the data may depend on the decision in effect at the time they are generated. For both decision-independent and decision-dependent uncertainties, we propose an approach that jointly estimates the distributional parameter via the Bayesian posterior distribution and updates the decision by applying stochastic gradient descent (SGD) to the Bayesian average of the objective function. Our approach converges asymptotically over time and achieves the convergence rates of classical SGD in the decision-independent case. We demonstrate the empirical performance of our approach on both synthetic test problems and a classical newsvendor problem. (A newsvendor sketch of this scheme follows this entry.)
Citations: 0
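A sketch of the decision-independent case on a newsvendor toy, under assumptions not taken from the paper: demand is Exponential(theta) with a conjugate Gamma prior, the posterior is updated as each observation streams in, and the order level takes an SGD step on a posterior-sampled gradient estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 0.5        # true demand rate, so mean demand is 2 (invented)
h, b = 1.0, 3.0         # holding and backlog unit costs (invented)
alpha, beta = 1.0, 1.0  # Gamma(alpha, rate=beta) prior on theta

x = 1.0                 # initial order quantity
for t in range(1, 2001):
    d = rng.exponential(1.0 / theta_true)            # new streaming observation
    alpha, beta = alpha + 1.0, beta + d              # conjugate posterior update
    thetas = rng.gamma(alpha, 1.0 / beta, size=32)   # sample theta ~ posterior
    demands = rng.exponential(1.0 / thetas)          # sample demand | theta
    grad = np.mean(np.where(x > demands, h, -b))     # posterior-averaged subgradient
    x = max(x - grad / t, 0.0)                       # SGD step with 1/t step size

x_true = np.log(1 + b / h) / theta_true              # optimal newsvendor level
print("final x:", round(x, 3), " true optimum:", round(x_true, 3))
```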
Descent Properties of an Anderson Accelerated Gradient Method with Restarting
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-19. DOI: 10.1137/22m151460x
Wenqing Ouyang, Yang Liu, Andre Milzarek
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 336-365, March 2024.
Abstract. Anderson acceleration (AA) is a popular technique for enhancing the convergence of fixed-point schemes. The analysis of AA approaches often focuses on the convergence behavior of a corresponding fixed-point residual, while the behavior of the underlying objective function values along the accelerated iterates is currently not well understood. In this paper, we investigate local properties of AA with restarting applied to a basic gradient scheme, in terms of function values. Specifically, we show that AA with restarting is a local descent method and that it can decrease the objective function at a rate no slower than the gradient method, up to higher-order error terms. These new results theoretically support the good numerical performance of AA when heuristic descent conditions are used for globalization, and they provide a novel perspective on the convergence analysis of AA that is more amenable to nonconvex optimization problems. Numerical experiments illustrate our theoretical findings. (A compact restarted-AA implementation is sketched below.)
Citations: 0
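A compact restarted Anderson-accelerated gradient method on an invented ill-conditioned quadratic: the fixed-point map is a gradient step, the AA candidate is kept only if it passes a heuristic descent check against the plain gradient step, and otherwise the memory is flushed. The memory size, test problem, and acceptance rule are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 20, 5                           # dimension and AA memory (invented)
M = rng.normal(size=(n, n))
H = M @ M.T + 0.1 * np.eye(n)          # ill-conditioned positive definite Hessian
b = rng.normal(size=n)
f = lambda x: 0.5 * x @ H @ x - b @ x
grad = lambda x: H @ x - b
alpha = 1.0 / np.linalg.norm(H, 2)     # gradient step size 1/L
g = lambda x: x - alpha * grad(x)      # fixed-point map of the gradient scheme

x = np.zeros(n)
X, G = [x], [g(x)]
for k in range(300):
    gd = G[-1]                                               # plain gradient step
    R = np.column_stack([gi - xi for gi, xi in zip(G, X)])   # residual history
    if R.shape[1] > 1:
        gamma, *_ = np.linalg.lstsq(np.diff(R, axis=1), R[:, -1], rcond=None)
        x_aa = gd - np.diff(np.column_stack(G), axis=1) @ gamma  # AA candidate
    else:
        x_aa = gd
    if f(x_aa) <= f(gd):               # heuristic descent check against GD step
        x = x_aa
    else:
        x, X, G = gd, [], []           # reject and restart: flush the memory
    X.append(x); G.append(g(x))
    X, G = X[-(m + 1):], G[-(m + 1):]  # keep at most m+1 history pairs

print("final gradient norm:", np.linalg.norm(grad(x)))
```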
Basic Convex Analysis in Metric Spaces with Bounded Curvature
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-19. DOI: 10.1137/23m1551389
Adrian S. Lewis, Genaro López-Acedo, Adriana Nicolae
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 366-388, March 2024.
Abstract. Differentiable structure ensures that many of the basics of classical convex analysis extend naturally from Euclidean space to Riemannian manifolds. Without such structure, however, extensions are more challenging. Nonetheless, in Alexandrov spaces with curvature bounded above (but possibly positive), we develop several basic building blocks. We define subgradients via projection and the normal cone, prove their existence, and relate them to the classical affine minorant property. Then, in what amounts to a simple calculus or duality result, we develop a necessary optimality condition for minimizing the sum of two convex functions. (The classical affine minorant property being generalized is recalled below.)
Citations: 0
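For reference, the classical affine minorant property mentioned in the abstract, stated in its familiar Euclidean form (the paper relates its projection- and normal-cone-based subgradients to a suitable analogue of this):

```latex
% Classical (Euclidean) affine minorant property: v is a subgradient of a
% convex function f at x exactly when the affine function
% y -> f(x) + <v, y - x> minorizes f everywhere.
\[
  v \in \partial f(x)
  \quad\Longleftrightarrow\quad
  f(y) \,\ge\, f(x) + \langle v,\; y - x \rangle
  \quad\text{for all } y .
\]
```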
A Decomposition Algorithm for Two-Stage Stochastic Programs with Nonconvex Recourse Functions
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-19. DOI: 10.1137/22m1488533
Hanyang Li, Ying Cui
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 306-335, March 2024.
Abstract. In this paper, we study a decomposition method for solving a class of nonconvex two-stage stochastic programs in which both the objective and the constraints of the second-stage problem are nonlinearly parameterized by the first-stage variables. Because the resulting nonconvex recourse function fails to be Clarke regular, classical decomposition approaches such as Benders decomposition and (augmented) Lagrangian-based algorithms cannot be directly generalized to solve such models. By exploiting an implicitly convex-concave structure of the recourse function, we introduce a novel decomposition framework based on the so-called partial Moreau envelope. The algorithm successively generates strongly convex quadratic approximations of the recourse function, based on the solutions of the second-stage convex subproblems, and adds them to the first-stage master problem. Convergence is established both for a fixed number of scenarios and for a sequential internal sampling strategy. Numerical experiments demonstrate the effectiveness of the proposed algorithm. (For orientation, the classical convex-case Benders cut loop is sketched below.)
Citations: 0
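For orientation, here is the classical convex-case Benders (L-shaped) cut loop that, as the abstract notes, does not directly generalize to nonconvex recourse: at each iterate the second-stage value and a subgradient yield an affine cut for the master problem. The two-stage toy data are invented; the paper's partial-Moreau framework replaces these affine cuts with strongly convex quadratic approximations.

```python
import numpy as np
from scipy.optimize import linprog

probs = np.array([0.3, 0.5, 0.2])    # scenario probabilities (invented)
demand = np.array([1.0, 2.0, 3.0])   # scenario demands (invented)
# Recourse Q_s(x) = min{2y : y >= d_s - x, y >= 0} = 2*max(d_s - x, 0),
# with subgradient -2 where d_s > x and 0 otherwise.

cuts = []   # aggregate cuts: theta >= slope*x + intercept
x = 0.0
for it in range(20):
    val = probs @ (2.0 * np.maximum(demand - x, 0.0))
    slope = probs @ np.where(x < demand, -2.0, 0.0)
    cuts.append((slope, val - slope * x))
    # Master over (x, theta): min x + theta subject to all cuts, 0 <= x <= 5.
    A_ub = np.array([[s, -1.0] for s, _ in cuts])   # slope*x - theta <= -intercept
    b_ub = np.array([-i for _, i in cuts])
    res = linprog([1.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, 5), (None, None)])
    x, theta = res.x
    if probs @ (2.0 * np.maximum(demand - x, 0.0)) <= theta + 1e-8:
        break   # cutting model matches the true recourse value: optimal

print("Benders solution x* =", round(x, 4), "after", it + 1, "iterations")
```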
Sharper Bounds for Proximal Gradient Algorithms with Errors
IF 3.1, Q1 (Mathematics)
SIAM Journal on Optimization. Pub Date: 2024-01-19. DOI: 10.1137/22m1480161
Anis Hamadouche, Yun Wu, Andrew M. Wallace, João F. C. Mota
SIAM Journal on Optimization, Volume 34, Issue 1, Pages 278-305, March 2024.
Abstract. We analyze the convergence of the proximal gradient algorithm for convex composite problems in the presence of gradient and proximal computational inaccuracies. We generalize the deterministic analysis to the quasi-Fejér case and quantify the uncertainty incurred by approximate computing and early-termination errors. We propose new, tighter probabilistic bounds, which we use to verify a simulated Model Predictive Control (MPC) problem with sparse controls solved with early termination, reduced precision, and proximal errors. We also show that the probabilistic bounds are more suitable than the deterministic ones for algorithm verification and more accurate for application performance guarantees. Under mild statistical assumptions, we prove that certain cumulative error terms follow a martingale property. Consistent with observations reported, e.g., in [M. Schmidt, N. Le Roux, and F. R. Bach, Convergence rates of inexact proximal-gradient methods for convex optimization, in Advances in Neural Information Processing Systems, 2011, pp. 1458–1466], we also show how accelerating the algorithm amplifies the gradient and proximal computational errors. (A small exact-versus-inexact ISTA experiment follows this entry.)
Citations: 0
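A small experiment in the abstract's spirit: ISTA on an invented Lasso instance, run once with exact gradients and once with additive gradient error injected at every iteration, making the inexactness penalty visible in the final objective. The error model and noise scale are assumptions; the paper's contribution is the sharper probabilistic bounds on such gaps, not the algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, lam = 40, 60, 0.1
A = rng.normal(size=(m, n))
b = rng.normal(size=m)
L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the smooth gradient
F = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.abs(x).sum()

def ista(noise, iters=500):
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - b) + noise * rng.normal(size=n)     # (in)exact gradient
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox of lam*||.||_1
    return F(x)

print("exact gradients:  F =", round(ista(0.0), 4))
print("noisy gradients:  F =", round(ista(0.05), 4))
```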