Latest Articles in Computational Optimization and Applications

Enhancements of discretization approaches for non-convex mixed-integer quadratically constrained quadratic programming: Part I
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2024-01-30 DOI: 10.1007/s10589-023-00543-7
Benjamin Beach, Robert Burlacu, Andreas Bärmann, Lukas Hager, Robert Hildebrand
Abstract: We study mixed-integer programming (MIP) relaxation techniques for the solution of non-convex mixed-integer quadratically constrained quadratic programs (MIQCQPs). We present MIP relaxation methods for non-convex continuous variable products. In this paper, we consider MIP relaxations based on separable reformulation. The main focus is the introduction of the enhanced separable MIP relaxation for non-convex quadratic products of the form z = xy, called hybrid separable (HybS). Additionally, we introduce a logarithmic MIP relaxation for univariate quadratic terms, called sawtooth relaxation, based on Beach (J Glob Optim 84:869–912, 2022). We combine the latter with HybS and existing separable reformulations to derive MIP relaxations of MIQCQPs. We provide a comprehensive theoretical analysis of these techniques, underlining the theoretical advantages of HybS compared to its predecessors. We perform a broad computational study to demonstrate the effectiveness of the enhanced MIP relaxation in terms of producing tight dual bounds for MIQCQPs. In Part II, we study MIP relaxations that extend the normalized multiparametric disaggregation technique (NMDT) (Castro in J Glob Optim 64:765–784, 2015) and present a computational study which also includes the MIP relaxations from this work and compares them with state-of-the-art MIQCQP solvers.
Citations: 0
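The separable reformulations this line of work builds on replace each bilinear term z = xy by a difference of univariate squares, so that only one-dimensional quadratic functions remain to be relaxed. A minimal sketch of the standard identity (the HybS construction in the paper is more involved; the function name here is illustrative):

```python
def separable_product(x: float, y: float) -> float:
    """Separable rewriting of a bilinear term: x*y is expressed through
    univariate squares only, x*y = ((x + y)**2 - (x - y)**2) / 4, so a
    MIP relaxation needs piecewise estimators for one-dimensional
    quadratics u**2 instead of the two-dimensional product x*y."""
    return ((x + y) ** 2 - (x - y) ** 2) / 4.0
```

With this identity, any technique for relaxing a univariate square (e.g. the sawtooth relaxation mentioned in the abstract) immediately yields a relaxation of the product term.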
Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2024-01-24 DOI: 10.1007/s10589-023-00548-2
Qingjie Hu, Liping Zhu, Yu Chen
Abstract: Recently, Gonçalves and Prudente proposed an extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization (Comput Optim Appl 76:889–916, 2020). They initially demonstrated that directly extending the Hager–Zhang method to vector optimization may not yield descent in the vector sense, even when an exact line search is employed. By utilizing a sufficiently accurate line search, they subsequently introduced a self-adjusting Hager–Zhang conjugate gradient method in the vector sense. The global convergence of this new scheme was proven without requiring regular restarts or any convexity assumptions. In this paper, we propose an alternative extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization that preserves its desirable scalar property, i.e., it ensures sufficient descent without relying on any line search or convexity assumption. Furthermore, we investigate its global convergence with the Wolfe line search under mild assumptions. Finally, numerical experiments are presented to illustrate the practical behavior of the proposed method.
Citations: 0
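For scalar-valued objectives, the underlying Hager–Zhang scheme updates the search direction as d = -g + beta*d_old with the CG_DESCENT parameter beta. The sketch below shows the classical scalar method, not the vector-optimization extension of the paper, on an assumed toy quadratic with an exact line search; the problem data and function names are illustrative:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Assumed toy problem: f(x) = 0.5 x^T A x - b^T x with A symmetric
# positive definite, so the unique minimizer solves A x = b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]

def grad(x):
    return [dot(row, x) - bi for row, bi in zip(A, b)]

def hz_beta(g_new, g_old, d):
    """Hager-Zhang conjugate gradient parameter:
    beta = (y - 2 d ||y||^2 / d^T y)^T g_new / d^T y,  y = g_new - g_old."""
    y = [gn - go for gn, go in zip(g_new, g_old)]
    dy = dot(d, y)
    v = [yi - 2.0 * di * dot(y, y) / dy for yi, di in zip(y, d)]
    return dot(v, g_new) / dy

def hz_cg(x, iters=20):
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        Ad = [dot(row, d) for row in A]
        alpha = -dot(g, d) / dot(d, Ad)   # exact line search on the quadratic
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if dot(g_new, g_new) < 1e-24:     # gradient numerically zero
            return x
        d = [-gi + hz_beta(g_new, g, d) * di for gi, di in zip(g_new, d)]
        g = g_new
    return x
```

On a quadratic with exact line search the method terminates in at most n steps; the papers above are concerned with the much harder vector-valued setting and inexact (Wolfe-type) line searches.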
A family of Barzilai-Borwein steplengths from the viewpoint of scaled total least squares
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2024-01-18 DOI: 10.1007/s10589-023-00546-4
Shiru Li, Tao Zhang, Yong Xia
Abstract: The Barzilai-Borwein (BB) steplengths play an important role in practical gradient methods for solving unconstrained optimization problems. Motivated by the observation that the two well-known BB steplengths correspond to the ordinary and the data least-squares problems, respectively, we introduce a novel family of BB steplengths from the viewpoint of scaled total least squares. Numerical experiments demonstrate that high performance can be achieved with a carefully selected BB steplength from the new family.
Citations: 0
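The two classical steplengths the abstract refers to are α_BB1 = sᵀs / sᵀy and α_BB2 = sᵀy / yᵀy, with s = x_k − x_{k−1} and y = g_k − g_{k−1}. A sketch of a BB gradient method on an assumed toy quadratic (the paper's scaled-total-least-squares family interpolates beyond these two classical choices and is not reproduced here):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Assumed toy problem: f(x) = 0.5 x^T A x - b^T x with minimizer (1, 1).
A = [[2.0, 0.0], [0.0, 10.0]]
b = [2.0, 10.0]

def grad(x):
    return [dot(row, x) - bi for row, bi in zip(A, b)]

def bb1(s, y):
    return dot(s, s) / dot(s, y)   # ordinary least-squares steplength

def bb2(s, y):
    return dot(s, y) / dot(y, y)   # data least-squares steplength

def bb_gradient_method(x, steplength=bb1, iters=100):
    g = grad(x)
    alpha = 0.1                    # conservative first step
    for _ in range(iters):
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - c for a, c in zip(x_new, x)]
        y = [a - c for a, c in zip(g_new, g)]
        x, g = x_new, g_new
        if dot(g, g) < 1e-24:      # gradient numerically zero
            break
        alpha = steplength(s, y)   # BB step from the last displacement
    return x
```

Note the characteristic nonmonotone behavior: the BB step uses only the most recent pair (s, y) and requires no line search on well-conditioned quadratics.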
Internet traffic tensor completion with tensor nuclear norm
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-12-21 DOI: 10.1007/s10589-023-00545-5
Can Li, Yannan Chen, Dong-hui Li
Citations: 0
Correction to: The continuous stochastic gradient method: part II–application and numerics
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-12-13 DOI: 10.1007/s10589-023-00544-6
Max Grieshammer, Lukas Pflug, Michael Stingl, Andrian Uihlein
Citations: 0
A Bregman–Kaczmarz method for nonlinear systems of equations
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-12-07 DOI: 10.1007/s10589-023-00541-9
Robert Gower, Dirk A. Lorenz, Maximilian Winkler
Abstract: We propose a new randomized method for solving systems of nonlinear equations, which can find sparse solutions or solutions under certain simple constraints. The scheme only takes gradients of component functions and uses Bregman projections onto the solution space of a Newton equation. In the special case of Euclidean projections, the method is known as the nonlinear Kaczmarz method. Furthermore, if the component functions are nonnegative, we are in the setting of optimization under the interpolation assumption, and the method reduces to SGD with the recently proposed stochastic Polyak step size. For general Bregman projections, our method is a stochastic mirror descent with a novel adaptive step size. We prove that in the convex setting each iteration of our method results in a smaller Bregman distance to exact solutions than the standard Polyak step. Our generalization to Bregman projections comes at the price that a convex one-dimensional optimization problem needs to be solved in each iteration, which can typically be done with globalized Newton iterations. Convergence is proved in two classical settings of nonlinearity: for convex nonnegative functions and, locally, for functions that fulfill the tangential cone condition. Finally, we show examples in which the proposed method outperforms similar methods with the same memory requirements.
Citations: 0
The continuous stochastic gradient method: part II–application and numerics
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-11-24 DOI: 10.1007/s10589-023-00540-w
Max Grieshammer, Lukas Pflug, Michael Stingl, Andrian Uihlein
Abstract: In this contribution, we present a numerical analysis of the continuous stochastic gradient (CSG) method, including applications from topology optimization and convergence rates. In contrast to standard stochastic gradient optimization schemes, CSG does not discard old gradient samples from previous iterations. Instead, design-dependent integration weights are calculated to form a convex combination that approximates the true gradient at the current design. As the approximation error vanishes in the course of the iterations, CSG represents a hybrid approach, starting off like a purely stochastic method and behaving like a full gradient scheme in the limit. In this work, the efficiency of CSG is demonstrated for practically relevant applications from topology optimization. These settings are characterized both by a large number of optimization variables and by an objective function whose evaluation requires the numerical computation of multiple integrals concatenated in a nonlinear fashion. Such problems could not be solved by any existing optimization method before. Lastly, with regard to convergence rates, first estimates are provided and confirmed with the help of numerical experiments.
Citations: 2
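The key mechanism is that every past gradient sample is kept and reweighted with design-dependent integration weights, so the convex combination approaches the full gradient as the iterates settle. The sketch below uses a simple nearest-neighbor (Voronoi) weighting on [0, 1] for an assumed toy objective E_u[(x − u)²] with u uniform; the problem, the weighting rule, and all names are illustrative simplifications, not the schemes analyzed in the papers:

```python
import random

# Assumed toy problem: minimize J(x) = E_u[(x - u)^2], u ~ U(0, 1),
# with exact gradient 2x - 1 and minimizer x* = 0.5.
def grad_sample(x, u):
    return 2.0 * (x - u)

def voronoi_weights(us):
    """Integration weights = lengths of the nearest-neighbor (Voronoi)
    cells of the sample points u_i in [0, 1]; they sum to one."""
    order = sorted(range(len(us)), key=lambda i: us[i])
    w = [0.0] * len(us)
    for pos, i in enumerate(order):
        lo = 0.0 if pos == 0 else 0.5 * (us[order[pos - 1]] + us[i])
        hi = 1.0 if pos == len(order) - 1 else 0.5 * (us[i] + us[order[pos + 1]])
        w[i] = hi - lo
    return w

def csg(x=0.0, iters=1000, tau=0.25, rng=None):
    rng = rng or random.Random(0)
    us, gs = [], []                    # the full sample history is reused
    for _ in range(iters):
        u = rng.random()
        us.append(u)
        gs.append(grad_sample(x, u))   # gradient sampled at the current design
        w = voronoi_weights(us)        # design-independent toy weighting
        g_hat = sum(wi * gi for wi, gi in zip(w, gs))
        x -= tau * g_hat               # step along the weighted combination
    return x
```

Because old samples computed at outdated designs keep shrinking weights, the estimate g_hat drifts toward the true gradient 2x − 1, which is the "stochastic at first, full gradient in the limit" behavior described in the abstract.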
The continuous stochastic gradient method: part I–convergence theory
IF 2.2 · Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-11-23 DOI: 10.1007/s10589-023-00542-8
Max Grieshammer, Lukas Pflug, Michael Stingl, Andrian Uihlein
Abstract: In this contribution, we present a full overview of the continuous stochastic gradient (CSG) method, including convergence results, step size rules and algorithmic insights. We consider optimization problems in which the objective function requires some form of integration, e.g., expected values. Since approximating the integration by a fixed quadrature rule can introduce artificial local solutions into the problem while simultaneously raising the computational effort, stochastic optimization schemes have become increasingly popular in such contexts. However, known stochastic gradient type methods are typically limited to expected risk functions and inherently require many iterations. The latter is particularly problematic if the evaluation of the cost function involves solving multiple state equations, given, e.g., in the form of partial differential equations. To overcome these drawbacks, a recent article introduced the CSG method, which reuses old gradient sample information via the calculation of design-dependent integration weights to obtain a better approximation to the full gradient. While in the original CSG paper convergence of a subsequence was established for a diminishing step size, here we provide a complete convergence analysis of CSG for constant step sizes and an Armijo-type line search. Moreover, new methods to obtain the integration weights are presented, extending the application range of CSG to problems involving higher-dimensional integrals and distributed data.
Citations: 1
Preface to Asen L. Dontchev Memorial Special Issue
Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-11-03 DOI: 10.1007/s10589-023-00537-5
William W. Hager, R. Tyrrell Rockafellar, Vladimir M. Veliov
Citations: 0
COAP 2022 Best Paper Prize
Zone 2 (Mathematics)
Computational Optimization and Applications Pub Date : 2023-10-30 DOI: 10.1007/s10589-023-00538-4
Citations: 0