{"title":"The appeals of quadratic majorization–minimization","authors":"Marc C. Robini, Lihui Wang, Yuemin Zhu","doi":"10.1007/s10898-023-01361-1","DOIUrl":"https://doi.org/10.1007/s10898-023-01361-1","url":null,"abstract":"<p>Majorization–minimization (MM) is a versatile optimization technique that operates on surrogate functions satisfying tangency and domination conditions. Our focus is on differentiable optimization using inexact MM with quadratic surrogates, which amounts to approximately solving a sequence of symmetric positive definite systems. We begin by investigating the convergence properties of this process, from subconvergence to R-linear convergence, with emphasis on tame objectives. Then we provide a numerically stable implementation based on truncated conjugate gradient. Applications to multidimensional scaling and regularized inversion are discussed and illustrated through numerical experiments on graph layout and X-ray tomography. In the end, quadratic MM not only offers solid guarantees of convergence and stability, but is robust to the choice of its control parameters.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139583595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Generalized derivatives of optimal-value functions with parameterized convex programs embedded","authors":"Yingkai Song, Paul I. Barton","doi":"10.1007/s10898-023-01359-9","DOIUrl":"https://doi.org/10.1007/s10898-023-01359-9","url":null,"abstract":"<p>This article proposes new practical methods for furnishing generalized derivative information of optimal-value functions with embedded parameterized convex programs, with potential applications in nonsmooth equation-solving and optimization. We consider three cases of parameterized convex programs: (1) partial convexity—functions in the convex programs are convex with respect to decision variables for fixed values of parameters, (2) joint convexity—the functions are convex with respect to both decision variables and parameters, and (3) linear programs where the parameters appear in the objective function. These new methods calculate an LD-derivative, which is a recently established useful generalized derivative concept, by constructing and solving a sequence of auxiliary linear programs. In the general partial convexity case, our new method requires that the strong Slater conditions are satisfied for the embedded convex program’s decision space, and requires that the convex program has a unique optimal solution. It is shown that these conditions are essentially less stringent than the regularity conditions required by certain established methods, and our new method is at the same time computationally preferable over these methods. In the joint convexity case, the uniqueness requirement of an optimal solution is further relaxed, and to our knowledge, there is no established method for computing generalized derivatives prior to this work. In the linear program case, both the Slater conditions and the uniqueness of an optimal solution are not required by our new method.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139560266","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A strong P-formulation for global optimization of industrial water-using and treatment networks","authors":"Xin Cheng, Xiang Li","doi":"10.1007/s10898-023-01363-z","DOIUrl":"https://doi.org/10.1007/s10898-023-01363-z","url":null,"abstract":"<p>The problem of finding the optimal flow allocation within an industrial water-using and treatment network can be formulated into nonconvex nonlinear program or nonconvex mixed-integer nonlinear program. The efficiency of global optimization of the nonconvex program relies heavily on the strength of the problem formulation. In this paper, we propose a variant of the commonly used P-formulation, called the P<span>(^*)</span>-formulation, for the water treatment network (WTN) and the total water network (TWN) that includes water-using and water treatment units. For either type of networks, we prove that the P<span>(^*)</span>-formulation is at least as strong as the P-formulation under mild bound consistency conditions. We also prove for either type of networks that the P<span>(^*)</span>-formulation is at least as strong as the split-fraction based formulation (called SF-formulation) under certain bound consistency conditions. The computational study shows that the P<span>(^*)</span>-formulation significantly outperforms the P- and the SF-formulations. For some problem instances, the P<span>(^*)</span>-formulation is faster than the other two formulations by several orders of magnitudes.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139560375","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"First- and second-order optimality conditions of nonsmooth sparsity multiobjective optimization via variational analysis","authors":"Jiawei Chen, Huasheng Su, Xiaoqing Ou, Yibing Lv","doi":"10.1007/s10898-023-01357-x","DOIUrl":"https://doi.org/10.1007/s10898-023-01357-x","url":null,"abstract":"<p>In this paper, we investigate optimality conditions of nonsmooth sparsity multiobjective optimization problem (shortly, SMOP) by the advanced variational analysis. We present the variational analysis characterizations, such as tangent cones, normal cones, dual cones and second-order tangent set, of the sparse set, and give the relationships among the sparse set and its tangent cones and second-order tangent set. The first-order necessary conditions for local weakly Pareto efficient solution of SMOP are established under some suitable conditions. We also obtain the equivalence between basic feasible point and stationary point defined by the Fréchet normal cone of SMOP. The sufficient optimality conditions of SMOP are derived under the pseudoconvexity. Moreover, the second-order necessary and sufficient optimality conditions of SMOP are established by the Dini directional derivatives of the objective function and the Bouligand tangent cone and second-order tangent set of the sparse set.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139514724","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interval constraint programming for globally solving catalog-based categorical optimization","authors":"","doi":"10.1007/s10898-023-01362-0","DOIUrl":"https://doi.org/10.1007/s10898-023-01362-0","url":null,"abstract":"<h3>Abstract</h3> <p>In this article, we propose an interval constraint programming method for globally solving catalog-based categorical optimization problems. It supports catalogs of arbitrary size and properties of arbitrary dimension, and does not require any modeling effort from the user. A novel catalog-based contractor (or filtering operator) guarantees consistency between the categorical properties and the existing catalog items. This results in an intuitive and generic approach that is exact, rigorous (robust to roundoff errors) and can be easily implemented in an off-the-shelf interval-based continuous solver that interleaves branching and constraint propagation. We demonstrate the validity of the approach on a numerical problem in which a categorical variable is described by a two-dimensional property space. A Julia prototype is available as open-source software under the MIT license at https://github.com/cvanaret/CateGOrical.jl.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139518719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gaining or losing perspective for convex multivariate functions on a simplex","authors":"Luze Xu, Jon Lee","doi":"10.1007/s10898-023-01356-y","DOIUrl":"https://doi.org/10.1007/s10898-023-01356-y","url":null,"abstract":"<p>MINLO (mixed-integer nonlinear optimization) formulations of the disjunction between the origin and a polytope via a binary indicator variable have broad applicability in nonlinear combinatorial optimization, for modeling a fixed cost <i>c</i> associated with carrying out a set of <i>d</i> activities and a convex variable cost function <i>f</i> associated with the levels of the activities. The perspective relaxation is often used to solve such models to optimality in a branch-and-bound context, especially in the context in which <i>f</i> is univariate (e.g., in Markowitz-style portfolio optimization). But such a relaxation typically requires conic solvers and are typically not compatible with general-purpose NLP software which can accommodate additional classes of constraints. This motivates the study of weaker relaxations to investigate when simpler relaxations may be adequate. Comparing the volume (i.e., Lebesgue measure) of the relaxations as means of comparing them, we lift some of the results related to univariate functions <i>f</i> to the multivariate case. Along the way, we survey, connect and extend relevant results on integration over a simplex, some of which we concretely employ, and others of which can be used for further exploration on our main subject.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139514938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust second order cone conditions and duality for multiobjective problems under uncertainty data","authors":"Cao Thanh Tinh, Thai Doan Chuong","doi":"10.1007/s10898-023-01335-3","DOIUrl":"https://doi.org/10.1007/s10898-023-01335-3","url":null,"abstract":"<p>This paper studies a class of multiobjective convex polynomial problems, where both the constraint and objective functions involve uncertain parameters that reside in ellipsoidal uncertainty sets. Employing the robust deterministic approach, we provide necessary conditions and sufficient conditions, which are exhibited in relation to second order cone conditions, for robust (weak) Pareto solutions of the uncertain multiobjective optimization problem. A dual multiobjective problem is proposed to examine robust converse, robust weak and robust strong duality relations between the primal and dual problems. Moreover, we establish robust solution relationships between the uncertain multiobjective optimization program and a (scalar) second order cone programming relaxation problem of a corresponding weighted-sum optimization problem. This in particular shows that we can find a robust (weak) Pareto solution of the uncertain multiobjective optimization problem by solving a second order cone programming relaxation.\u0000</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139518645","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Single-lot, lot-streaming problem for a 1 + m hybrid flow shop","authors":"Sanchit Singh, Subhash C. Sarin, Ming Cheng","doi":"10.1007/s10898-023-01354-0","DOIUrl":"https://doi.org/10.1007/s10898-023-01354-0","url":null,"abstract":"<p>In this paper, we consider an application of lot-streaming for processing a lot of multiple items in a hybrid flow shop (HFS) for the objective of minimizing makespan. The HFS that we consider consists of two stages with a single machine available for processing in Stage 1 and <i>m</i> identical parallel machines in Stage 2. We call this problem a 1 + <i>m</i> TSHFS-LSP (two-stage hybrid flow shop, lot streaming problem), and show it to be NP-hard in general, except for the case when the sublot sizes are treated to be continuous. The novelty of our work is in obtaining closed-form expressions for optimal continuous sublot sizes that can be solved in polynomial time, for a given number of sublots. A fast linear search algorithm is also developed for determining the optimal number of sublots for the case of continuous sublot sizes. For the case when the sublot sizes are discrete, we propose a branch-and-bound-based heuristic to determine both the number of sublots and sublot sizes and demonstrate its efficacy by comparing its performance against that of a direct solution of a mixed-integer formulation of the problem by CPLEX<sup>®</sup>.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139458859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On semidefinite programming relaxations for a class of robust SOS-convex polynomial optimization problems","authors":"Xiangkai Sun, Jiayi Huang, Kok Lay Teo","doi":"10.1007/s10898-023-01353-1","DOIUrl":"https://doi.org/10.1007/s10898-023-01353-1","url":null,"abstract":"<p>In this paper, we deal with a new class of SOS-convex (sum of squares convex) polynomial optimization problems with spectrahedral uncertainty data in both the objective and constraints. By using robust optimization and a weighted-sum scalarization methodology, we first present the relationship between robust solutions of this uncertain SOS-convex polynomial optimization problem and that of its corresponding scalar optimization problem. Then, by using a normal cone constraint qualification condition, we establish necessary and sufficient optimality conditions for robust weakly efficient solutions of this uncertain SOS-convex polynomial optimization problem based on scaled diagonally dominant sums of squares conditions and linear matrix inequalities. Moreover, we introduce a semidefinite programming relaxation problem of its weighted-sum scalar optimization problem, and show that robust weakly efficient solutions of the uncertain SOS-convex polynomial optimization problem can be found by solving the corresponding semidefinite programming relaxation problem.\u0000</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139373847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spectral projected subgradient method with a 1-memory momentum term for constrained multiobjective optimization problem","authors":"Jing-jing Wang, Li-ping Tang, Xin-min Yang","doi":"10.1007/s10898-023-01349-x","DOIUrl":"https://doi.org/10.1007/s10898-023-01349-x","url":null,"abstract":"<p>In this paper, we propose a spectral projected subgradient method with a 1-memory momentum term for solving constrained convex multiobjective optimization problem. This method combines the subgradient-type algorithm for multiobjective optimization problems with the idea of the spectral projected algorithm to accelerate the convergence process. Additionally, a 1-memory momentum term is added to the subgradient direction in the early iterations. The 1-memory momentum term incorporates, in the present iteration, some of the influence of the past iterations, and this can help to improve the search direction. Under suitable assumptions, we show that the sequence generated by the method converges to a weakly Pareto efficient solution and derive the sublinear convergence rates for the proposed method. Finally, computational experiments are given to demonstrate the effectiveness of the proposed method.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2024-01-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139373808","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}