{"title":"An efficient hybrid conjugate gradient method for unconstrained optimization","authors":"A. Ibrahim, P. Kumam, A. Kamandi, A. Abubakar","doi":"10.1080/10556788.2021.1998490","DOIUrl":"https://doi.org/10.1080/10556788.2021.1998490","url":null,"abstract":"In this paper, we propose a hybrid conjugate gradient method for unconstrained optimization, obtained by a convex combination of the LS and KMD conjugate gradient parameters. A favourite property of the proposed method is that the search direction satisfies the Dai–Liao conjugacy condition and the quasi-Newton direction. In addition, this property does not depend on the line search. Under a modified strong Wolfe line search, we establish the global convergence of the method. Numerical comparison using a set of 109 unconstrained optimization test problems from the CUTEst library show that the proposed method outperforms the Liu–Storey and Hager–Zhang conjugate gradient methods.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130766390","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimal order multigrid preconditioners for the distributed control of parabolic equations with coarsening in space and time","authors":"Andrei Draganescu, M. Hajghassem","doi":"10.1080/10556788.2021.2022145","DOIUrl":"https://doi.org/10.1080/10556788.2021.2022145","url":null,"abstract":"We devise multigrid preconditioners for linear-quadratic space-time distributed parabolic optimal control problems. While our method is rooted in earlier work on elliptic control, the temporal dimension presents new challenges in terms of algorithm design and quality. Our primary focus is on the cG(s)dG(r) discretizations which are based on functions that are continuous in space and discontinuous in time, but our technique is applicable to various other space-time finite element discretizations. We construct and analyse two kinds of multigrid preconditioners: the first is based on full coarsening in space and time, while the second is based on semi-coarsening in space only. Our analysis, in conjunction with numerical experiments, shows that both preconditioners are of optimal order with respect to the discretization in case of cG(1)dG(r) for r = 0, 1 and exhibit a suboptimal behaviour in time for Crank–Nicolson. We also show that, under certain conditions, the preconditioner using full space-time coarsening is more efficient than the one involving semi-coarsening in space, a phenomenon that has not been observed previously. Our numerical results confirm the theoretical findings.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114538229","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods","authors":"M. Marques Alves","doi":"10.1080/10556788.2021.2022148","DOIUrl":"https://doi.org/10.1080/10556788.2021.2022148","url":null,"abstract":"For solving strongly convex optimization problems, we propose and study the global convergence of variants of the accelerated hybrid proximal extragradient (A-HPE) and large-step A-HPE algorithms of R.D.C. Monteiro and B.F. Svaiter [An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim. 23 (2013), pp. 1092–1125.]. We prove linear and the superlinear global rates for the proposed variants of the A-HPE and large-step A-HPE methods, respectively. The parameter appears in the (high-order) large-step condition of the new large-step A-HPE algorithm. We apply our results to high-order tensor methods, obtaining a new inexact (relative-error) tensor method for (smooth) strongly convex optimization with iteration-complexity . In particular, for p = 2, we obtain an inexact proximal-Newton algorithm with fast global convergence rate.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115081548","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Solution approaches for the vehicle routing problem with occasional drivers and time windows","authors":"L. Pugliese, D. Ferone, P. Festa, F. Guerriero, Giusy Macrina","doi":"10.1080/10556788.2021.2022142","DOIUrl":"https://doi.org/10.1080/10556788.2021.2022142","url":null,"abstract":"The efficient management of last-mile delivery is one of the main challenges faced by on-line retailers and logistic companies. The main aim is to offer personalized delivery services, that meet speed, flexibility, and control requirements and try to reduce environmental impacts as well. Crowd-sourced shipping is an emerging strategy that can be used to optimize the last-mile delivery process. The main idea is to deliver packages to customers with the aid of non-professional couriers, called occasional drivers. In this paper, we address the vehicle routing problem with occasional drivers, time window constraints and multiple deliveries. To handle this problem, we design some greedy randomized adaptive search procedures (GRASP). In order to assess the behaviour of the proposed algorithms, computational experiments are carried out on benchmark instances and new generated test sets. A comparison with previous published approaches, tailored for the problem at hand, is also provided. The numerical results are very encouraging and highlight the superiority, in terms of both efficiency and effectiveness, of the proposed GRASP algorithms.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114350151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A quasi-Newton proximal bundle method using gradient sampling technique for minimizing nonsmooth convex functions","authors":"M. Maleknia, M. Shamsi","doi":"10.1080/10556788.2021.2023522","DOIUrl":"https://doi.org/10.1080/10556788.2021.2023522","url":null,"abstract":"This study aims to merge the well-established ideas of bundle and Gradient Sampling (GS) methods to develop an algorithm for locating a minimizer of a nonsmooth convex function. In the proposed method, with the help of the GS technique, we sample a number of differentiable auxiliary points around the current iterate. Then, by applying the standard techniques used in bundle methods, we construct a polyhedral (piecewise linear) model of the objective function. Moreover, by performing quasi-Newton updates on the set of auxiliary points, this polyhedral model is augmented with a regularization term that enjoys second-order information. If required, this initial model is improved by the techniques frequently used in GS and bundle methods. We analyse the global convergence of the proposed method. As opposed to the original GS method and some of its variants, our convergence analysis is independent of the size of the sample. In our numerical experiments, various aspects of the proposed method are examined using a variety of test problems. In particular, in contrast with many variants of bundle methods, we will see that the user can supply gradients approximately. Moreover, we compare the proposed method with some efficient variants of GS and bundle methods.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124663170","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improved high-dimensional regression models with matrix approximations applied to the comparative case studies with support vector machines","authors":"M. Roozbeh, S. Babaie-Kafaki, Z. Aminifard","doi":"10.1080/10556788.2021.2022144","DOIUrl":"https://doi.org/10.1080/10556788.2021.2022144","url":null,"abstract":"Nowadays, high-dimensional data appear in many practical applications such as biosciences. In the regression analysis literature, the well-known ordinary least-squares estimation may be misleading when the full ranking of the design matrix is missed. As a popular issue, outliers may corrupt normal distribution of the residuals. Thus, since not being sensitive to the outlying data points, robust estimators are frequently applied in confrontation with the issue. Ill-conditioning in high-dimensional data is another common problem in modern regression analysis under which applying the least-squares estimator is hardly possible. So, it is necessary to deal with estimation methods to tackle these problems. As known, a successful approach for high-dimension cases is the penalized scheme with the aim of obtaining a subset of effective explanatory variables that predict the response as the best, while setting the other parameters to zero. Here, we develop several penalized mixed-integer nonlinear programming models to be used in high-dimension regression analysis. The given matrix approximations have simple structures, decreasing computational cost of the models. Moreover, the models are effectively solvable by metaheuristic algorithms. Numerical tests are made to shed light on performance of the proposed methods on simulated and real world high-dimensional data sets.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121494629","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stochastic block projection algorithms with extrapolation for convex feasibility problems","authors":"I. Necoara","doi":"10.1080/10556788.2021.1998492","DOIUrl":"https://doi.org/10.1080/10556788.2021.1998492","url":null,"abstract":"The stochastic alternating projection (SP) algorithm is a simple but powerful approach for solving convex feasibility problems. At each step, the method projects the current iterate onto a random individual set from the intersection. Hence, it has simple iteration, but, usually, convergences slowly. In this paper, we develop accelerated variants of basic SP method. We achieve acceleration using two ingredients: blocks of sets and adaptive extrapolation. We propose SP-based algorithms based on extrapolated iterations of convex combinations of projections onto block of sets. Our approach is based on several new ideas and tools, including stochastic selection rules for the blocks, stochastic conditioning of feasibility problem, and novel strategies for designing adaptive extrapolated stepsizes. We prove that, under linear regularity of the sets, our stochastic block projection algorithms converge linearly in expectation, with a rate depending on the condition number of the (block) feasibility problem and on the size of the blocks. Otherwise, we prove that our methods converge sublinearly. Our convergence analysis reveals that such algorithms are most effective when a good sampling of the sets into well-conditioned blocks is given. The convergence rates also explain when algorithms combining block projections with adaptive extrapolation work better than their nonextrapolated variants.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125672742","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A dual active-set proximal Newton algorithm for sparse approximation of correlation matrices","authors":"X. Liu, Chungen Shen, Li Wang","doi":"10.1080/10556788.2021.1998491","DOIUrl":"https://doi.org/10.1080/10556788.2021.1998491","url":null,"abstract":"In this paper, we propose a novel dual active-set algorithm that is based on proximal gradient and semi-smooth Newton iterations for the sparse approximation of correlation matrices in the Frobenius norm. A new dual formulation with upper and lower bounds is derived. To solve the dual, the proximal gradient method is developed to guarantee global convergence. Also, it provides information to estimate active/inactive constraints. Then, the semi-smooth Newton method is applied to accelerate the convergence of the proximal gradient method, which is the key ingredient of our algorithm. It is shown that the proposed algorithm for the dual is globally convergent under certain conditions. Some preliminary numerical results are given to illustrate the effectiveness of our algorithm on synthetic and real data sets.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125820389","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An efficient semismooth Newton method for adaptive sparse signal recovery problems","authors":"Yanyun Ding, Haibin Zhang, P. Li, Yunhai Xiao","doi":"10.1080/10556788.2022.2120983","DOIUrl":"https://doi.org/10.1080/10556788.2022.2120983","url":null,"abstract":"We know that compressive sensing can establish stable sparse recovery results from highly undersampled data under a restricted isometry property condition. In reality, however, numerous problems are coherent, and vast majority conventional methods might work not so well. Recently, it was shown that using the difference between - and -norm as a regularization always has superior performance. In this paper, we consider an adaptive - model where the -norm with measures the data fidelity and the -term measures the sparsity. This proposed model has the ability to deal with different types of noises and extract the sparse property even under high coherent condition. We use a proximal majorization-minimization technique to handle the non-convex regularization term and then employ a semismooth Newton method to solve the corresponding convex relaxation subproblem. We prove that the sequence generated by the semismooth Newton method admits fast local convergence rate to the subproblem under some technical assumptions. Finally, we do some numerical experiments to demonstrate the superiority of the proposed model and the progressiveness of the proposed algorithm.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116904701","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributionally robust expected residual minimization for stochastic variational inequality problems","authors":"A. Hori, Yuya Yamakawa, N. Yamashita","doi":"10.1080/10556788.2023.2167995","DOIUrl":"https://doi.org/10.1080/10556788.2023.2167995","url":null,"abstract":"The stochastic variational inequality problem (SVIP) is an equilibrium model that includes random variables and has been widely applied in various fields such as economics and engineering. Expected residual minimization (ERM) is an established model for obtaining a reasonable solution for the SVIP, and its objective function is an expected value of a suitable merit function for the SVIP. However, the ERM is restricted to the case where the distribution is known in advance. We extend the ERM to ensure the attainment of robust solutions for the SVIP under the uncertainty distribution (the extended ERM is referred to as distributionally robust expected residual minimization (DRERM), where the worst-case distribution is derived from the set of probability measures in which the expected value and variance take the same sample mean and variance, respectively). Under suitable assumptions, we demonstrate that the DRERM can be reformulated as a deterministic convex nonlinear semidefinite programming to avoid numerical integration.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128480949","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}