{"title":"An efficient hybrid conjugate gradient method for unconstrained optimization","authors":"A. Ibrahim, P. Kumam, A. Kamandi, A. Abubakar","doi":"10.1080/10556788.2021.1998490","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a hybrid conjugate gradient method for unconstrained optimization, obtained by a convex combination of the LS and KMD conjugate gradient parameters. A favourite property of the proposed method is that the search direction satisfies the Dai–Liao conjugacy condition and the quasi-Newton direction. In addition, this property does not depend on the line search. Under a modified strong Wolfe line search, we establish the global convergence of the method. Numerical comparison using a set of 109 unconstrained optimization test problems from the CUTEst library show that the proposed method outperforms the Liu–Storey and Hager–Zhang conjugate gradient methods.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods and Software","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10556788.2021.1998490","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
In this paper, we propose a hybrid conjugate gradient method for unconstrained optimization, obtained by a convex combination of the LS and KMD conjugate gradient parameters. A favourable property of the proposed method is that its search direction satisfies the Dai–Liao conjugacy condition as well as the quasi-Newton direction; moreover, this property does not depend on the line search. Under a modified strong Wolfe line search, we establish the global convergence of the method. Numerical comparisons on a set of 109 unconstrained optimization test problems from the CUTEst library show that the proposed method outperforms the Liu–Storey and Hager–Zhang conjugate gradient methods.
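
The sketch below is not taken from the paper; it only illustrates the general shape of such a hybrid iteration, assuming a fixed convex-combination weight theta (the paper chooses the weight so that the resulting direction satisfies the Dai–Liao condition). The Liu–Storey (LS) formula is standard; the KMD parameter is not reproduced here, so a Hestenes–Stiefel-type quotient stands in as a labelled placeholder, and SciPy's standard strong Wolfe line search replaces the paper's modified variant.

import numpy as np
from scipy.optimize import line_search

def beta_ls(g_new, g_old, d_old):
    # Liu-Storey parameter: beta = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1})
    return g_new @ (g_new - g_old) / (-(d_old @ g_old))

def beta_kmd_placeholder(g_new, g_old, d_old):
    # Placeholder for the paper's KMD parameter (its formula is not given here);
    # a Hestenes-Stiefel-type quotient is used purely as a stand-in.
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    # Hybrid CG direction with beta = (1 - theta)*beta_LS + theta*beta_placeholder.
    # theta is fixed here; the paper determines the convex-combination weight adaptively.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Standard (strong) Wolfe line search from SciPy, in place of the modified variant.
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:
            # Line search failed: restart with a small steepest-descent step.
            d, alpha = -g, 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (1.0 - theta) * beta_ls(g_new, g, d) + theta * beta_kmd_placeholder(g_new, g, d)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small ill-conditioned quadratic test problem:
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(hybrid_cg(f, grad, np.ones(3)))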