{"title":"A Global Convergence of Spectral Conjugate Gradient Method for Large Scale Optimization","authors":"Ghada M. Al-Naemi","doi":"10.33899/edusj.2018.159323","DOIUrl":null,"url":null,"abstract":"In this paper, we are concerned with the conjugate gradient method for solving unconstrained optimization problems due to its simplicity and don’t store any matrices. We proposed two spectral modifications to the conjugate descent (CD). These two proposed methods produces sufficient descent directions for the objective function at every iteration with strong Wolfe line searches and with inexact line search, and also they are globally convergent for general non-convex functions can be guaranteed. Numerical results show the efficiency of these two proposed methods. A Global Convergence of Spectral Conjugate Gradient Method for Large ... 144 Introduction. Let R R f n → : be continuously differentiable function. Consider the unconstrained nonlinear optimization problem: Minimize f(x), n R x . (1) We use g(x) to denote to the gradient of f at x. Due to need less computer memory especially, conjugate gradient method is very appealing for solving (1) when the number of variables is large. A conjugate gradient (CG) method generates a sequence of iterates by letting 1 1 1 − − − + = k k k k d x x , k=0,1,2,... (2) where the step-length k is obtained by carrying out some line search, and the search direction k d is defined by + − = − = − 1 0 , , 1 k if d g k if g d k k k k k , (3) where k is scalar which determines the different CG methods [11]. There are many wellknown formula for k , such as the Fletcher-Reeves(FR) [7], Polak-Ribirere-Polyak (PRP) [13] and [14], Hesteness-Stiefel (HS) [10], conjugate descent (CD) [8], Liu-Story (LS) [12], and Dai-Yuan (DY) [5]. In survey paper Hager and Zhang in [9] reviewed the development of different various of nonlinear gradient methods, with especial attention given to global convergence properties. The standard CD method proposed by Fletcher [8], specifies the CD k by","PeriodicalId":15614,"journal":{"name":"JOURNAL OF EDUCATION AND SCIENCE","volume":"338 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2018-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JOURNAL OF EDUCATION AND SCIENCE","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.33899/edusj.2018.159323","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper we are concerned with the conjugate gradient method for solving unconstrained optimization problems, owing to its simplicity and to the fact that it stores no matrices. We propose two spectral modifications of the conjugate descent (CD) method. Both proposed methods produce sufficient descent directions for the objective function at every iteration under strong Wolfe and inexact line searches, and their global convergence for general non-convex functions can be guaranteed. Numerical results show the efficiency of the two proposed methods.

Introduction

Let $f:\mathbb{R}^n \to \mathbb{R}$ be a continuously differentiable function. Consider the unconstrained nonlinear optimization problem:

$$\min f(x), \quad x \in \mathbb{R}^n. \tag{1}$$

We use $g(x)$ to denote the gradient of $f$ at $x$. Because it requires little computer memory, the conjugate gradient method is very appealing for solving (1) when the number of variables is large. A conjugate gradient (CG) method generates a sequence of iterates by

$$x_{k+1} = x_k + \alpha_k d_k, \quad k = 0, 1, 2, \ldots, \tag{2}$$

where the step length $\alpha_k$ is obtained by carrying out some line search, and the search direction $d_k$ is defined by

$$d_k = \begin{cases} -g_k, & \text{if } k = 0, \\ -g_k + \beta_k d_{k-1}, & \text{if } k \ge 1, \end{cases} \tag{3}$$

where $\beta_k$ is a scalar that determines the particular CG method [11]. There are many well-known formulas for $\beta_k$, such as Fletcher-Reeves (FR) [7], Polak-Ribière-Polyak (PRP) [13] and [14], Hestenes-Stiefel (HS) [10], conjugate descent (CD) [8], Liu-Storey (LS) [12], and Dai-Yuan (DY) [5]. In their survey paper, Hager and Zhang [9] reviewed the development of the various nonlinear conjugate gradient methods, with special attention given to global convergence properties. The standard CD method proposed by Fletcher [8] specifies $\beta_k^{CD}$ by

$$\beta_k^{CD} = -\frac{\|g_k\|^{2}}{d_{k-1}^{T} g_{k-1}}. \tag{4}$$
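To make the scheme concrete, below is a minimal Python sketch of the generic CG iteration (2)-(3) with Fletcher's CD formula (4), delegating the strong Wolfe line search to SciPy. It illustrates only the standard CD method described above, not the paper's two spectral modifications; the function name `cg_cd`, the tolerance, and the fallback step on a failed line search are illustrative choices, not taken from the source.

```python
# Minimal sketch of nonlinear CG with Fletcher's CD beta and a
# strong Wolfe line search (SciPy's line_search). Illustrative only.
import numpy as np
from scipy.optimize import line_search

def cg_cd(f, grad, x0, tol=1e-6, max_iter=1000):
    """Conjugate gradient method with the CD formula for beta_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # d_0 = -g_0, Eq. (3)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:          # stop when the gradient is small
            break
        # strong Wolfe line search for the step length alpha_k
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                     # line search failed; small fallback step
            alpha = 1e-4
        x_new = x + alpha * d                 # Eq. (2)
        g_new = grad(x_new)
        # Fletcher's CD beta: beta_k = -||g_k||^2 / (d_{k-1}^T g_{k-1}), Eq. (4)
        beta = -g_new.dot(g_new) / d.dot(g)
        d = -g_new + beta * d                 # Eq. (3), k >= 1 branch
        x, g = x_new, g_new
    return x
```

The sketch can be exercised on SciPy's Rosenbrock test function, e.g. `cg_cd(rosen, rosen_der, np.zeros(4))` with `rosen` and `rosen_der` imported from `scipy.optimize`. Note that the CD denominator $d_{k-1}^{T} g_{k-1}$ is negative along a descent direction, so $\beta_k^{CD} \ge 0$ whenever the line search succeeds.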