{"title":"A comparison of Quasi-Newton methods considering line search conditions in unconstrained minimization","authors":"K. Kiran","doi":"10.1080/02522667.2022.2103303","DOIUrl":null,"url":null,"abstract":"Abstract Optimization algorithms are essential tools in all fields such as engineering, business, science and even in nature. Their performance and efficiency may differ from problem-to-problem. In this respect, the current paper aims to provide an understanding of the three well-known Quasi-Newton methods relation with various line search conditions on a performance basis. To this end, the fifteen Quasi-Newton method-line search condition combinations, which are composed of Symmetric-Rank-1 (SR-1), Broyden-Fletcher-Goldfarb-Shanno (BFGS) and Davidon-Fletcher-Powell (DFP) Quasi-Newton methods and Backtracking (BC), Armijo-Backtracking (ABC), Goldstein (GC), Weak Wolfe (WWC) and Strong Wolfe (SWC) conditions, are subjected to totally 195 computational experiments using thirteen test functions including smooth and non-smooth ones. During the experiments, the number of function evaluations at every iteration for all the combinations are kept track and the total number of function evaluations, when each combination converges, are set as a first performance metric. In addition to that, the eigenvalues of Hessian approximation matrix are checked if the updated matrix is positive definite or not. In case of having a negative definite Hessian approximation, the eigenvalue modification procedure is employed to guarantee that the updated matrix is positive definite, which in turn progress along the descent direction. Besides, for purpose of using in the performance evaluations, the number of negative definite matrix generations, as a second performance metric, are recorded for all the combinations. To conduct a reliable and an efficient performance analysis on the combinations, the performance and data profiles are used along with the number of negative definite Hessian matrix update generations. Based on the mathematical analysis through those data, the BFGS-GC combination is the fastest one whereas the slowest one is the DFP-SWC combination. For an optimal choice, it is determined that the BFGS-WWC combination is a great candidate. It is also observed that the BFGS and DFP methods operate very well as long as the curvature condition is satisfied by any line search conditions. However, this statement is not valid for the SR-1 method.","PeriodicalId":46518,"journal":{"name":"JOURNAL OF INFORMATION & OPTIMIZATION SCIENCES","volume":null,"pages":null},"PeriodicalIF":1.1000,"publicationDate":"2022-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JOURNAL OF INFORMATION & OPTIMIZATION SCIENCES","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/02522667.2022.2103303","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Abstract
Optimization algorithms are essential tools in fields such as engineering, business and science, and they even appear in nature. Their performance and efficiency may differ from problem to problem. In this respect, the current paper aims to provide a performance-based understanding of how three well-known Quasi-Newton methods behave with various line search conditions. To this end, fifteen Quasi-Newton method-line search condition combinations, formed from the Symmetric-Rank-1 (SR-1), Broyden-Fletcher-Goldfarb-Shanno (BFGS) and Davidon-Fletcher-Powell (DFP) Quasi-Newton methods and the Backtracking (BC), Armijo-Backtracking (ABC), Goldstein (GC), Weak Wolfe (WWC) and Strong Wolfe (SWC) conditions, are subjected to a total of 195 computational experiments on thirteen test functions, including smooth and non-smooth ones. During the experiments, the number of function evaluations at every iteration is tracked for all combinations, and the total number of function evaluations at convergence is taken as the first performance metric. In addition, the eigenvalues of the Hessian approximation matrix are checked to determine whether the updated matrix is positive definite. When a negative definite Hessian approximation arises, an eigenvalue modification procedure is employed to guarantee that the updated matrix is positive definite, which in turn ensures progress along a descent direction. For use in the performance evaluations, the number of negative definite matrix generations is recorded for all combinations as a second performance metric. To conduct a reliable and efficient performance analysis of the combinations, performance and data profiles are used together with the number of negative definite Hessian update generations. Based on the mathematical analysis of these data, the BFGS-GC combination is the fastest, whereas the DFP-SWC combination is the slowest. As an overall choice, the BFGS-WWC combination is found to be a strong candidate. It is also observed that the BFGS and DFP methods perform very well as long as the curvature condition is satisfied by the line search condition; however, this does not hold for the SR-1 method.
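To make two of the checks mentioned in the abstract concrete, the following Python sketch (not taken from the paper; the function names, the eigenvalue floor delta and the constants c1, c2 are illustrative assumptions) shows a typical eigenvalue modification that restores positive definiteness of a Hessian approximation, and a test of the Weak Wolfe conditions for a candidate step length.

```python
import numpy as np

def make_positive_definite(B, delta=1e-6):
    # Eigenvalue modification: clamp non-positive eigenvalues of the
    # symmetric Hessian approximation B so that -inv(B) @ grad defines a
    # descent direction. delta is an assumed lower bound on the eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(B)
    if np.all(eigvals > 0):
        return B  # already positive definite, no modification needed
    clamped = np.maximum(eigvals, delta)
    return eigvecs @ np.diag(clamped) @ eigvecs.T

def weak_wolfe_satisfied(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    # Weak Wolfe conditions for a trial step length alpha along direction p:
    # (i) sufficient decrease (Armijo) and (ii) the curvature condition.
    gx = grad(x)
    x_new = x + alpha * p
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * (gx @ p)
    curvature = grad(x_new) @ p >= c2 * (gx @ p)
    return sufficient_decrease and curvature
```

In a BFGS- or DFP-style run, a check like make_positive_definite would be applied after each Hessian approximation update, and a step length would be accepted only when the chosen line search condition (here, Weak Wolfe) is satisfied.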