{"title":"无约束优化中最速下降法和共轭梯度法线搜索条件组合的基准研究","authors":"K. Kiran","doi":"10.17535/crorr.2022.0006","DOIUrl":null,"url":null,"abstract":"In this paper, it is aimed to computationally conduct a performance benchmarking for the steepest descent and the three well-known conjugate gradient methods (i.e., Fletcher-Reeves, Polak-Ribiere and Hestenes-Stiefel) along with six different step length calculation techniques/conditions, namely Backtracking, Armijo-Backtracking, Goldstein, weakWolfe, strongWolfe, Exact local minimizer in the unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using the combinations of those optimization methods and line search conditions. During these experiments, the number of function evaluations for every iteration are monitored and recorded for all the optimization method-line search condition combinations. The total number of function evaluations are then set a performance measure when the combination in question converges to the functions minimums within the given convergence tolerance. Through those data, the performance and data profiles are created for all the optimization method-line search condition combinations with the purpose of a reliable and an efficient benchmarking. It has been determined that, for this test function set, the steepest descent-Goldstein combination is the fastest one whereas the steepest descent-exact local minimizer is the most robust one with a high convergence accuracy. By making a trade-off between convergence speed and robustness, it has been identified that the steepest descent-weak Wolfe combination is the optimal choice for this test function set.","PeriodicalId":44065,"journal":{"name":"Croatian Operational Research Review","volume":" ","pages":""},"PeriodicalIF":0.5000,"publicationDate":"2022-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization\",\"authors\":\"K. Kiran\",\"doi\":\"10.17535/crorr.2022.0006\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, it is aimed to computationally conduct a performance benchmarking for the steepest descent and the three well-known conjugate gradient methods (i.e., Fletcher-Reeves, Polak-Ribiere and Hestenes-Stiefel) along with six different step length calculation techniques/conditions, namely Backtracking, Armijo-Backtracking, Goldstein, weakWolfe, strongWolfe, Exact local minimizer in the unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using the combinations of those optimization methods and line search conditions. During these experiments, the number of function evaluations for every iteration are monitored and recorded for all the optimization method-line search condition combinations. The total number of function evaluations are then set a performance measure when the combination in question converges to the functions minimums within the given convergence tolerance. Through those data, the performance and data profiles are created for all the optimization method-line search condition combinations with the purpose of a reliable and an efficient benchmarking. 
It has been determined that, for this test function set, the steepest descent-Goldstein combination is the fastest one whereas the steepest descent-exact local minimizer is the most robust one with a high convergence accuracy. By making a trade-off between convergence speed and robustness, it has been identified that the steepest descent-weak Wolfe combination is the optimal choice for this test function set.\",\"PeriodicalId\":44065,\"journal\":{\"name\":\"Croatian Operational Research Review\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.5000,\"publicationDate\":\"2022-07-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Croatian Operational Research Review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.17535/crorr.2022.0006\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ECONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Croatian Operational Research Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.17535/crorr.2022.0006","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ECONOMICS","Score":null,"Total":0}
A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
Abstract: In this paper, a computational performance benchmark is conducted for the steepest descent method and three well-known conjugate gradient methods (Fletcher-Reeves, Polak-Ribiere and Hestenes-Stiefel), each combined with six different step-length calculation techniques/conditions, namely backtracking, Armijo backtracking, Goldstein, weak Wolfe, strong Wolfe, and the exact local minimizer, in unconstrained optimization. To this end, a series of computational experiments on a test function set is carried out using the combinations of these optimization methods and line search conditions. During these experiments, the number of function evaluations per iteration is monitored and recorded for every optimization method-line search condition combination. The total number of function evaluations is then taken as the performance measure whenever the combination in question converges to a function's minimum within the given convergence tolerance. From these data, performance and data profiles are created for all the optimization method-line search condition combinations in order to obtain a reliable and efficient benchmark. It has been determined that, for this test function set, the steepest descent-Goldstein combination is the fastest one, whereas the steepest descent-exact local minimizer combination is the most robust one with high convergence accuracy. By trading off convergence speed against robustness, the steepest descent-weak Wolfe combination is identified as the optimal choice for this test function set.
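To make the benchmarked ingredients concrete, the following is a minimal, hypothetical Python sketch (not the author's benchmark code) of one such method-line search pairing: a descent iteration whose direction comes from steepest descent or one of the three nonlinear conjugate gradient updates (Fletcher-Reeves, Polak-Ribiere, Hestenes-Stiefel), combined here with a simple backtracking line search that enforces the Armijo sufficient-decrease condition. Function names, parameter values and the stopping tolerance are illustrative assumptions, and the Rosenbrock function merely stands in for the paper's test function set.

import numpy as np

def backtracking(f, x, g, d, alpha=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until f(x + alpha*d) satisfies the Armijo sufficient-decrease condition."""
    fx = f(x)
    while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
        alpha *= rho
    return alpha

def minimize(f, grad, x0, method="FR", tol=1e-6, max_iter=1000):
    """Steepest descent (method='SD') or nonlinear CG ('FR', 'PR', 'HS') with backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction is always the steepest descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, x, g, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if method == "SD":
            beta = 0.0
        elif method == "FR":        # Fletcher-Reeves
            beta = g_new.dot(g_new) / g.dot(g)
        elif method == "PR":        # Polak-Ribiere (clipped at zero, the common PR+ variant)
            beta = max(0.0, g_new.dot(y) / g.dot(g))
        elif method == "HS":        # Hestenes-Stiefel
            beta = g_new.dot(y) / d.dot(y)
        d = -g_new + beta * d
        if d.dot(g_new) >= 0:       # safeguard: restart if the new direction is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example on the 2-D Rosenbrock function, a classic unconstrained test problem.
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(minimize(f, grad, [-1.2, 1.0], method="PR"))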
About the journal:
Croatian Operational Research Review (CRORR) is a journal that publishes original scientific papers in the area of operational research. Its purpose is to publish papers on various aspects of operational research (OR) with the aim of presenting scientific ideas that contribute both to the theoretical development and the practical application of OR. The scope of the journal covers the following subject areas: linear and non-linear programming, integer programming, combinatorial and discrete optimization, multi-objective programming, stochastic models and optimization, scheduling, macroeconomics, economic theory, game theory, statistics and econometrics, marketing and data analysis, information and decision support systems, banking, finance, insurance, environment, energy, health, neural networks and fuzzy systems, control theory, simulation, practical OR and applications. The audience includes both researchers and practitioners from the areas of operations research, applied mathematics, statistics, econometrics, intelligent methods, simulation, and other areas included in the above list of topics. The journal has an international board of editors consisting of more than 30 editors, university professors from Croatia, Slovenia, the USA, Italy, Germany, Austria and other countries.