CONJUGATE GRADIENT WITH SUBSPACE MINIMIZATION BASED ON CUBIC REGULARIZATION MODEL OF THE MINIMIZING FUNCTION

N. Andrei

Annals Series on History and Archaeology Academy of Romanian Scientists, vol. 280, no. 1, 2022-01-01

DOI: 10.56082/annalsarsciinfo.2022.1-2.28 (https://doi.org/10.56082/annalsarsciinfo.2022.1-2.28)
Citations: 0
Abstract
A new algorithm for unconstrained optimization is developed, based on cubic regularization in a two-dimensional subspace. Different strategies for the search direction are also discussed. The stepsize is computed by means of the weak Wolfe line search. Under classical assumptions, the algorithm is proved to be convergent. Intensive numerical experiments with 800 unconstrained optimization test functions, with the number of variables ranging from 1,000 to 10,000, show that the suggested algorithm is more efficient and more robust than the well-established conjugate gradient algorithms CG-DESCENT, CONMIN, and L-BFGS (m = 5). Comparisons of the suggested algorithm against CG-DESCENT on five applications from the MINPACK-2 collection, each with 40,000 variables, show that CUBIC is 3.35 times faster than CG-DESCENT.
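The weak Wolfe line search mentioned above accepts any stepsize satisfying a sufficient-decrease (Armijo) condition and a weak curvature condition. As a minimal illustration (a generic bisection-style sketch under standard textbook conditions, not the paper's actual implementation; the function names and constants `c1`, `c2` are assumptions):

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Return a stepsize alpha along descent direction d satisfying the
    weak Wolfe conditions:
      f(x + alpha*d) <= f(x) + c1*alpha*grad(x)@d   (sufficient decrease)
      grad(x + alpha*d)@d >= c2*grad(x)@d           (weak curvature)
    Hypothetical sketch; 0 < c1 < c2 < 1 are conventional defaults."""
    lo, hi = 0.0, np.inf   # bracket for the acceptable stepsize
    alpha = 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:
            hi = alpha                         # decrease condition failed: shrink
        elif grad(x + alpha * d) @ d < c2 * g0d:
            lo = alpha                         # curvature condition failed: grow
        else:
            return alpha                       # both conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha  # best effort after max_iter bisections

# Usage on a simple quadratic f(x) = 0.5*||x||^2 with steepest-descent direction:
alpha = weak_wolfe(lambda x: 0.5 * float(x @ x), lambda x: x,
                   np.array([1.0]), np.array([-1.0]))
```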