A three-term conjugate gradient descent method with some applications

Ahmad Alhawarat, Zabidin Salleh, Hanan Alolaiyan, Hamid El Hor, Shahrina Ismail

Journal of Inequalities and Applications (Q1, Mathematics), published 2024-05-28
DOI: 10.1186/s13660-024-03142-0
Citations: 0
Abstract
A stationary point of an optimization problem can be obtained via conjugate gradient (CG) methods without computing second derivatives. Researchers have used such methods to solve applications in various fields, such as neural networks and image restoration. In this study, we construct a three-term CG method that satisfies the descent property and for which a convergence analysis is established. The second term employs a Hestenes-Stiefel CG formula, restricted so that it remains positive. The third term consists of the negative gradient, used as a search direction, multiplied by an accelerating expression. We also provide numerical results collected using a strong Wolfe line search with different sigma values over 166 optimization functions from the CUTEr library. The results show that the proposed approach is far more efficient than other prevalent CG methods in terms of central processing unit (CPU) time, number of iterations, number of function evaluations, and number of gradient evaluations. Moreover, we apply the proposed three-term search direction to image restoration and compare the results with well-known CG methods with respect to the number of iterations, CPU time, and root-mean-square error (RMSE). Finally, we present three applications in regression analysis, image restoration, and electrical engineering.
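The abstract does not give the paper's exact coefficient formulas, so the following is only a minimal Python sketch of the general shape it describes: a three-term CG iteration whose second term uses a Hestenes-Stiefel beta truncated at zero to keep it positive, whose third term is the negative gradient scaled by an accelerating coefficient, and which is paired with a strong Wolfe line search. The `theta` expression and the simple backtracking scheme are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of a three-term CG iteration with a strong Wolfe line search.
# The HS+ beta matches the abstract's description; theta is an assumed,
# illustrative accelerating coefficient, not the paper's formula.
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, alpha0=1.0, max_iter=30):
    """Backtracking step-size search checking the strong Wolfe conditions.
    (A production implementation would use a bracketing/zoom procedure.)"""
    g0, phi0 = grad(x), f(x)
    dphi0 = g0 @ d  # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        armijo = f(x + alpha * d) <= phi0 + c1 * alpha * dphi0
        curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(dphi0)
        if armijo and curvature:
            return alpha
        alpha *= 0.5
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = x0.astype(float)
    g = grad(x)
    d = -g  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference used by the HS formula
        dy = d @ y
        # Hestenes-Stiefel beta, truncated at zero so the coefficient
        # stays non-negative (the "restriction to be positive").
        beta = max((g_new @ y) / dy, 0.0) if dy != 0 else 0.0
        # Assumed accelerating coefficient for the extra -g term.
        theta = (g_new @ d) / dy if dy != 0 else 0.0
        d = -g_new + beta * d - theta * g_new  # three-term search direction
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic: minimize 0.5 x'Ax - b'x.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(three_term_cg(f, grad, np.zeros(2)), np.linalg.solve(A, b))
```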
Journal overview
The aim of this journal is to provide a multi-disciplinary forum for discussion of mathematics and its applications in which the essentiality of inequalities is highlighted. The journal accepts high-quality articles containing original research results and survey articles of exceptional merit. Subject matter should be strongly related to inequalities, including, but not restricted to, inequalities in analysis, approximation theory, combinatorics, economics, geometry, mechanics, optimization, and stochastic analysis and its applications.