{"title":"一种基于梯度抽样技术的基于精确和非精确梯度的非光滑优化束信任方法","authors":"Morteza Maleknia, Majid Soleimani-damaneh","doi":"10.1093/imanum/draf087","DOIUrl":null,"url":null,"abstract":"Based on the proximal bundle and gradient sampling (GS) methods, we develop a robust algorithm for minimizing the locally Lipschitz function $f:\\mathbb{R}^{n}\\to \\mathbb{R}$. As an interesting feature of the proposed method, thanks to the GS technique, we sample a set of differentiable auxiliary points from the vicinity of the current point to construct an initial piecewise linear model for the objective function. If necessary, inspired by bundle methods, we iteratively enrich the set of sampled points by using a single nonredundant auxiliary point suggested by a modified variant of Mifflin’s line search. However, we may terminate the enrichment process without achieving a descent step, which is different from classic bundle methods. Indeed, the proposed enrichment process only accepts those auxiliary points having a small gradient locality measure, which significantly improves the efficiency of the method in practice. In theory, our method keeps iterations where the objective function is differentiable, and consequently, it works only with the gradient vectors of the objective function. In contrast with existing GS methods, the radius of the sampling region is not monotone. More precisely, by proposing a nonmonotone proximity parameter based on the radius of the sampling region, we add some valuable features of the trust region philosophy to our algorithm. The convergence analysis of the proposed method is comprehensively studied using exact and inexact gradients. 
By means of various academic and semi-academic test problems, we demonstrate the reliability and efficiency of the proposed method in practice.1","PeriodicalId":56295,"journal":{"name":"IMA Journal of Numerical Analysis","volume":"5 1","pages":""},"PeriodicalIF":2.4000,"publicationDate":"2025-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A bundle-trust method via gradient sampling technique for nonsmooth optimization using exact and inexact gradients\",\"authors\":\"Morteza Maleknia, Majid Soleimani-damaneh\",\"doi\":\"10.1093/imanum/draf087\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Based on the proximal bundle and gradient sampling (GS) methods, we develop a robust algorithm for minimizing the locally Lipschitz function $f:\\\\mathbb{R}^{n}\\\\to \\\\mathbb{R}$. As an interesting feature of the proposed method, thanks to the GS technique, we sample a set of differentiable auxiliary points from the vicinity of the current point to construct an initial piecewise linear model for the objective function. If necessary, inspired by bundle methods, we iteratively enrich the set of sampled points by using a single nonredundant auxiliary point suggested by a modified variant of Mifflin’s line search. However, we may terminate the enrichment process without achieving a descent step, which is different from classic bundle methods. Indeed, the proposed enrichment process only accepts those auxiliary points having a small gradient locality measure, which significantly improves the efficiency of the method in practice. In theory, our method keeps iterations where the objective function is differentiable, and consequently, it works only with the gradient vectors of the objective function. In contrast with existing GS methods, the radius of the sampling region is not monotone. 
More precisely, by proposing a nonmonotone proximity parameter based on the radius of the sampling region, we add some valuable features of the trust region philosophy to our algorithm. The convergence analysis of the proposed method is comprehensively studied using exact and inexact gradients. By means of various academic and semi-academic test problems, we demonstrate the reliability and efficiency of the proposed method in practice.1\",\"PeriodicalId\":56295,\"journal\":{\"name\":\"IMA Journal of Numerical Analysis\",\"volume\":\"5 1\",\"pages\":\"\"},\"PeriodicalIF\":2.4000,\"publicationDate\":\"2025-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IMA Journal of Numerical Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1093/imanum/draf087\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IMA Journal of Numerical Analysis","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1093/imanum/draf087","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
A bundle-trust method via gradient sampling technique for nonsmooth optimization using exact and inexact gradients
Based on the proximal bundle and gradient sampling (GS) methods, we develop a robust algorithm for minimizing a locally Lipschitz function $f:\mathbb{R}^{n}\to \mathbb{R}$. A distinctive feature of the proposed method is that, thanks to the GS technique, we sample a set of differentiable auxiliary points from the vicinity of the current point to construct an initial piecewise linear model of the objective function. If necessary, inspired by bundle methods, we iteratively enrich the set of sampled points with a single nonredundant auxiliary point suggested by a modified variant of Mifflin's line search. Unlike classic bundle methods, however, the enrichment process may terminate without achieving a descent step. Indeed, the proposed enrichment process accepts only those auxiliary points with a small gradient locality measure, which significantly improves the efficiency of the method in practice. In theory, our method keeps its iterates at points where the objective function is differentiable, and consequently it works only with gradient vectors of the objective function. In contrast with existing GS methods, the radius of the sampling region is not monotone. More precisely, by proposing a nonmonotone proximity parameter based on the radius of the sampling region, we add some valuable features of the trust-region philosophy to our algorithm. The convergence of the proposed method is analyzed comprehensively using both exact and inexact gradients. By means of various academic and semi-academic test problems, we demonstrate the reliability and efficiency of the proposed method in practice.
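The sampling step described in the abstract can be illustrated with a minimal sketch of a classical gradient sampling iteration (in the spirit of Burke, Lewis and Overton), not the authors' bundle-trust algorithm: gradients are sampled near the current point, an approximate steepest-descent direction is obtained from the minimum-norm element of their convex hull, and the sampling radius shrinks when no descent is found. The test function `f`, the Frank-Wolfe solve for the minimum-norm element, and all parameter values are assumptions for illustration only.

```python
import numpy as np

# Illustrative nonsmooth objective (an assumption, not from the paper's test set):
# convex, piecewise linear, differentiable everywhere off the coordinate axes.
def f(x):
    return abs(x[0]) + 2.0 * abs(x[1])

def grad_f(x):
    # Gradient of f, valid wherever no coordinate is exactly zero
    # (random samples land on the axes with probability zero).
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def min_norm_element(G, iters=500):
    # Frank-Wolfe with exact line search over the unit simplex: approximates
    # the minimum-norm point of the convex hull of the rows of G.
    m = G.shape[0]
    w = np.full(m, 1.0 / m)
    for _ in range(iters):
        d = G.T @ w                    # current convex combination of gradients
        j = np.argmin(G @ d)           # best vertex (linear minimization step)
        v = G[j] - d
        denom = v @ v
        if denom < 1e-16:
            break
        gamma = np.clip(-(d @ v) / denom, 0.0, 1.0)
        w = (1.0 - gamma) * w
        w[j] += gamma
    return G.T @ w

def gradient_sampling(x0, eps=0.5, m=10, tol=1e-3, eps_min=1e-6,
                      max_iter=300, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Sample m auxiliary points from a box of radius eps around x.
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.vstack([grad_f(x)] + [grad_f(p) for p in pts])
        g = min_norm_element(G)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            if eps <= eps_min:
                break                  # approximate stationarity certificate
            eps *= 0.5                 # shrink the sampling radius
            continue
        d = -g / gnorm
        # Backtracking Armijo-type line search along the normalized direction.
        t, ok = 1.0, False
        while t > 1e-10:
            if f(x + t * d) <= f(x) - 1e-4 * t * gnorm:
                ok = True
                break
            t *= 0.5
        if ok:
            x = x + t * d
        else:
            eps *= 0.5                 # null step: tighten the sampling region
    return x

x_star = gradient_sampling(np.array([3.0, -2.0]))
print(x_star, f(x_star))
```

The paper's method differs in several ways sketched only loosely here: it builds a piecewise linear (bundle) model from the samples, enriches it via a modified Mifflin line search, and drives a nonmonotone proximity parameter by the sampling radius, whereas this sketch shrinks the radius monotonically on null steps.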
Journal overview:
The IMA Journal of Numerical Analysis (IMAJNA) publishes original contributions in all fields of numerical analysis; articles are accepted that treat the theory, development, or use of practical algorithms, as well as the interactions between these aspects. Occasional survey articles are also published.