{"title":"光滑凸函数的最优下界","authors":"Mihai I. Florea, Yurii E. Nesterov","doi":"10.1007/s10208-025-09712-y","DOIUrl":null,"url":null,"abstract":"<p>First order methods endowed with global convergence guarantees operate using global lower bounds on the objective. The tightening of the bounds leads to an increase in theoretical guarantees and in observed practical performance. In this work, we define a global lower bound for smooth objectives that is optimal with respect to the collected oracle information. Our bound can be readily employed by the Gradient Method with Memory to improve its performance. Further using the machinery underlying the optimal bounds, we introduce a modified version of the estimate sequence that we use to construct an Optimized Gradient Method with Memory possessing the best known convergence guarantees for its class of algorithms up to the proportionality constant. We additionally equip the method with an adaptive convergence guarantee adjustment procedure that is an effective replacement for line-search. Simulation results on synthetic but otherwise difficult smooth problems validate the theoretical properties of the bound and of the proposed methods.\n</p>","PeriodicalId":55151,"journal":{"name":"Foundations of Computational Mathematics","volume":"13 1","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2025-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Optimal Lower Bound for Smooth Convex Functions\",\"authors\":\"Mihai I. Florea, Yurii E. Nesterov\",\"doi\":\"10.1007/s10208-025-09712-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>First order methods endowed with global convergence guarantees operate using global lower bounds on the objective. The tightening of the bounds leads to an increase in theoretical guarantees and in observed practical performance. In this work, we define a global lower bound for smooth objectives that is optimal with respect to the collected oracle information. Our bound can be readily employed by the Gradient Method with Memory to improve its performance. Further using the machinery underlying the optimal bounds, we introduce a modified version of the estimate sequence that we use to construct an Optimized Gradient Method with Memory possessing the best known convergence guarantees for its class of algorithms up to the proportionality constant. We additionally equip the method with an adaptive convergence guarantee adjustment procedure that is an effective replacement for line-search. 
Simulation results on synthetic but otherwise difficult smooth problems validate the theoretical properties of the bound and of the proposed methods.\\n</p>\",\"PeriodicalId\":55151,\"journal\":{\"name\":\"Foundations of Computational Mathematics\",\"volume\":\"13 1\",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2025-06-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Foundations of Computational Mathematics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10208-025-09712-y\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations of Computational Mathematics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10208-025-09712-y","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
An Optimal Lower Bound for Smooth Convex Functions
Abstract:
First-order methods endowed with global convergence guarantees operate using global lower bounds on the objective. Tightening these bounds improves both the theoretical guarantees and the observed practical performance. In this work, we define a global lower bound for smooth objectives that is optimal with respect to the collected oracle information. Our bound can be readily employed by the Gradient Method with Memory to improve its performance. Further, using the machinery underlying the optimal bounds, we introduce a modified version of the estimate sequence, which we use to construct an Optimized Gradient Method with Memory possessing the best known convergence guarantees in its class of algorithms, up to a proportionality constant. We additionally equip the method with an adaptive adjustment procedure for the convergence guarantee that serves as an effective replacement for line search. Simulation results on synthetic but otherwise difficult smooth problems validate the theoretical properties of the bound and of the proposed methods.
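For context (this illustration is not taken from the paper; the model psi_k and the query points x_i are our own notation): for a convex objective f, every first-order oracle query (x_i, f(x_i), grad f(x_i)) supplies a supporting hyperplane, and the pointwise maximum of these hyperplanes is a valid global lower bound. This is the classical piecewise-linear model that a Gradient Method with Memory aggregates:

\[
  f(y) \;\ge\; f(x_i) + \langle \nabla f(x_i),\, y - x_i \rangle
  \quad \text{for all } y,
  \qquad \text{hence} \qquad
  \psi_k(y) \;=\; \max_{0 \le i \le k}
  \bigl\{\, f(x_i) + \langle \nabla f(x_i),\, y - x_i \rangle \,\bigr\}
  \;\le\; f(y).
\]

The bound proposed in the paper is, per the abstract, of this general kind but additionally exploits smoothness and is optimal with respect to the collected oracle information; its exact construction is not reproduced here.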
Journal introduction:
Foundations of Computational Mathematics (FoCM) publishes research and survey papers of the highest quality that further the understanding of the connections between mathematics and computation. The journal aims to promote the exploration of all fundamental issues underlying the creative tension among mathematics, computer science, and application areas, unencumbered by external criteria such as the pressure for applications. The journal thus serves an increasingly important and applicable area of mathematics. It hopes to deepen the understanding of the relationships between mathematical theory (analysis, topology, geometry, and algebra) and computational processes as they evolve in tandem with the modern computer.
With its distinguished editorial board selecting papers of the highest quality and interest from the international community, FoCM hopes to influence both mathematics and computation. Relevance to applications will not constitute a requirement for the publication of articles.
The journal does not accept code for review; however, authors who have code or data related to the submission should include a web link to the repository where the data and code are stored.