{"title":"改进的二阶凸最小化方法的全局性能保证","authors":"Pavel Dvurechensky, Yurii Nesterov","doi":"10.1007/s10208-025-09726-6","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we attempt to compare two distinct branches of research on second-order optimization methods. The first one studies self-concordant functions and barriers, the main assumption being that the third derivative of the objective is bounded by the second derivative. The second branch studies cubic regularized Newton methods (CRNMs) with the main assumption that the second derivative is Lipschitz continuous. We develop a new theoretical analysis for a path-following scheme (PFS) for general self-concordant functions, as opposed to the classical path-following scheme developed for self-concordant barriers. We show that the complexity bound for this scheme is better than that of the Damped Newton Method (DNM) and show that our method has global superlinear convergence. We propose also a new predictor-corrector path-following scheme (PCPFS) that leads to further improvement of constant factors in the complexity guarantees for minimizing general self-concordant functions. We also apply path-following schemes to different classes of constrained optimization problems and obtain the resulting complexity bounds. Finally, we analyze an important subclass of general self-concordant functions, namely a class of strongly convex functions with Lipschitz continuous second derivative, and show that for this subclass CRNMs give even better complexity bounds.</p>","PeriodicalId":55151,"journal":{"name":"Foundations of Computational Mathematics","volume":"14 1","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2025-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Improved global performance guarantees of second-order methods in convex minimization\",\"authors\":\"Pavel Dvurechensky, Yurii Nesterov\",\"doi\":\"10.1007/s10208-025-09726-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In this paper, we attempt to compare two distinct branches of research on second-order optimization methods. The first one studies self-concordant functions and barriers, the main assumption being that the third derivative of the objective is bounded by the second derivative. The second branch studies cubic regularized Newton methods (CRNMs) with the main assumption that the second derivative is Lipschitz continuous. We develop a new theoretical analysis for a path-following scheme (PFS) for general self-concordant functions, as opposed to the classical path-following scheme developed for self-concordant barriers. We show that the complexity bound for this scheme is better than that of the Damped Newton Method (DNM) and show that our method has global superlinear convergence. We propose also a new predictor-corrector path-following scheme (PCPFS) that leads to further improvement of constant factors in the complexity guarantees for minimizing general self-concordant functions. We also apply path-following schemes to different classes of constrained optimization problems and obtain the resulting complexity bounds. 
Finally, we analyze an important subclass of general self-concordant functions, namely a class of strongly convex functions with Lipschitz continuous second derivative, and show that for this subclass CRNMs give even better complexity bounds.</p>\",\"PeriodicalId\":55151,\"journal\":{\"name\":\"Foundations of Computational Mathematics\",\"volume\":\"14 1\",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2025-08-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Foundations of Computational Mathematics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10208-025-09726-6\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations of Computational Mathematics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10208-025-09726-6","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Improved global performance guarantees of second-order methods in convex minimization
In this paper, we attempt to compare two distinct branches of research on second-order optimization methods. The first one studies self-concordant functions and barriers, the main assumption being that the third derivative of the objective is bounded by the second derivative. The second branch studies cubic regularized Newton methods (CRNMs) with the main assumption that the second derivative is Lipschitz continuous. We develop a new theoretical analysis for a path-following scheme (PFS) for general self-concordant functions, as opposed to the classical path-following scheme developed for self-concordant barriers. We show that the complexity bound for this scheme is better than that of the Damped Newton Method (DNM) and that our method has global superlinear convergence. We also propose a new predictor-corrector path-following scheme (PCPFS) that further improves the constant factors in the complexity guarantees for minimizing general self-concordant functions. We also apply path-following schemes to different classes of constrained optimization problems and obtain the resulting complexity bounds. Finally, we analyze an important subclass of general self-concordant functions, namely a class of strongly convex functions with Lipschitz continuous second derivative, and show that for this subclass CRNMs give even better complexity bounds.
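For readers unfamiliar with the two baselines the abstract contrasts, the following is a minimal Python sketch of the standard damped Newton step (driven by the Newton decrement) and of a basic cubic regularized Newton step. It is not the paper's new PFS or PCPFS schemes; the function names, the crude gradient-descent inner solver for the cubic subproblem, and the small quadratic test problem are illustrative assumptions only.

```python
# Illustrative sketch of the two baseline second-order steps mentioned in the
# abstract (not the paper's PFS/PCPFS schemes). Names and solver choices are
# hypothetical, for exposition only.
import numpy as np


def damped_newton_step(grad, hess, x):
    """One damped Newton step: x+ = x - H(x)^{-1} g(x) / (1 + lambda(x)),
    where lambda(x) = sqrt(g(x)^T H(x)^{-1} g(x)) is the Newton decrement."""
    g = grad(x)
    H = hess(x)
    direction = np.linalg.solve(H, g)
    lam = np.sqrt(g @ direction)  # Newton decrement lambda(x)
    return x - direction / (1.0 + lam)


def cubic_newton_step(grad, hess, x, M, n_inner=200, lr=1e-2):
    """One cubic regularized Newton step: approximately minimize the model
    <g, h> + 0.5 h^T H h + (M/6) ||h||^3 in h (convex when f is convex),
    here with plain gradient descent as a deliberately simple inner solver."""
    g = grad(x)
    H = hess(x)
    h = np.zeros_like(x)
    for _ in range(n_inner):
        model_grad = g + H @ h + 0.5 * M * np.linalg.norm(h) * h
        h = h - lr * model_grad
    return x + h


# Tiny usage on the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x = np.zeros(2)
for _ in range(10):
    x = damped_newton_step(grad, hess, x)
print("damped Newton iterate:", x)
```

The damped step is the classical globally convergent scheme for self-concordant functions against which the paper's path-following complexity bounds are compared, while the cubic step is the Nesterov-Polyak style update analyzed under a Lipschitz continuous second derivative.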
Journal introduction:
Foundations of Computational Mathematics (FoCM) will publish research and survey papers of the highest quality which further the understanding of the connections between mathematics and computation. The journal aims to promote the exploration of all fundamental issues underlying the creative tension among mathematics, computer science and application areas unencumbered by any external criteria such as the pressure for applications. The journal will thus serve an increasingly important and applicable area of mathematics. The journal hopes to further the understanding of the deep relationships between mathematical theory: analysis, topology, geometry and algebra, and the computational processes as they are evolving in tandem with the modern computer.
With its distinguished editorial board selecting papers of the highest quality and interest from the international community, FoCM hopes to influence both mathematics and computation. Relevance to applications will not constitute a requirement for the publication of articles.
The journal does not accept code for review; however, authors who have code or data related to the submission should include a weblink to the repository where the data/code is stored.