{"title":"自洽最小化的多层次方法","authors":"Nick Tsipinakis, Panos Parpas","doi":"10.1007/s10957-024-02509-z","DOIUrl":null,"url":null,"abstract":"<p>The analysis of second-order optimization methods based either on sub-sampling, randomization or sketching has two serious shortcomings compared to the conventional Newton method. The first shortcoming is that the analysis of the iterates has only been shown to be scale-invariant only under specific assumptions on the problem structure. The second shortfall is that the fast convergence rates of second-order methods have only been established by making assumptions regarding the input data. In this paper, we propose a randomized Newton method for self-concordant functions to address both shortfalls. We propose a Self-concordant Iterative-minimization-Galerkin-based Multilevel Algorithm (SIGMA) and establish its super-linear convergence rate using the theory of self-concordant functions. Our analysis is based on the connections between multigrid optimization methods, and the role of coarse-grained or reduced-order models in the computation of search directions. We take advantage of the insights from the analysis to significantly improve the performance of second-order methods in machine learning applications. We report encouraging initial experiments that suggest SIGMA outperforms other state-of-the-art sub-sampled/sketched Newton methods for both medium and large-scale problems.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":"87 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Multilevel Method for Self-Concordant Minimization\",\"authors\":\"Nick Tsipinakis, Panos Parpas\",\"doi\":\"10.1007/s10957-024-02509-z\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The analysis of second-order optimization methods based either on sub-sampling, randomization or sketching has two serious shortcomings compared to the conventional Newton method. The first shortcoming is that the analysis of the iterates has only been shown to be scale-invariant only under specific assumptions on the problem structure. The second shortfall is that the fast convergence rates of second-order methods have only been established by making assumptions regarding the input data. In this paper, we propose a randomized Newton method for self-concordant functions to address both shortfalls. We propose a Self-concordant Iterative-minimization-Galerkin-based Multilevel Algorithm (SIGMA) and establish its super-linear convergence rate using the theory of self-concordant functions. Our analysis is based on the connections between multigrid optimization methods, and the role of coarse-grained or reduced-order models in the computation of search directions. We take advantage of the insights from the analysis to significantly improve the performance of second-order methods in machine learning applications. 
We report encouraging initial experiments that suggest SIGMA outperforms other state-of-the-art sub-sampled/sketched Newton methods for both medium and large-scale problems.</p>\",\"PeriodicalId\":50100,\"journal\":{\"name\":\"Journal of Optimization Theory and Applications\",\"volume\":\"87 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-09-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Optimization Theory and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10957-024-02509-z\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Optimization Theory and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10957-024-02509-z","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Abstract:
The analysis of second-order optimization methods based on sub-sampling, randomization, or sketching has two serious shortcomings compared to the conventional Newton method. The first shortcoming is that the iterates have been shown to be scale-invariant only under specific assumptions on the problem structure. The second is that the fast convergence rates of second-order methods have only been established by making assumptions regarding the input data. In this paper, we propose a randomized Newton method for self-concordant functions that addresses both shortcomings. We propose a Self-concordant Iterative-minimization-Galerkin-based Multilevel Algorithm (SIGMA) and establish its super-linear convergence rate using the theory of self-concordant functions. Our analysis is based on the connections between multigrid optimization methods and the role of coarse-grained or reduced-order models in the computation of search directions. We take advantage of the insights from the analysis to significantly improve the performance of second-order methods in machine learning applications. We report encouraging initial experiments that suggest SIGMA outperforms other state-of-the-art sub-sampled/sketched Newton methods on both medium- and large-scale problems.
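To make the multilevel idea in the abstract concrete, the sketch below shows one Galerkin-based coarse Newton step: the gradient and Hessian are projected onto a low-dimensional subspace, the Newton system is solved there, and the resulting direction is prolonged back to the fine level with a damped step size of the kind used in self-concordant analysis. This is an illustrative assumption-laden sketch, not the authors' SIGMA implementation; the random prolongation operator P, the coarse dimension, and the damping rule are all choices made here for demonstration.

```python
# Minimal sketch of a Galerkin-based multilevel Newton step (illustrative only,
# not the authors' exact SIGMA algorithm). Assumes f is self-concordant so that
# the damped step t = 1/(1 + Newton decrement) is well motivated.
import numpy as np

def coarse_newton_step(x, grad, hess, coarse_dim, rng=np.random.default_rng(0)):
    """One damped Newton step using a Galerkin (reduced-order) model of size coarse_dim."""
    n = x.size
    g = grad(x)                      # fine-level gradient, shape (n,)
    H = hess(x)                      # fine-level Hessian, shape (n, n)

    # Random prolongation operator P: maps coarse variables back to the fine space.
    P = rng.standard_normal((n, coarse_dim)) / np.sqrt(coarse_dim)

    # Galerkin projection: coarse-grained gradient and Hessian.
    g_c = P.T @ g
    H_c = P.T @ H @ P

    # Solve the small coarse Newton system, then prolong the direction.
    d_c = np.linalg.solve(H_c, -g_c)
    d = P @ d_c

    # Damped step size from the coarse Newton decrement.
    decrement = np.sqrt(max(d_c @ H_c @ d_c, 0.0))
    t = 1.0 / (1.0 + decrement)
    return x + t * d
```

Iterating such a step on a strictly convex, self-concordant objective (for example, regularized logistic regression) replaces the expensive n-by-n Newton solve with an m-by-m solve in the coarse space, which is the role the abstract attributes to coarse-grained or reduced-order models in the computation of search directions.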
Journal description:
The Journal of Optimization Theory and Applications is devoted to the publication of carefully selected regular papers, invited papers, survey papers, technical notes, book notices, and forums that cover mathematical optimization techniques and their applications to science and engineering. Typical theoretical areas include linear, nonlinear, mathematical, and dynamic programming. Among the areas of application covered are mathematical economics, mathematical physics and biology, and aerospace, chemical, civil, electrical, and mechanical engineering.