{"title":"变分不等式的近端外推梯度方法。","authors":"Yu Malitsky","doi":"10.1080/10556788.2017.1300899","DOIUrl":null,"url":null,"abstract":"<p><p>The paper concerns with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account a local information of the operator. Also, the methods do not require Lipschitz continuity of the operator and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector-vector operations. For all our methods, we establish the ergodic convergence rate. In addition, we modify one of the proposed methods for the case of a composite minimization. Preliminary results from numerical experiments are quite promising.</p>","PeriodicalId":54673,"journal":{"name":"Optimization Methods & Software","volume":"33 1","pages":"140-164"},"PeriodicalIF":1.4000,"publicationDate":"2017-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/10556788.2017.1300899","citationCount":"53","resultStr":"{\"title\":\"Proximal extrapolated gradient methods for variational inequalities.\",\"authors\":\"Yu Malitsky\",\"doi\":\"10.1080/10556788.2017.1300899\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The paper concerns with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account a local information of the operator. Also, the methods do not require Lipschitz continuity of the operator and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector-vector operations. For all our methods, we establish the ergodic convergence rate. In addition, we modify one of the proposed methods for the case of a composite minimization. Preliminary results from numerical experiments are quite promising.</p>\",\"PeriodicalId\":54673,\"journal\":{\"name\":\"Optimization Methods & Software\",\"volume\":\"33 1\",\"pages\":\"140-164\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2017-03-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1080/10556788.2017.1300899\",\"citationCount\":\"53\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimization Methods & Software\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1080/10556788.2017.1300899\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2018/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods & Software","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/10556788.2017.1300899","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2018/1/1 0:00:00","PubModel":"eCollection","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Proximal extrapolated gradient methods for variational inequalities.
The paper concerns novel first-order methods for monotone variational inequalities. The methods use a very simple linesearch procedure that exploits local information about the operator: they do not require Lipschitz continuity of the operator, and the linesearch uses only operator values. Moreover, when the operator is affine, our linesearch becomes particularly cheap, requiring only vector-vector operations. For all our methods we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.
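To make the scheme concrete, below is a minimal Python sketch of an extrapolated gradient step with a backtracking linesearch for a monotone variational inequality over a box. It illustrates the general idea described in the abstract, not the paper's exact algorithm: the box feasible set, the helper names (proj_box, extrapolated_gradient_vi) and the linesearch constants beta and delta are assumptions made for this example.

```python
import numpy as np

def proj_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n; stands in for the
    # projection/prox step of the method (here C is assumed to be a box).
    return np.clip(x, lo, hi)

def extrapolated_gradient_vi(F, x0, lo=-1.0, hi=1.0, lam0=1.0,
                             beta=0.7, delta=0.4, max_iter=2000, tol=1e-9):
    # Illustrative solver for the VI: find x* in C such that
    # <F(x*), x - x*> >= 0 for all x in C, with F monotone and locally
    # Lipschitz. The step lam is chosen by backtracking until a local
    # Lipschitz-type test holds, so the linesearch uses only values of F
    # and no global Lipschitz constant is needed.
    x_prev, x, lam_prev = x0.copy(), x0.copy(), lam0
    for _ in range(max_iter):
        lam = lam_prev
        while True:
            # Extrapolated point from the two most recent iterates;
            # for lam == lam_prev this is the reflection 2*x - x_prev.
            y = x + (lam / lam_prev) * (x - x_prev)
            x_new = proj_box(x - lam * F(y), lo, hi)
            # Accept lam if F varies slowly enough between y and x_new.
            lhs = lam * np.linalg.norm(F(x_new) - F(y))
            if lhs <= delta * np.linalg.norm(x_new - y):
                break
            lam *= beta  # shrink the step and retry
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x_prev, x, lam_prev = x, x_new, lam
    return x

# Toy affine test: F(x) = A x + 0.1 x with A skew-symmetric is strongly
# monotone and affine; the unique VI solution over a box containing the
# origin is x* = 0.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M - M.T
F = lambda x: A @ x + 0.1 * x
sol = extrapolated_gradient_vi(F, rng.standard_normal(5))
print(np.linalg.norm(sol))  # should be close to 0
```

Note that the acceptance test uses only operator values, matching the abstract's claim; for an affine operator, F(y) could even be updated from previous evaluations by vector-vector operations instead of being recomputed, which is the simplification the abstract mentions.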
Journal description:
Optimization Methods and Software publishes refereed papers on the latest developments in the theory and realization of optimization methods, with particular emphasis on the interface between software development and algorithm design.
Topics include:
Theory, implementation and performance evaluation of algorithms and computer codes for linear, nonlinear, discrete, stochastic optimization and optimal control. This includes, in particular, conic, semidefinite, mixed-integer, network, non-smooth, multi-objective and global optimization by deterministic or non-deterministic algorithms.
Algorithms and software for complementarity, variational inequality and equilibrium problems, and also for solving inverse problems and systems of nonlinear equations, and for the numerical study of parameter-dependent operators.
Various aspects of efficient and user-friendly implementations: e.g. automatic differentiation, massively parallel optimization, distributed computing, online algorithms, error sensitivity and validity analysis, problem scaling, stopping criteria, and symbolic-numeric interfaces.
Theoretical studies with clear potential for applications, and successful applications of specially adapted optimization methods and software to fields such as engineering, machine learning, data mining, economics, finance, biology, or medicine. Such submissions should not consist solely of the straightforward use of standard optimization techniques.