{"title":"变分不等式组合松弛方法的收敛性","authors":"I. Konnov","doi":"10.1080/10556789808805687","DOIUrl":null,"url":null,"abstract":"A general approach to constructing iterative methods that solve variational inequaliti under mild assumptions is proposed. It is based on combining and modifying ide contained in various relaxation methods. The conditions under which the proposed metho attain linear convergence or terminate with a solution are also given","PeriodicalId":54673,"journal":{"name":"Optimization Methods & Software","volume":"43 1","pages":"77-92"},"PeriodicalIF":1.4000,"publicationDate":"1998-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"On the convergence of combined relaxation methods for variational inequalties\",\"authors\":\"I. Konnov\",\"doi\":\"10.1080/10556789808805687\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A general approach to constructing iterative methods that solve variational inequaliti under mild assumptions is proposed. It is based on combining and modifying ide contained in various relaxation methods. The conditions under which the proposed metho attain linear convergence or terminate with a solution are also given\",\"PeriodicalId\":54673,\"journal\":{\"name\":\"Optimization Methods & Software\",\"volume\":\"43 1\",\"pages\":\"77-92\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"1998-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimization Methods & Software\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1080/10556789808805687\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods & Software","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/10556789808805687","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
On the convergence of combined relaxation methods for variational inequalties
A general approach to constructing iterative methods that solve variational inequalities under mild assumptions is proposed. It is based on combining and modifying ideas contained in various relaxation methods. The conditions under which the proposed methods attain linear convergence or terminate with a solution are also given.
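Since the abstract only names the class of methods, the following is a minimal, illustrative sketch (in Python/NumPy) of one combined-relaxation-style iteration for a variational inequality VI(F, K): an auxiliary extragradient-like step builds a separating hyperplane, and the main step projects a relaxed move along its normal back onto K. The box constraint set, the test mapping F, and all parameter values are assumptions for illustration, not the paper's exact scheme or constants.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (illustrative constraint set K)."""
    return np.clip(x, lo, hi)

def combined_relaxation(F, x0, lo, hi, tau=0.3, gamma=1.5, tol=1e-8, max_iter=1000):
    """Sketch of a combined-relaxation-style iteration for VI(F, K):
    find x* in K with <F(x*), x - x*> >= 0 for all x in K.

    Each iteration runs an auxiliary (extragradient-like) step to construct a
    separating hyperplane, then relaxes the iterate by projecting a step along
    that hyperplane's normal back onto K. Parameters are illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        # Auxiliary step: tentative point used to construct the separator.
        # tau should be small relative to the Lipschitz constant of F.
        y = project_box(x - tau * F(x), lo, hi)
        r = x - y
        if np.linalg.norm(r) <= tol:   # small fixed-point residual => approximate solution
            return x, k
        g = F(y)                       # normal of the separating hyperplane
        denom = np.dot(g, g)
        if denom == 0.0:
            return y, k                # F(y) = 0: y solves the VI
        sigma = np.dot(g, r) / denom   # step length toward the separating hyperplane
        # Main (relaxation) step with over-relaxation factor gamma in (0, 2).
        x = project_box(x - gamma * sigma * g, lo, hi)
    return x, max_iter

# Usage: affine monotone map F(x) = A x + b on the box [0, 1]^2 (toy example).
if __name__ == "__main__":
    A = np.array([[2.0, 1.0], [-1.0, 2.0]])
    b = np.array([-1.0, -1.0])
    F = lambda x: A @ x + b
    x_star, iters = combined_relaxation(F, x0=np.zeros(2), lo=0.0, hi=1.0)
    print(iters, x_star)
```

The over-relaxation factor gamma in (0, 2) and the auxiliary step length tau are the usual knobs in this family of methods; the paper's precise conditions for linear convergence or finite termination are not reproduced here.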
About the journal:
Optimization Methods and Software publishes refereed papers on the latest developments in the theory and realization of optimization methods, with particular emphasis on the interface between software development and algorithm design.
Topics include:
Theory, implementation and performance evaluation of algorithms and computer codes for linear, nonlinear, discrete, and stochastic optimization and optimal control. This includes, in particular, conic, semidefinite, mixed-integer, network, non-smooth, multi-objective and global optimization by deterministic or nondeterministic algorithms.
Algorithms and software for complementarity, variational inequalities and equilibrium problems, and also for solving inverse problems, systems of nonlinear equations and the numerical study of parameter-dependent operators.
Various aspects of efficient and user-friendly implementations: e.g. automatic differentiation, massively parallel optimization, distributed computing, online algorithms, error sensitivity and validity analysis, problem scaling, stopping criteria and symbolic-numeric interfaces.
Theoretical studies with clear potential for applications and successful applications of specially adapted optimization methods and software to fields like engineering, machine learning, data mining, economics, finance, biology, or medicine. These submissions should not consist solely of the straightforward use of standard optimization techniques.