{"title":"基于规范最小化的凸向量优化算法的收敛性分析","authors":"Çağin Ararat, Firdevs Ulus, Muhammad Umer","doi":"10.1137/23m1574580","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 3, Page 2700-2728, September 2024. <br/> Abstract. In this work, we propose an outer approximation algorithm for solving bounded convex vector optimization problems (CVOPs). The scalarization model solved iteratively within the algorithm is a modification of the norm-minimizing scalarization proposed in [Ç. Ararat, F. Ulus, and M. Umer, J. Optim. Theory Appl., 194 (2022), pp. 681–712]. For a predetermined tolerance [math], we prove that the algorithm terminates after finitely many iterations, and it returns a polyhedral outer approximation to the upper image of the CVOP such that the Hausdorff distance between the two is less than [math]. We show that for an arbitrary norm used in the scalarization models, the approximation error after [math] iterations decreases by the order of [math], where [math] is the dimension of the objective space. An improved convergence rate of [math] is proved for the special case of using the Euclidean norm.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":"64 1","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2024-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Convergence Analysis of a Norm Minimization-Based Convex Vector Optimization Algorithm\",\"authors\":\"Çağin Ararat, Firdevs Ulus, Muhammad Umer\",\"doi\":\"10.1137/23m1574580\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SIAM Journal on Optimization, Volume 34, Issue 3, Page 2700-2728, September 2024. <br/> Abstract. In this work, we propose an outer approximation algorithm for solving bounded convex vector optimization problems (CVOPs). The scalarization model solved iteratively within the algorithm is a modification of the norm-minimizing scalarization proposed in [Ç. Ararat, F. Ulus, and M. Umer, J. Optim. Theory Appl., 194 (2022), pp. 681–712]. For a predetermined tolerance [math], we prove that the algorithm terminates after finitely many iterations, and it returns a polyhedral outer approximation to the upper image of the CVOP such that the Hausdorff distance between the two is less than [math]. We show that for an arbitrary norm used in the scalarization models, the approximation error after [math] iterations decreases by the order of [math], where [math] is the dimension of the objective space. 
An improved convergence rate of [math] is proved for the special case of using the Euclidean norm.\",\"PeriodicalId\":49529,\"journal\":{\"name\":\"SIAM Journal on Optimization\",\"volume\":\"64 1\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-07-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIAM Journal on Optimization\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1137/23m1574580\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/23m1574580","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
In this work, we propose an outer approximation algorithm for solving bounded convex vector optimization problems (CVOPs). The scalarization model solved iteratively within the algorithm is a modification of the norm-minimizing scalarization proposed in [Ç. Ararat, F. Ulus, and M. Umer, J. Optim. Theory Appl., 194 (2022), pp. 681–712]. For a predetermined tolerance [math], we prove that the algorithm terminates after finitely many iterations and returns a polyhedral outer approximation to the upper image of the CVOP such that the Hausdorff distance between the two is less than [math]. We show that for an arbitrary norm used in the scalarization models, the approximation error after [math] iterations decreases on the order of [math], where [math] is the dimension of the objective space. An improved convergence rate of [math] is proved for the special case of the Euclidean norm.
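To make the scalarization step concrete, the following is a minimal sketch in Python with CVXPY of a norm-minimizing scalarization of the kind described in the abstract; it is not the paper's exact (modified) model. For a toy biobjective problem and a hypothetical outer point v, it computes the smallest offset z, in a chosen norm, that moves v into the upper image. The toy objectives, the feasible set, the point v, and the use of the nonnegative orthant as the ordering cone are all illustrative assumptions.

# Illustrative sketch of a norm-minimizing scalarization subproblem (assumed
# form; not the paper's exact modified model). All problem data are made up.
import cvxpy as cp
import numpy as np

q = 2                       # dimension of the objective space
v = np.array([0.0, 0.0])    # hypothetical vertex of the current outer approximation

x = cp.Variable(2)          # decision variable of the toy CVOP
z = cp.Variable(q)          # offset that moves v into the upper image

# Two toy convex objectives f_1(x), f_2(x) and a simple box feasible set.
f = cp.hstack([cp.sum_squares(x - np.array([1.0, 0.0])),
               cp.sum_squares(x - np.array([0.0, 1.0]))])
constraints = [f <= v + z,   # f(x) <=_C v + z with C = nonnegative orthant
               x >= -2.0, x <= 2.0]

# Euclidean norm here; the paper's analysis allows an arbitrary norm.
prob = cp.Problem(cp.Minimize(cp.norm(z, 2)), constraints)
prob.solve()

print("distance from v to the upper image:", prob.value)
print("closest point v + z* of the upper image:", v + z.value)
# The dual multipliers of the first constraint can be used (up to scaling) as
# the normal of a supporting halfspace of the upper image at v + z*, i.e. the
# cut an outer approximation algorithm would add before continuing.
print("dual weights of the scalarization constraint:", constraints[0].dual_value)

In an outer approximation scheme of the kind described in the abstract, a subproblem like this would be solved at the vertices of the current polyhedral outer approximation; once the computed distance at every vertex falls below the tolerance [math], the algorithm can stop with a Hausdorff-distance guarantee of the type stated above.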
About the journal:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.