EURO Journal on Computational Optimization: Latest Publications

Decentralized personalized federated learning: Lower bounds and optimal algorithm for all personalization modes
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2022-01-01, DOI: 10.1016/j.ejco.2022.100041
Abdurakhmon Sadiev, Ekaterina Borodich, Aleksandr Beznosikov, Darina Dvinskikh, Saveliy Chezhegov, Rachael Tappenden, Martin Takáč, Alexander Gasnikov
Abstract: This paper considers the problem of decentralized, personalized federated learning. For centralized personalized federated learning, a penalty that measures the deviation of each local model from the average of the local models is often added to the objective function. In a decentralized setting, however, this penalty is expensive in terms of communication costs, so here a different penalty, built to respect the structure of the underlying computational network, is used instead. We present lower bounds on the communication and local computation costs for this problem formulation, and we also present provably optimal methods for decentralized personalized federated learning. Numerical experiments demonstrate the practical performance of our methods.
Citations: 4
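A minimal sketch, not taken from the paper, of how such personalization penalties are commonly written: in the centralized setting the penalty couples each local model to the average of all local models, while a network-aware variant (one possible choice, assuming an undirected communication graph G = (V, E)) couples only the models of neighboring nodes, so that evaluating it requires communication only along existing links.

```latex
% Centralized personalized FL: each device i keeps its own model x_i and
% the penalty pulls every x_i towards the average \bar{x}.
\[
\min_{x_1,\dots,x_n}\ \frac{1}{n}\sum_{i=1}^{n} f_i(x_i)
  + \frac{\lambda}{2n}\sum_{i=1}^{n}\|x_i - \bar{x}\|^2,
\qquad \bar{x} = \frac{1}{n}\sum_{j=1}^{n} x_j.
\]
% A network-aware alternative (illustrative): only models of neighboring
% nodes in the communication graph G=(V,E) are coupled, so the penalty is
% computable with communication along existing links only.
\[
\min_{x_1,\dots,x_n}\ \frac{1}{n}\sum_{i=1}^{n} f_i(x_i)
  + \frac{\lambda}{2}\sum_{(i,j)\in E}\|x_i - x_j\|^2.
\]
```

In formulations of this kind, the parameter λ controls the personalization mode: small λ leaves each device close to its purely local model, while large λ forces consensus and recovers ordinary, non-personalized federated learning.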
Twenty years of EUROPT, the EURO working group on Continuous Optimization
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2022-01-01, DOI: 10.1016/j.ejco.2022.100039
Sonia Cafieri, Tatiana Tchemisova, Gerhard-Wilhelm Weber
Abstract: EUROPT, the Continuous Optimization working group of EURO, celebrated its 20 years of activity in 2020. We trace the history of this working group by presenting the major milestones that have led to its current structure and organization, and its major trademarks, such as the annual EUROPT workshop and the EUROPT Fellow recognition.
Citations: 1
New neighborhoods and an iterated local search algorithm for the generalized traveling salesman problem
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2022-01-01, DOI: 10.1016/j.ejco.2022.100029
Jeanette Schmidt, Stefan Irnich
Abstract: For a given graph with a vertex set that is partitioned into clusters, the generalized traveling salesman problem (GTSP) is the problem of finding a cost-minimal cycle that contains exactly one vertex of every cluster. We introduce three new GTSP neighborhoods that allow the simultaneous permutation of the sequence of the clusters and the selection of vertices from each cluster. The three neighborhoods and some known neighborhoods from the literature are combined into an effective iterated local search (ILS) for the GTSP. The ILS performs a straightforward random neighborhood selection within the local search and applies an ordinary record-to-record ILS acceptance criterion. Computational experiments on four symmetric standard GTSP libraries show that, with some purposeful refinements, the ILS can compete with state-of-the-art GTSP algorithms.
Citations: 1
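The abstract names the two generic ingredients of the method: random neighborhood selection inside the local search and a record-to-record acceptance test. The sketch below is a minimal Python skeleton of such an ILS loop; the neighborhood functions, the perturbation, and the deviation parameter are illustrative placeholders, not the authors' implementation.

```python
import random

def iterated_local_search(initial_tour, cost, neighborhoods, perturb,
                          max_iters=1000, deviation=0.02):
    """Generic ILS skeleton with record-to-record acceptance.

    `neighborhoods` is a list of functions mapping a tour to a (possibly)
    improved tour; `perturb` randomly modifies a tour. These callables and
    the parameter values are placeholders, not the paper's choices.
    """
    current = initial_tour
    record = initial_tour                 # best tour found so far
    record_cost = cost(record)

    for _ in range(max_iters):
        candidate = perturb(current)
        # Local search: repeatedly apply a randomly chosen neighborhood
        # until no further improvement is found.
        improved = True
        while improved:
            improved = False
            move = random.choice(neighborhoods)
            new_tour = move(candidate)
            if cost(new_tour) < cost(candidate):
                candidate, improved = new_tour, True
        # Record-to-record acceptance: accept solutions that are not much
        # worse than the best (record) solution found so far.
        if cost(candidate) < record_cost * (1.0 + deviation):
            current = candidate
            if cost(candidate) < record_cost:
                record, record_cost = candidate, cost(candidate)
    return record
```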
Chance-constrained optimization under limited distributional information: A review of reformulations based on sampling and distributional robustness
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2022-01-01, DOI: 10.1016/j.ejco.2022.100030
Simge Küçükyavuz, Ruiwei Jiang
Abstract: Chance-constrained programming (CCP) is one of the most difficult classes of optimization problems and has attracted the attention of researchers since the 1950s. In this survey, we focus on cases where only limited information on the distribution is available, such as a sample from the distribution or the moments of the distribution. We first review recent developments in mixed-integer linear formulations of chance-constrained programs that arise from finite discrete distributions (or sample average approximation). We highlight successful reformulations and decomposition techniques that enable the solution of large-scale instances. We then review active research in distributionally robust CCP, a framework for addressing ambiguity in the distribution of the random data. The focal point of our review is on scalable formulations that can be readily implemented with state-of-the-art optimization software. Furthermore, we highlight the prevalence of CCPs with a review of applications across multiple domains.
Citations: 15
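As a concrete illustration of the sampled (SAA) reformulations the survey refers to, the textbook big-M mixed-integer model for a single linear chance constraint over a finite sample looks as follows; this is standard background, not a construction specific to the paper.

```latex
% Individual linear chance constraint at risk level \varepsilon:
\[
\mathbb{P}\bigl(a(\xi)^{\top}x \le b(\xi)\bigr) \ge 1 - \varepsilon.
\]
% Big-M mixed-integer reformulation over a finite sample \xi^1,\dots,\xi^N
% (the binary z_i = 1 allows scenario i to be violated):
\[
a(\xi^i)^{\top}x \le b(\xi^i) + M_i z_i \quad (i = 1,\dots,N),
\qquad \sum_{i=1}^{N} z_i \le \lfloor \varepsilon N \rfloor,
\qquad z \in \{0,1\}^N.
\]
```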
Direct nonlinear acceleration
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2022-01-01, DOI: 10.1016/j.ejco.2022.100047
Aritra Dutta, El Houcine Bergou, Yunming Xiao, Marco Canini, Peter Richtárik
Abstract: Optimization acceleration techniques such as momentum play a key role in state-of-the-art machine learning algorithms. Recently, generic vector sequence extrapolation techniques, such as the regularized nonlinear acceleration (RNA) of Scieur et al. [22], were proposed and shown to accelerate fixed-point iterations. In contrast to RNA, which computes extrapolation coefficients by (approximately) setting the gradient of the objective function to zero at the extrapolated point, we propose a more direct approach, which we call direct nonlinear acceleration (DNA). In DNA, we instead aim to minimize (an approximation of) the function value at the extrapolated point. We adopt a regularized approach with regularizers designed to prevent the model from entering a region in which the functional approximation is less precise. While the computational cost of DNA is comparable to that of RNA, our direct approach significantly outperforms RNA on both synthetic and real-world datasets. While the focus of this paper is on convex problems, we obtain very encouraging results in accelerating the training of neural networks.
Citations: 1
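The abstract contrasts DNA with the RNA of Scieur et al., which serves as the baseline. The sketch below shows the generic regularized vector-sequence extrapolation step in that RNA style (a common formulation, not code from this paper); per the abstract, DNA keeps the extrapolation form but selects the coefficients by minimizing an approximation of the function value instead.

```python
import numpy as np

def rna_extrapolate(iterates, lam=1e-8):
    """RNA-style regularized extrapolation of a vector sequence.

    Given iterates x_0, ..., x_k, find coefficients c that sum to one and
    minimize ||R c||^2 + lam * ||c||^2, where R stacks the residuals
    r_i = x_{i+1} - x_i, then return sum_i c_i x_i. Illustrative only.
    """
    X = np.asarray(iterates, dtype=float)   # shape (k+1, d)
    R = X[1:] - X[:-1]                      # residuals, shape (k, d)
    k = R.shape[0]
    A = R @ R.T + lam * np.eye(k)           # regularized Gram matrix
    z = np.linalg.solve(A, np.ones(k))
    c = z / z.sum()                         # coefficients summing to one
    return c @ X[:-1]                       # extrapolated point

# Toy usage: extrapolate gradient-descent iterates on f(x) = 0.5 * x^T Q x.
rng = np.random.default_rng(0)
Q = np.diag(rng.uniform(1.0, 100.0, size=20))
x = rng.standard_normal(20)
step = 1.0 / 100.0
iterates = [x.copy()]
for _ in range(10):
    x = x - step * (Q @ x)
    iterates.append(x.copy())
x_acc = rna_extrapolate(iterates)
print(np.linalg.norm(Q @ iterates[-1]), np.linalg.norm(Q @ x_acc))
```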
First-Order Methods for Convex Optimization
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2021-01-01, DOI: 10.1016/j.ejco.2021.100015
Pavel Dvurechensky, Shimrit Shtern, Mathias Staudigl
Abstract: First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization over the last 20 years. The rapid development of this important class of algorithms is motivated by the success stories reported in various applications, most importantly machine learning, signal processing, imaging, and control theory. First-order methods have the potential to provide low-accuracy solutions at low computational complexity, which makes them an attractive set of tools for large-scale optimization problems. In this survey, we cover a number of key developments in gradient-based optimization methods. This includes non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally, we survey recent developments within the class of projection-free methods and proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several optimization algorithms.
Citations: 20
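The survey is built around the classical proximal gradient method, i.e. the iteration x_{k+1} = prox_{gamma g}(x_k - gamma * grad f(x_k)). As a minimal concrete instance (a standard textbook example, not material from the survey itself), the Python sketch below applies this iteration to an l1-regularized least-squares problem, where the prox step is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, reg, n_iters=500):
    """Classical proximal gradient (ISTA) for
       min_x 0.5 * ||A x - b||^2 + reg * ||x||_1.
    Step size 1/L with L = ||A||_2^2, the Lipschitz constant of the
    gradient of the smooth part. Illustrative example only.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                    # gradient of smooth part
        x = soft_threshold(x - grad / L, reg / L)   # prox of nonsmooth part
    return x

# Toy usage on a random sparse-recovery instance.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
print(proximal_gradient_lasso(A, b, reg=0.1)[:8].round(3))
```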
Conic optimization: A survey with special focus on copositive optimization and binary quadratic problems
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2021-01-01, DOI: 10.1016/j.ejco.2021.100021
Mirjam Dür, Franz Rendl
Abstract: A conic optimization problem is a problem involving a constraint that the optimization variable lie in some closed convex cone. Prominent examples are linear programs (LP), second-order cone programs (SOCP), semidefinite problems (SDP), and copositive problems. We survey recent progress made in this area. In particular, we highlight the connections between nonconvex quadratic problems, binary quadratic problems, and copositive optimization. We review how tight bounds can be obtained by relaxing the copositivity constraint to semidefiniteness, and we discuss the effect that different modelling techniques have on the quality of the bounds. We also provide some new techniques for lifting linear constraints and show how these can be used for stable set and coloring relaxations.
Citations: 16
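For readers new to the topic, the two cones the survey revolves around, and the standard semidefinite approximations behind the bounds mentioned in the abstract, can be summarized as follows (standard background, assuming S^n_+ denotes the positive semidefinite cone and N^n the entrywise nonnegative symmetric matrices; this summary is not taken from the paper).

```latex
% The two cones at the heart of copositive optimization:
\[
\mathcal{COP}_n = \{A \in \mathcal{S}^n : x^{\top}Ax \ge 0 \ \text{for all } x \ge 0\},
\qquad
\mathcal{CP}_n = \operatorname{conv}\{xx^{\top} : x \ge 0\}.
\]
% Standard tractable approximations behind the SDP bounds
% (equality holds in both inclusions for n <= 4):
\[
\mathcal{S}^n_{+} + \mathcal{N}^n \subseteq \mathcal{COP}_n,
\qquad
\mathcal{CP}_n \subseteq \mathcal{S}^n_{+} \cap \mathcal{N}^n.
\]
```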
Pareto front approximation through a multi-objective augmented Lagrangian method
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2021-01-01, DOI: 10.1016/j.ejco.2021.100008
Guido Cocchi, Matteo Lapucci, Pierluigi Mansueto
Abstract: In this manuscript, we consider smooth multi-objective optimization problems with convex constraints. We propose an extension of a multi-objective augmented Lagrangian method from the recent literature. The new algorithm is specifically designed to handle sets of points and produce good approximations of the whole Pareto front, as opposed to the original method, which converges to a single solution. We prove global convergence to Pareto stationarity for the sequences of points generated by our procedure. We then compare the performance of the proposed method with that of the main state-of-the-art algorithms available for the considered class of problems. The results of our experiments show the effectiveness and general superiority of the proposed approach with respect to its competitors.
Citations: 7
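As background on the single-objective building block such methods extend (standard material, not the paper's multi-objective construction), the classical augmented Lagrangian for inequality constraints and its multiplier update read as follows; in multi-objective extensions of this type, a penalty term of this kind is typically attached to each objective and the subproblems are solved in the Pareto sense.

```latex
% Classical augmented Lagrangian (PHR form) for
%   min f(x)  subject to  g_i(x) <= 0,  i = 1,...,m:
\[
L_{\rho}(x,\mu) = f(x)
  + \frac{\rho}{2}\sum_{i=1}^{m}
    \Bigl[\max\Bigl(0,\ \frac{\mu_i}{\rho} + g_i(x)\Bigr)\Bigr]^{2}
  - \sum_{i=1}^{m}\frac{\mu_i^{2}}{2\rho},
\qquad
\mu_i \leftarrow \max\bigl(0,\ \mu_i + \rho\,g_i(x)\bigr).
\]
```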
Robust flows with adaptive mitigation
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2021-01-01, DOI: 10.1016/j.ejco.2020.100002
Heiner Ackermann, Erik Diessel, Sven O. Krumke
Abstract: We consider an adjustable robust optimization problem arising in the area of supply chains: given sets of suppliers and demand nodes, we wish to find a flow that is robust with respect to failures of the suppliers. The objective is to determine a flow that minimizes the amount of shortage in the worst case after an optimal mitigation has been performed. An optimal mitigation is an additional flow in the residual network that mitigates as much shortage at the demand sites as possible. We give a mathematical formulation for this problem, yielding a robust flow problem with three stages in which the mitigation of the last stage can be chosen adaptively depending on the scenario. We show that merely evaluating the robustness of a solution is already NP-hard. For optimizing with respect to this NP-hard objective function, we compare three algorithms: an algorithm based on iterative cut generation that solves medium-sized instances efficiently, a simple outer linearization algorithm, and a scenario enumeration algorithm. We illustrate their performance with numerical experiments. The results show that this instance of fully adjustable robust optimization problems can be solved exactly with reasonable performance. We also describe possible extensions to the model and the algorithm.
Citations: 1
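The abstract describes a three-stage structure: choose a flow, let an adversary fail suppliers, then mitigate in the residual network. Schematically (with illustrative notation, not the paper's), the objective that the cut-generation, outer linearization, and enumeration algorithms attack is the following min-max-min problem.

```latex
% Schematic three-stage structure (notation illustrative):
\[
\min_{x \in X}\ \max_{\xi \in \Xi}\ \min_{y \in Y(x,\xi)}\ \mathrm{shortage}(x,\xi,y)
\]
% x   : nominal flow chosen before supplier failures are known,
% \xi : supplier-failure scenario chosen by the adversary,
% y   : mitigation flow in the residual network, chosen after \xi is revealed.
```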
Some contributions of Ailsa H. Land to the study of the traveling salesman problem
IF 2.4
EURO Journal on Computational Optimization, Pub Date: 2021-01-01, DOI: 10.1016/j.ejco.2021.100018
Gilbert Laporte
Abstract: Ailsa H. Land, who received the 2021 EURO Gold Medal, made some important contributions to the study of the Traveling Salesman Problem, which were published in a 1955 journal article and in a 1979 working paper. The purpose of this introductory note is to describe these contributions.
Citations: 1