Distributed Sub-gradient Algorithms with Limited Communications

S. Rini, Milind Rao, A. Goldsmith
{"title":"Distributed Sub-gradient Algorithms with Limited Communications","authors":"S. Rini, Milind Rao, A. Goldsmith","doi":"10.1109/IEEECONF44664.2019.9048683","DOIUrl":null,"url":null,"abstract":"We consider the distributed convex optimization scenario in which nodes in a network collectively find the minimum of a function utilizing only local communications and computations. Various sub-gradient algorithms have been developed for this optimization setting for the case in which the global function factorizes as the sum of local functions to be distributed to the nodes in network, including the distributed (i) online gradient descent, (ii) Nesterov gradient descent, and (iii) dual averaging algorithms. Generally speaking, these subgradient algorithms assume that, in each communication round, nodes can exchange messages of size comparable to that of the optimization variable. For many high-dimensional optimization problems, this communication requirement is beyond the capabilities of the communication network supporting the distributed optimization. For this reason, we propose a dimensionality reduction technique based on random projections to adapt these distributed optimization algorithms to networks with communication links of limited capacity. In particular, we analyze the performance of the proposed dimensionality reduction technique for the three algorithms above, i.e. (i)–(iii). Numerical simulations are presented that illustrate the performance of the proposed approach for these three algorithms.","PeriodicalId":6684,"journal":{"name":"2019 53rd Asilomar Conference on Signals, Systems, and Computers","volume":"184 1","pages":"2171-2175"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 53rd Asilomar Conference on Signals, Systems, and Computers","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IEEECONF44664.2019.9048683","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

We consider the distributed convex optimization scenario in which nodes in a network collectively find the minimum of a function using only local communications and computations. Various sub-gradient algorithms have been developed for this optimization setting for the case in which the global function decomposes as the sum of local functions, each assigned to a node in the network, including the distributed (i) online gradient descent, (ii) Nesterov gradient descent, and (iii) dual averaging algorithms. Generally speaking, these sub-gradient algorithms assume that, in each communication round, nodes can exchange messages of size comparable to that of the optimization variable. For many high-dimensional optimization problems, this communication requirement is beyond the capabilities of the communication network supporting the distributed optimization. For this reason, we propose a dimensionality reduction technique based on random projections to adapt these distributed optimization algorithms to networks with communication links of limited capacity. In particular, we analyze the performance of the proposed dimensionality reduction technique for the three algorithms above, i.e., (i)–(iii). Numerical simulations are presented that illustrate the performance of the proposed approach for these three algorithms.
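To make the compression idea concrete, the following is a minimal sketch of distributed gradient descent over a ring network in which, each round, nodes exchange k-dimensional random projections of their d-dimensional iterates instead of the full vectors. This is an illustration of the general random-projection technique, not the authors' exact algorithm from the paper; the quadratic local objectives, the ring mixing matrix, the shared-seed projection, and the transpose-based reconstruction are all assumptions made for the sketch.

```python
import numpy as np

# Minimal sketch (not the paper's exact scheme): distributed gradient
# descent where nodes transmit random projections of their iterates.

rng = np.random.default_rng(0)

n, d, k = 8, 20, 5        # nodes, variable dimension, compressed dimension
T, eta = 300, 0.01        # communication rounds, step size

# Assumed local objectives f_i(x) = 0.5 * ||A_i x - b_i||^2;
# the network minimizes their sum.
A = [rng.standard_normal((5, d)) for _ in range(n)]
b = [rng.standard_normal(5) for _ in range(n)]

def grad(i, x):
    """Gradient of the local objective at node i."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring: each node averages with
# its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

x = np.zeros((n, d))      # row i holds node i's iterate
for t in range(T):
    # Fresh projection each round, common to all nodes (assumes a
    # shared random seed so every node can form the same P).
    P = rng.standard_normal((k, d)) / np.sqrt(k)
    msgs = x @ P.T        # each node transmits k numbers instead of d
    x_hat = msgs @ P      # reconstruction; unbiased since E[P.T @ P] = I
    # Mix the node's own exact iterate with the neighbors'
    # reconstructions, then take a local gradient step.
    mixed = 0.5 * x + (W - 0.5 * np.eye(n)) @ x_hat
    x = mixed - eta * np.vstack([grad(i, x[i]) for i in range(n)])

# Compare with the centralized least-squares solution.
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print("mean distance to optimum:", np.mean(np.linalg.norm(x - x_star, axis=1)))
```

Here each message is k/d = 1/4 the size of the optimization variable; the projection noise leaves a residual error around the optimum that shrinks as k grows toward d, and in practice a decaying step size would further suppress it.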