Towards Distributed Machine Learning in Shared Clusters: A Dynamically-Partitioned Approach

Peng Sun, Yonggang Wen, T. Duong, Shengen Yan
{"title":"Towards Distributed Machine Learning in Shared Clusters: A Dynamically-Partitioned Approach","authors":"Peng Sun, Yonggang Wen, T. Duong, Shengen Yan","doi":"10.1109/SMARTCOMP.2017.7947053","DOIUrl":null,"url":null,"abstract":"Many cluster management systems (CMSs) have been proposed to share a single cluster with multiple distributed computing systems. However, none of the existing approaches can handle distributed machine learning (ML) workloads given the following criteria: high resource utilization, fair resource allocation and low sharing overhead. To solve this problem, we propose a new CMS named Dorm, incorporating a dynamically-partitioned cluster management mechanism and an utilization-fairness optimizer. Specifically, Dorm uses the container-based virtualization technique to partition a cluster, runs one application per partition, and can dynamically resize each partition at application runtime for resource efficiency and fairness. Each application directly launches its tasks on the assigned partition without petitioning for resources frequently, so Dorm imposes flat sharing overhead. Extensive performance evaluations showed that Dorm could simultaneously increase the resource utilization by a factor of up to 2.32, reduce the fairness loss by a factor of up to 1.52, and speed up popular distributed ML applications by a factor of up to 2.72, compared to existing approaches. Dorm's sharing overhead is less than 5% in most cases.","PeriodicalId":193593,"journal":{"name":"2017 IEEE International Conference on Smart Computing (SMARTCOMP)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"29","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Conference on Smart Computing (SMARTCOMP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SMARTCOMP.2017.7947053","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 29

Abstract

Many cluster management systems (CMSs) have been proposed to share a single cluster among multiple distributed computing systems. However, none of the existing approaches can handle distributed machine learning (ML) workloads while satisfying the following criteria: high resource utilization, fair resource allocation, and low sharing overhead. To solve this problem, we propose a new CMS named Dorm, incorporating a dynamically-partitioned cluster management mechanism and a utilization-fairness optimizer. Specifically, Dorm uses container-based virtualization to partition a cluster, runs one application per partition, and can dynamically resize each partition at application runtime for resource efficiency and fairness. Each application directly launches its tasks on the assigned partition without frequently petitioning for resources, so Dorm imposes a flat sharing overhead. Extensive performance evaluations showed that Dorm could simultaneously increase resource utilization by a factor of up to 2.32, reduce fairness loss by a factor of up to 1.52, and speed up popular distributed ML applications by a factor of up to 2.72, compared to existing approaches. Dorm's sharing overhead is less than 5% in most cases.
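To make the dynamic-resizing idea concrete, below is a minimal sketch of a utilization-fairness trade-off over per-application partitions. It is not Dorm's actual optimizer: the abstract does not give its formulation, so the equal-share fairness proxy, the alpha weighting, the greedy search, and every name here (Partition, rebalance, the app labels) are assumptions introduced purely for illustration.

```python
# Toy utilization-vs-fairness partition resizing (illustrative only; not the
# optimizer from the paper). Each application owns one partition of a shared
# cluster, and resources are shifted between partitions whenever that improves
# a weighted score of utilization minus fairness loss.

from dataclasses import dataclass


@dataclass
class Partition:
    app: str       # application owning this partition (labels are made up)
    demand: float  # resources the application could usefully consume
    share: float   # resources currently assigned to the partition


def utilization(parts, capacity):
    """Fraction of cluster capacity covered by shares the apps can actually use."""
    return sum(min(p.share, p.demand) for p in parts) / capacity


def fairness_loss(parts, capacity):
    """Deviation from an equal split of the cluster; a stand-in for the paper's
    fairness metric, which the abstract does not define."""
    equal = capacity / len(parts)
    return sum(abs(p.share - equal) for p in parts) / capacity


def rebalance(parts, capacity, alpha=0.25, step=1.0, max_iters=100):
    """Greedy hill climbing: move `step` units between two partitions whenever
    that improves the weighted utilization/fairness score."""
    def score():
        return utilization(parts, capacity) - alpha * fairness_loss(parts, capacity)

    for _ in range(max_iters):
        best_move, best_score = None, score()
        for src in parts:
            if src.share < step:
                continue
            for dst in parts:
                if dst is src:
                    continue
                # Tentatively move `step` resources from src to dst, then undo.
                src.share -= step
                dst.share += step
                candidate = score()
                src.share += step
                dst.share -= step
                if candidate > best_score + 1e-9:
                    best_move, best_score = (src, dst), candidate
        if best_move is None:
            break  # no single move improves the score; stop
        src, dst = best_move
        src.share -= step
        dst.share += step
    return parts


if __name__ == "__main__":
    capacity = 12.0
    apps = [
        Partition("app-A", demand=8.0, share=4.0),
        Partition("app-B", demand=2.0, share=4.0),
        Partition("app-C", demand=5.0, share=4.0),
    ]
    rebalance(apps, capacity, alpha=0.25)
    for p in apps:
        print(f"{p.app}: share={p.share:.1f} (demand={p.demand:.1f})")
```

With these toy numbers, the greedy search moves one resource unit at a time from the partition whose share exceeds its demand (app-B) to the one whose demand exceeds its share (app-A), stopping once no single move improves the weighted score. This only mirrors the shape of the problem; per the abstract, Dorm itself resizes container-based partitions at application runtime.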