Locality-Aware Load Sharing in Mobile Cloud Computing

A. Jonathan, A. Chandra, J. Weissman
{"title":"Locality-Aware Load Sharing in Mobile Cloud Computing","authors":"A. Jonathan, A. Chandra, J. Weissman","doi":"10.1145/3147213.3147228","DOIUrl":null,"url":null,"abstract":"The past few years have seen a growing number of mobile and sensor applications that rely on Cloud support. The role of the Cloud is to allow these resource-limited devices to offload and execute some of their compute-intensive tasks in the Cloud for energy saving and/or faster processing. However, such offloading to the Cloud may result in high network overhead which is not suitable for many mobile/sensor applications that require low latency. So, people have looked at an alternative Cloud design whose resources are located at the edge of the Internet, called Edge Cloud. Although the use of Edge Cloud can mitigate the offloading overhead, the computational power and network bandwidth of Edge Cloud's resources are typically much more limited compared to the centralized Cloud and hence are more sensitive to workload variation (e.g., due to CPU or I/O contention). In this paper, we propose a locality-aware load sharing technique that allows edge resources to share their workload in order to maintain the low latency requirement of Mobile-Cloud applications. Specifically, we study how to determine which edge nodes should be used to share the workload with and how much of the workload should be shared to each node. Our experiments show that our locality-aware load sharing technique is able to maintain low average end-to-end latency of mobile applications with low latency variation, while achieving good utilization of resources in the presence of a dynamic workload.","PeriodicalId":341011,"journal":{"name":"Proceedings of the10th International Conference on Utility and Cloud Computing","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"19","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the10th International Conference on Utility and Cloud Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3147213.3147228","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 19

Abstract

The past few years have seen a growing number of mobile and sensor applications that rely on Cloud support. The role of the Cloud is to allow these resource-limited devices to offload some of their compute-intensive tasks and execute them in the Cloud for energy savings and/or faster processing. However, such offloading to the Cloud may incur high network overhead, which is unsuitable for many mobile/sensor applications that require low latency. This has motivated an alternative Cloud design, called Edge Cloud, whose resources are located at the edge of the Internet. Although the use of an Edge Cloud can mitigate the offloading overhead, the computational power and network bandwidth of its resources are typically much more limited than those of the centralized Cloud, and hence are more sensitive to workload variation (e.g., due to CPU or I/O contention). In this paper, we propose a locality-aware load sharing technique that allows edge resources to share their workload in order to meet the low-latency requirement of Mobile-Cloud applications. Specifically, we study how to determine which edge nodes a node should share its workload with, and how much of the workload should be shared with each node. Our experiments show that our locality-aware load sharing technique maintains a low average end-to-end latency for mobile applications with low latency variation, while achieving good resource utilization under a dynamic workload.
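
The technique described above hinges on two decisions per overloaded edge node: which nearby peers to share load with, and what fraction of the excess load each should receive. The sketch below is only a minimal illustration of that decision structure, not the authors' actual algorithm; the EdgeNode fields, the RTT budget, and the capacity-over-latency weighting are all assumptions made for the example.

```python
# Illustrative sketch of locality-aware load sharing (assumptions, not the paper's method):
# a node filters neighbors by a round-trip-time budget, then splits its excess load
# in proportion to each neighbor's spare capacity discounted by network latency.

class EdgeNode:
    def __init__(self, name, rtt_ms, load):
        self.name = name        # node identifier
        self.rtt_ms = rtt_ms    # measured round-trip latency to this neighbor (ms)
        self.load = load        # current utilization in [0, 1]

def pick_share_targets(neighbors, rtt_budget_ms=20.0):
    """Locality filter: keep neighbors within the latency budget that still have spare capacity."""
    return [n for n in neighbors if n.rtt_ms <= rtt_budget_ms and n.load < 1.0]

def share_fractions(targets):
    """Weight each target by spare capacity over latency, then normalize to fractions."""
    weights = [(1.0 - n.load) / (1.0 + n.rtt_ms) for n in targets]
    total = sum(weights)
    if total == 0:
        return {}
    return {n.name: w / total for n, w in zip(targets, weights)}

if __name__ == "__main__":
    neighbors = [
        EdgeNode("edge-A", rtt_ms=5.0, load=0.4),
        EdgeNode("edge-B", rtt_ms=12.0, load=0.7),
        EdgeNode("edge-C", rtt_ms=45.0, load=0.1),  # exceeds the RTT budget, filtered out
    ]
    targets = pick_share_targets(neighbors)
    print(share_fractions(targets))  # e.g. {'edge-A': 0.81, 'edge-B': 0.19}
```

In this toy setup the closer, less loaded node (edge-A) absorbs most of the shared work, while the distant node (edge-C) is excluded despite being nearly idle, which is the trade-off a locality-aware policy makes to keep end-to-end latency low.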