Prediction Based Sub-Task Offloading in Mobile Edge Computing

Kitae Kim, Jared Lynskey, S. Kang, C. Hong
Published in: 2019 International Conference on Information Networking (ICOIN)
DOI: 10.1109/ICOIN.2019.8718183
Cited by: 14

Abstract

Mobile Edge Cloud Computing was developed to provide low-latency service in close proximity to users. In this environment, resource-constrained UE (user equipment) that cannot execute complex applications (e.g., VR/AR, deep learning, or image processing) can dynamically offload computationally demanding tasks to neighboring MEC nodes. To process tasks even faster, one task can be divided into several sub-tasks that are offloaded to multiple MEC nodes simultaneously, so that the sub-tasks are processed in parallel. In this paper, we predict the total processing duration of each task on each candidate MEC node using linear regression. Based on the previously observed state of each MEC node, we offload each sub-task to its respective edge node. We also developed a monitoring module in the core cloud. The results show a decrease in execution duration compared with both local execution and offloading the entire application to a single edge node.
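The core idea in the abstract — fit a per-node linear model of processing duration from previously observed node state, then send each sub-task to the node with the lowest predicted duration — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the node names, histories, and features (task size only) are assumptions.

```python
# Hypothetical sketch: per-node linear regression of processing duration
# vs. task size, then greedy sub-task placement on the fastest predicted node.

def fit_linear(history):
    """Ordinary least squares for y = a*x + b over (task_size, duration) pairs."""
    n = len(history)
    sx = sum(x for x, _ in history)
    sy = sum(y for _, y in history)
    sxx = sum(x * x for x, _ in history)
    sxy = sum(x * y for x, y in history)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def assign_subtasks(subtask_sizes, node_histories):
    """Map each sub-task index to the MEC node with the lowest predicted duration."""
    models = {node: fit_linear(h) for node, h in node_histories.items()}
    assignment = {}
    for i, size in enumerate(subtask_sizes):
        best = min(models, key=lambda n: models[n][0] * size + models[n][1])
        assignment[i] = best
    return assignment

# Illustrative monitoring data: (task size in MB, observed duration in ms).
histories = {
    "mec-a": [(1, 12.0), (2, 21.0), (4, 41.0)],   # roughly 10 ms/MB
    "mec-b": [(1, 7.0), (2, 12.0), (4, 22.0)],    # roughly 5 ms/MB, faster node
}
print(assign_subtasks([1, 3], histories))  # → {0: 'mec-b', 1: 'mec-b'}
```

In a fuller version the regression features would also include the node state the paper's monitoring module collects (e.g., current load), and the placement step would spread sub-tasks across nodes rather than letting one node win every sub-task.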