EdgeBuffer: Caching and prefetching content at the edge in the MobilityFirst future Internet architecture

Feixiong Zhang, Chenren Xu, Yanyong Zhang, Kadangode K. Ramakrishnan, Shreyasee Mukherjee, R. Yates, Thu D. Nguyen
DOI: 10.1109/WoWMoM.2015.7158137
Published in: 2015 IEEE 16th International Symposium on A World of Wireless, Mobile and Multimedia Networks (WoWMoM), June 14, 2015
Citations: 77

Abstract

The prevalence of mobile devices especially smartphones has attracted research on mobile content delivery techniques. In this paper, we propose to take advantage of the storage available at wireless access points to bring content closer to mobile devices, hence improving the downloading performance. Specifically, we propose to have a separate popularity based cache and a prefetch buffer at the network edge to capture both long-term and short-term content access patterns. Further, we point out that it is insufficient to rely on a device's past history to predict when and where to prefetch, especially in urban settings; instead, we propose to derive a prediction model based on the aggregated network-level statistics. We discuss the proposed mobile content caching/prefetching method in the context of the MobilityFirst future Internet architecture. In MobilityFirst, when mobile clients move between network attachment points (e.g., Wi-Fi access points), their network association records are logged by the network, which then naturally facilitates the network-level mobility prediction. Through detailed simulations with real taxi mobility traces, we show that such a strategy is more effective than earlier schemes in satisfying content requests at the edge (higher cache hit ratios), leading to shorter content download latencies. Specifically, the fraction of requests satisfied at the edge increases by a factor of 2.9 compared to a caching only approach, and by 45% compared to individual user-based prediction and prefetching.
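The abstract describes splitting edge storage into a popularity-based cache (long-term access patterns) and a prefetch buffer (short-term, prediction-driven placement). The following is a minimal illustrative sketch of that two-tier design, not the paper's actual implementation; the class name, slot counts, and eviction policies (LFU-style for the cache, FIFO for the prefetch buffer) are assumptions made for the example.

```python
from collections import OrderedDict

class EdgeBufferSketch:
    """Illustrative sketch of an edge node with two storage tiers:
    a popularity cache for long-term patterns and a prefetch buffer
    for short-term, prediction-driven content placement."""

    def __init__(self, cache_slots, prefetch_slots):
        self.cache = {}                 # content_id -> access count (LFU-style)
        self.cache_slots = cache_slots
        self.prefetch = OrderedDict()   # insertion-ordered, FIFO eviction
        self.prefetch_slots = prefetch_slots

    def request(self, content_id):
        """Serve a client request; return True on an edge hit (either tier)."""
        if content_id in self.cache:
            self.cache[content_id] += 1
            return True
        if content_id in self.prefetch:
            # Promote prefetched content into the popularity cache on use.
            self.prefetch.pop(content_id)
            self._admit_to_cache(content_id)
            return True
        # Miss: content is fetched from upstream, then cached at the edge.
        self._admit_to_cache(content_id)
        return False

    def _admit_to_cache(self, content_id):
        if content_id in self.cache:
            self.cache[content_id] += 1
            return
        if len(self.cache) >= self.cache_slots:
            victim = min(self.cache, key=self.cache.get)  # evict least popular
            del self.cache[victim]
        self.cache[content_id] = 1

    def prefetch_content(self, content_id):
        """Place predicted-soon content into the prefetch buffer."""
        if content_id in self.cache or content_id in self.prefetch:
            return
        if len(self.prefetch) >= self.prefetch_slots:
            self.prefetch.popitem(last=False)  # drop the oldest prediction
        self.prefetch[content_id] = True
```

Keeping the two tiers separate means bursty, prediction-driven prefetches cannot evict content that has proven popular over the long term, which is the rationale the abstract gives for the split.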
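The abstract also argues that per-device history is insufficient for deciding when and where to prefetch, and that aggregated network-level association records should drive the prediction instead. One simple way to realize that idea, shown here purely as an assumed sketch (the paper's actual model may differ), is a first-order transition model over access points built from all clients' association logs:

```python
from collections import defaultdict

def build_transition_model(association_logs):
    """Aggregate per-client AP association sequences (as logged by the
    network in MobilityFirst) into network-level transition counts."""
    counts = defaultdict(lambda: defaultdict(int))
    for sequence in association_logs:               # one AP sequence per client
        for src, dst in zip(sequence, sequence[1:]):
            counts[src][dst] += 1
    return counts

def predict_next_ap(counts, current_ap):
    """Most frequently observed next attachment point, or None if the
    aggregated logs contain no transitions out of current_ap."""
    successors = counts.get(current_ap)
    if not successors:
        return None
    return max(successors, key=successors.get)
```

Because the counts pool observations across every client, the model can predict a plausible next attachment point even for a device the network has never seen before, which a purely per-device history cannot do.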