Content delivery via multi-server coded caching in linear networks

S. P. Shariatpanahi, A. Motahari, B. Khalaj
2015 IEEE Information Theory Workshop - Fall (ITW), published 2015-12-21. DOI: 10.1109/ITWF.2015.7360777
Citations: 3

Abstract

We consider a content delivery network where multiple servers are connected to multiple cache-enabled clients. Clients request their corresponding contents from the servers, and the servers collaboratively transmit packets to fulfill all the requests. It is assumed that some contents are stored in the caches during the network's off-peak time, without knowledge of the actual requests; this is the so-called cache content placement phase. The goal is to minimize the worst-case delay in the content delivery phase. Considering a random linear network, we propose a coding strategy that exploits the servers' multiplexing gain as well as the caches' global and local coding gains. The main idea in our coding scheme is to expand the number of users benefiting from a single packet by using zero-forcing techniques. This increases the multicasting gain, which in turn delivers the contents to the users faster. In addition, we show that our scheme is optimal for a certain regime of parameters.
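The delivery phase described above builds on the baseline single-server coded-caching idea (Maddah-Ali and Niesen's scheme), where each multicast packet is an XOR of subfiles that serves t+1 users at once; the paper's contribution is to enlarge these multicast groups further via zero-forcing across multiple servers. The following is a minimal sketch of that baseline XOR delivery only, under stated assumptions: the `subfile` labeling function and the toy demand vector are hypothetical stand-ins, not the paper's construction.

```python
from itertools import combinations
from math import comb

def coded_delivery(K, t, demands, subfile):
    """Baseline coded-caching multicast phase: for every (t+1)-subset S of
    the K users, send the XOR of subfiles W_{d_k, S\\{k}} for k in S.
    Each user in S already caches the other t subfiles in the XOR, so it
    cancels them and recovers its own requested subfile."""
    msgs = []
    for S in combinations(range(K), t + 1):
        x = 0
        for k in S:
            rest = tuple(u for u in S if u != k)
            x ^= subfile(demands[k], rest)  # subfile of file d_k cached at users in `rest`
        msgs.append((S, x))
    return msgs

# Toy instance (hypothetical): K = 3 users, t = 1, user k requests file k.
# `subfile` returns a distinct integer label standing in for subfile bits.
K, t = 3, 1
demands = [0, 1, 2]
subfile = lambda f, S: hash((f, S)) & 0xFFFF

msgs = coded_delivery(K, t, demands, subfile)
assert len(msgs) == comb(K, t + 1)  # one multicast per (t+1)-subset of users

# Decoding check: from the message for S = (0, 1), user 0 XORs off the
# cached subfile W_{d_1,(0,)} and recovers its own subfile W_{d_0,(1,)}.
S, x = msgs[0]
assert x ^ subfile(demands[1], (0,)) == subfile(demands[0], (1,))
```

Each transmission benefits t+1 users simultaneously (the global coding gain); the multi-server zero-forcing scheme in the paper expands the set of users served per packet beyond t+1 by nulling interference at additional receivers.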