Cashing in on caching: on-demand contract design with linear pricing

Richard T. B. Ma, D. Towsley
{"title":"Cashing in on caching: on-demand contract design with linear pricing","authors":"Richard T. B. Ma, D. Towsley","doi":"10.1145/2716281.2836093","DOIUrl":null,"url":null,"abstract":"There has been increasing interest in designing and developing highly scalable infrastructures to support the efficient distribution of content. This has led to the recent development of content-oriented network architectures that rely on on-demand caching. This paper addresses the question of how a cache provider can monetize its service. Standard cache management policies such as least recently used (LRU) treat different content in a strongly coupled manner that makes it difficult for a cache provider to design individualized contracts. We propose the use of timer-based caching for the purpose of designing contracts, which allow providers to monetize caching. We focus on on-demand request-based contracts that allow content providers (CPs) to negotiate contracts at the time that requests are made. We propose and analyze three variations, one where a contract is negotiated only at the time of a miss, and two where contracts are negotiated at the times of both misses and hits. The latter two differ from one another according to whether pricing is based on cache occupancy (time content spends in the cache) or on request rate. We conclude that the first one is least preferable and that the last one provides the provider greater opportunity for profit and greater flexibility to CPs.","PeriodicalId":169539,"journal":{"name":"Proceedings of the 11th ACM Conference on Emerging Networking Experiments and Technologies","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"31","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 11th ACM Conference on Emerging Networking Experiments and Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2716281.2836093","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 31

Abstract

There has been increasing interest in designing and developing highly scalable infrastructures to support the efficient distribution of content. This has led to the recent development of content-oriented network architectures that rely on on-demand caching. This paper addresses the question of how a cache provider can monetize its service. Standard cache management policies such as least recently used (LRU) treat different content in a strongly coupled manner, which makes it difficult for a cache provider to design individualized contracts. We propose the use of timer-based caching for designing contracts that allow providers to monetize caching. We focus on on-demand, request-based contracts that allow content providers (CPs) to negotiate contracts at the time requests are made. We propose and analyze three variations: one where a contract is negotiated only at the time of a miss, and two where contracts are negotiated at the times of both misses and hits. The latter two differ according to whether pricing is based on cache occupancy (the time content spends in the cache) or on request rate. We conclude that the first variation is the least preferable and that the last provides the provider with greater opportunity for profit and gives CPs greater flexibility.
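To make the occupancy-based versus request-based distinction concrete, below is a minimal Python sketch of a timer-based (TTL) cache for a single content item under two illustrative linear prices: one proportional to the time the item spends in the cache, the other proportional to the number of requests served from the cache. The class name, the timer value, and the price parameters (PRICE_PER_SECOND, PRICE_PER_HIT) are hypothetical and not taken from the paper; the sketch ignores contract negotiation entirely and only illustrates how the two linear charging bases differ under the same request stream.

```python
import random

# Illustrative sketch only: a timer-based (TTL) cache for one content item,
# with two hypothetical linear pricing schemes. Parameter names and values
# (TTL, PRICE_PER_SECOND, PRICE_PER_HIT) are assumptions, not from the paper.

TTL = 10.0               # content is evicted TTL seconds after insertion / last reset
PRICE_PER_SECOND = 0.01  # occupancy-based linear price (per second in cache)
PRICE_PER_HIT = 0.05     # request-based linear price (per request served from cache)

class TTLCacheContract:
    """Tracks one content provider's item under a timer-based cache."""

    def __init__(self, reset_on_hit=True):
        self.expires_at = None      # time at which the item leaves the cache
        self.occupancy_time = 0.0   # accumulated time spent in cache (past residencies)
        self.hits = 0
        self.misses = 0
        self.reset_on_hit = reset_on_hit  # restart the timer on each hit
        self.last_insert = None

    def request(self, now):
        """Process a request at time `now`; return 'hit' or 'miss'."""
        if self.expires_at is not None and now < self.expires_at:
            self.hits += 1
            if self.reset_on_hit:
                self.expires_at = now + TTL   # item now stays until now + TTL
            return "hit"
        # Miss: close out the previous residency, then re-insert the item.
        if self.last_insert is not None:
            self.occupancy_time += self.expires_at - self.last_insert
        self.misses += 1
        self.last_insert = now
        self.expires_at = now + TTL
        return "miss"

    def occupancy_price(self, now):
        """Charge linear in total time the item has spent in the cache."""
        in_cache = min(now, self.expires_at) - self.last_insert if self.last_insert else 0.0
        return PRICE_PER_SECOND * (self.occupancy_time + max(in_cache, 0.0))

    def request_price(self):
        """Charge linear in the number of requests served from the cache."""
        return PRICE_PER_HIT * self.hits


# Illustrative usage: Poisson request arrivals for one content item.
contract = TTLCacheContract()
t = 0.0
for _ in range(1000):
    t += random.expovariate(0.2)   # mean inter-request time of 5 seconds
    contract.request(t)

print("hit ratio:", contract.hits / (contract.hits + contract.misses))
print("occupancy-based charge:", round(contract.occupancy_price(t), 2))
print("request-based charge:  ", round(contract.request_price(), 2))
```

Under the same request stream, the two charges grow with different quantities: the occupancy-based charge scales with how long the timer keeps the content resident, while the request-based charge scales with the number of hits, which is why the abstract treats them as distinct contract variants.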