A Cost-Benefit Scheme for High Performance Predictive Prefetching

V. Vellanki, A. Chervenak
{"title":"一种高性能预测预取的成本效益方案","authors":"V. Vellanki, A. Chervenak","doi":"10.1145/331532.331582","DOIUrl":null,"url":null,"abstract":"High-performance computing systems will increasingly rely on prefetching data from disk to overcome long disk access times and maintain high utilization of parallel I/O systems. This paper evaluates a prefetching technique that chooses which blocks to prefetch based on their probability of access and decides whether to prefetch a particular block at a given time using a cost-benefit analysis. The algorithm uses a probability tree to record past accesses and to predict future access patterns. We simulate this prefetching algorithm with a variety of I/O traces. We show that our predictive prefetching scheme combined with simple one-block-lookahead prefetching produces good performance for a variety of workloads. The scheme reduces file cache miss rates by up to 36% for workloads that receive no benefit from sequential prefetching. We showthat the memory requirements for building the probability tree are reasonable, requiring about a megabyte for good performance. The probability tree constructed by the prefetching scheme predicts around 60-70% of the accesses. Next, we discuss ways of improving the performance of the prefetching scheme. Finally, we show that the cost-benefit analysis enables the tree-based prefetching scheme to perform an optimal amount of prefetching.","PeriodicalId":354898,"journal":{"name":"ACM/IEEE SC 1999 Conference (SC'99)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"30","resultStr":"{\"title\":\"A Cost-Benefit Scheme for High Performance Predictive Prefetching\",\"authors\":\"V. Vellanki, A. Chervenak\",\"doi\":\"10.1145/331532.331582\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"High-performance computing systems will increasingly rely on prefetching data from disk to overcome long disk access times and maintain high utilization of parallel I/O systems. This paper evaluates a prefetching technique that chooses which blocks to prefetch based on their probability of access and decides whether to prefetch a particular block at a given time using a cost-benefit analysis. The algorithm uses a probability tree to record past accesses and to predict future access patterns. We simulate this prefetching algorithm with a variety of I/O traces. We show that our predictive prefetching scheme combined with simple one-block-lookahead prefetching produces good performance for a variety of workloads. The scheme reduces file cache miss rates by up to 36% for workloads that receive no benefit from sequential prefetching. We showthat the memory requirements for building the probability tree are reasonable, requiring about a megabyte for good performance. The probability tree constructed by the prefetching scheme predicts around 60-70% of the accesses. Next, we discuss ways of improving the performance of the prefetching scheme. 
Finally, we show that the cost-benefit analysis enables the tree-based prefetching scheme to perform an optimal amount of prefetching.\",\"PeriodicalId\":354898,\"journal\":{\"name\":\"ACM/IEEE SC 1999 Conference (SC'99)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"30\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM/IEEE SC 1999 Conference (SC'99)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/331532.331582\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM/IEEE SC 1999 Conference (SC'99)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/331532.331582","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 30

Abstract

High-performance computing systems will increasingly rely on prefetching data from disk to overcome long disk access times and maintain high utilization of parallel I/O systems. This paper evaluates a prefetching technique that chooses which blocks to prefetch based on their probability of access and decides whether to prefetch a particular block at a given time using a cost-benefit analysis. The algorithm uses a probability tree to record past accesses and to predict future access patterns. We simulate this prefetching algorithm with a variety of I/O traces. We show that our predictive prefetching scheme, combined with simple one-block-lookahead prefetching, produces good performance for a variety of workloads. The scheme reduces file cache miss rates by up to 36% for workloads that receive no benefit from sequential prefetching. We show that the memory requirements for building the probability tree are reasonable, requiring about a megabyte for good performance. The probability tree constructed by the prefetching scheme predicts around 60-70% of the accesses. Next, we discuss ways of improving the performance of the prefetching scheme. Finally, we show that the cost-benefit analysis enables the tree-based prefetching scheme to perform an optimal amount of prefetching.
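The abstract describes the mechanism only at a high level. As a rough, hypothetical sketch of the general idea, the Python fragment below builds a small probability tree over recent block accesses and applies a simple cost-benefit test before prefetching: a candidate block is fetched only when its estimated access probability times an assumed miss penalty exceeds an assumed prefetch cost. All class names, the fixed history depth, and the cost constants are illustrative assumptions, not the paper's actual data structures or parameters.

```python
# Illustrative sketch only: a toy probability-tree predictor with a simple
# cost-benefit prefetch test. Node layout, history depth, and the cost
# constants are assumptions for exposition, not the paper's exact algorithm.

from collections import defaultdict


class ProbabilityTreeNode:
    """Tree node keyed by block ID; counts how often each block followed
    the access path that leads to this node."""
    def __init__(self):
        self.count = 0
        self.children = defaultdict(ProbabilityTreeNode)


class PredictivePrefetcher:
    def __init__(self, depth=3, miss_cost=10.0, prefetch_cost=1.0):
        self.root = ProbabilityTreeNode()
        self.depth = depth                   # how many past accesses to remember
        self.history = []                    # most recent block IDs, newest last
        self.miss_cost = miss_cost           # assumed penalty of a demand miss
        self.prefetch_cost = prefetch_cost   # assumed cost of one prefetch I/O

    def record_access(self, block):
        """Insert the access under every suffix of the recent history,
        so both short and long contexts accumulate counts."""
        for start in range(len(self.history) + 1):
            node = self.root
            for b in self.history[start:] + [block]:
                node = node.children[b]
                node.count += 1
        self.history = (self.history + [block])[-self.depth:]

    def candidates(self):
        """Walk the tree along the current history and return
        (block, estimated probability) pairs for the next access."""
        node = self.root
        for b in self.history:
            if b not in node.children:
                return []
            node = node.children[b]
        total = sum(child.count for child in node.children.values())
        if total == 0:
            return []
        return [(b, child.count / total) for b, child in node.children.items()]

    def blocks_to_prefetch(self):
        """Cost-benefit test: prefetch only if the expected saving
        (probability * miss_cost) outweighs the prefetch cost."""
        return [b for b, p in self.candidates()
                if p * self.miss_cost > self.prefetch_cost]


# Example: train on a short repeating trace, then ask what to prefetch.
prefetcher = PredictivePrefetcher()
for blk in [1, 2, 3, 1, 2, 3, 1, 2]:
    prefetcher.record_access(blk)
print(prefetcher.blocks_to_prefetch())   # expected to suggest block 3
```

On the repeating trace above, the sketch suggests prefetching the block that usually follows the current history, which mirrors the intuition of pairing tree-based prediction with a cost-benefit threshold so that only sufficiently probable blocks are fetched.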