Usage Pattern Based Prefetching For Mechanical Mass Storage

S. Sarwar, Y. Mahmood, H. F. Ahmed, Raihan-Ur-Rasool, H. Takahashi

2008 International Symposium on High Capacity Optical Networks and Enabling Technologies, November 2008. DOI: 10.1109/HONET.2008.4810223
Cache, being the fastest medium in the memory hierarchy, plays a vital role in concealing delays and access latencies during I/O operations and hence in improving system response time. One of the most substantial approaches to fully exploiting cache memory is data prefetching, in which future user requests are anticipated and the corresponding data is brought into memory in advance. Current prefetching techniques perform only limited prefetching: they rely on the locality-of-reference principle (situation-specific), Markov models (too slow for practical implementation), or dual data caching (quite burdensome for the programmer) combined with biased cache replacement policies. We therefore present a novel "usage pattern based" approach to predictive prefetching, employing proven neural networks to broaden the scope of prefetching at the user level. Preliminary results affirm the efficacy of the approach through its accuracy and minimal resource usage.
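To make the idea of usage-pattern-driven prefetching concrete, the sketch below models per-user access history and prefetches the block most likely to be requested next. Note this is an illustrative simplification: the paper trains a neural network on usage patterns, whereas this sketch uses a simple frequency-based successor model, and all names (`UsagePatternPrefetcher`, `record_access`, `prefetch`) are hypothetical, not taken from the paper.

```python
from collections import defaultdict, Counter

class UsagePatternPrefetcher:
    """Illustrative sketch of usage-pattern-based prefetching.

    The paper employs a neural network as the predictor; here a
    frequency model over fixed-length access contexts stands in for
    it, purely to demonstrate the predict-then-prefetch loop.
    """

    def __init__(self, context_len=2):
        self.context_len = context_len
        # Maps a tuple of the last `context_len` accesses to counts of
        # which block followed that context.
        self.model = defaultdict(Counter)
        self.history = []
        self.cache = set()  # stand-in for the block cache

    def record_access(self, block):
        """Observe an access and update the learned usage pattern."""
        self.history.append(block)
        if len(self.history) > self.context_len:
            ctx = tuple(self.history[-self.context_len - 1:-1])
            self.model[ctx][block] += 1

    def predict_next(self):
        """Predict the most likely next block given recent accesses."""
        ctx = tuple(self.history[-self.context_len:])
        counts = self.model.get(ctx)
        if counts:
            return counts.most_common(1)[0][0]
        return None

    def prefetch(self):
        """Bring the predicted block into the cache ahead of demand."""
        nxt = self.predict_next()
        if nxt is not None:
            self.cache.add(nxt)  # stand-in for an asynchronous disk read
        return nxt
```

A prefetch hit means the demanded block is already in the cache, hiding the mechanical seek latency that motivates the paper; a more capable predictor (such as the paper's neural network) raises the hit rate on irregular, user-specific patterns that a fixed-context model misses.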