Predictable Code and Data Paging for Real Time Systems

D. Hardy, I. Puaut
{"title":"实时系统的可预测代码和数据分页","authors":"D. Hardy, I. Puaut","doi":"10.1109/ECRTS.2008.16","DOIUrl":null,"url":null,"abstract":"There is a need for using virtual memory in real-time applications: using virtual addressing provides isolation between concurrent processes; in addition, paging allows the execution of applications whose size is larger than main memory capacity, which is useful in embedded systems where main memory is expensive and thus scarce. However, virtual memory is generally avoided when developing real-time and embedded applications due to predictability issues. In this paper we propose a predictable paging system in which the page loading and page eviction points are selected at compile-time. The contents of main memory is selected using an Integer Linear Programming (ILP) formulation. Our approach is applied to code, static data and stack regions of individual tasks. We show that the time required for selecting memory contents is reasonable for all applications including the largest ones, demonstrating the scalability of our approach. Experimental results compare our approach with a previous one, based on graph coloring. It shows that quality of page allocation is generally improved, with an average improvement of 30% over the previous approach. Another comparison with a state-of-the-art demand-paging system shows that predictability does not come at the price of performance loss.","PeriodicalId":176327,"journal":{"name":"2008 Euromicro Conference on Real-Time Systems","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"18","resultStr":"{\"title\":\"Predictable Code and Data Paging for Real Time Systems\",\"authors\":\"D. Hardy, I. Puaut\",\"doi\":\"10.1109/ECRTS.2008.16\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"There is a need for using virtual memory in real-time applications: using virtual addressing provides isolation between concurrent processes; in addition, paging allows the execution of applications whose size is larger than main memory capacity, which is useful in embedded systems where main memory is expensive and thus scarce. However, virtual memory is generally avoided when developing real-time and embedded applications due to predictability issues. In this paper we propose a predictable paging system in which the page loading and page eviction points are selected at compile-time. The contents of main memory is selected using an Integer Linear Programming (ILP) formulation. Our approach is applied to code, static data and stack regions of individual tasks. We show that the time required for selecting memory contents is reasonable for all applications including the largest ones, demonstrating the scalability of our approach. Experimental results compare our approach with a previous one, based on graph coloring. It shows that quality of page allocation is generally improved, with an average improvement of 30% over the previous approach. 
Another comparison with a state-of-the-art demand-paging system shows that predictability does not come at the price of performance loss.\",\"PeriodicalId\":176327,\"journal\":{\"name\":\"2008 Euromicro Conference on Real-Time Systems\",\"volume\":\"22 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"18\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2008 Euromicro Conference on Real-Time Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ECRTS.2008.16\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 Euromicro Conference on Real-Time Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ECRTS.2008.16","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 18

Abstract

There is a need for using virtual memory in real-time applications: virtual addressing provides isolation between concurrent processes, and paging allows the execution of applications whose size exceeds main memory capacity, which is useful in embedded systems where main memory is expensive and thus scarce. However, virtual memory is generally avoided when developing real-time and embedded applications due to predictability issues. In this paper we propose a predictable paging system in which the page loading and page eviction points are selected at compile-time. The contents of main memory are selected using an Integer Linear Programming (ILP) formulation. Our approach is applied to the code, static data and stack regions of individual tasks. We show that the time required for selecting memory contents is reasonable for all applications, including the largest ones, demonstrating the scalability of our approach. Experimental results compare our approach with a previous one based on graph coloring, and show that the quality of page allocation is generally improved, with an average improvement of 30% over the previous approach. Another comparison with a state-of-the-art demand-paging system shows that predictability does not come at the price of performance loss.
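
To make the ILP idea concrete, below is a minimal, heavily simplified sketch in Python using the PuLP modelling library. It is not the paper's actual formulation, which selects page loading and eviction points per program point; this static simplification only chooses which pages stay resident for the whole task, and all page names, sizes, access counts and costs are invented for illustration.

```python
# Hypothetical sketch: cast compile-time selection of memory contents as an ILP.
# NOT the authors' formulation; pages, sizes, access counts and costs are made up.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

# Assumed inputs: page id -> (size in frames, worst-case access count on the WCET path)
pages = {
    "code_0":   (1, 120),
    "code_1":   (1,  40),
    "data_lut": (2,  75),
    "stack_0":  (1, 200),
}
CAPACITY = 3      # frames of main memory reserved for this task (assumption)
MISS_COST = 1000  # assumed cycle cost of loading a non-resident page on access

prob = LpProblem("static_page_selection", LpMinimize)

# x[p] = 1 if page p is kept resident in main memory for the whole task
x = {p: LpVariable(f"resident_{p}", cat=LpBinary) for p in pages}

# Objective: minimise the worst-case penalty paid for pages left non-resident.
prob += lpSum(accesses * MISS_COST * (1 - x[p])
              for p, (_, accesses) in pages.items())

# Capacity constraint: resident pages must fit in the reserved frames.
prob += lpSum(size * x[p] for p, (size, _) in pages.items()) <= CAPACITY

prob.solve()
resident = [p for p in pages if value(x[p]) > 0.5]
print("pages kept resident:", resident)
```

The actual approach described in the abstract is finer-grained, deciding at which program points pages are loaded and evicted rather than fixing a single resident set; the sketch only illustrates how a capacity constraint and a worst-case cost objective combine into an ILP.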