Supervised learning based model for predicting variability-induced timing errors

Xun Jiao, Abbas Rahimi, Balakrishnan Narayanaswamy, H. Fatemi, J. P. D. Gyvez, Rajesh K. Gupta
{"title":"基于监督学习的可变性时序误差预测模型","authors":"Xun Jiao, Abbas Rahimi, Balakrishnan Narayanaswamy, H. Fatemi, J. P. D. Gyvez, Rajesh K. Gupta","doi":"10.1109/NEWCAS.2015.7182029","DOIUrl":null,"url":null,"abstract":"Circuit designers typically combat variations in hardware and workload by increasing conservative guardbanding that leads to operational inefficiency. Reducing this excessive guardband is highly desirable, but causes timing errors in synchronous circuits. We propose a methodology for supervised learning based models to predict timing errors at bit-level. We show that a logistic regression based model can effectively predict timing errors, for a given amount of guardband reduction. The proposed methodology enables a model-based rule method to reduce guardband subject to a required bit-level reliability specification. For predicting timing errors at bit-level, the proposed model generation automatically uses a binary classifier per output bit that captures the circuit path sensitization. We train and test our model on gate-level simulations with timing error information extracted from an ASIC flow that considers physical details of placed-and-routed single-precision pipelined floating-point units (FPUs) in 45nm TSMC technology. We further assess the robustness of our modeling methodology by considering various operating voltage and temperature corners. Our model predicts timing errors with an average accuracy of 95% for unseen input workload. This accuracy can be used to achieve a 0%-15% guardband reduction for FPUs, while satisfying the reliability specification for four error-tolerant applications.","PeriodicalId":404655,"journal":{"name":"2015 IEEE 13th International New Circuits and Systems Conference (NEWCAS)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Supervised learning based model for predicting variability-induced timing errors\",\"authors\":\"Xun Jiao, Abbas Rahimi, Balakrishnan Narayanaswamy, H. Fatemi, J. P. D. Gyvez, Rajesh K. Gupta\",\"doi\":\"10.1109/NEWCAS.2015.7182029\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Circuit designers typically combat variations in hardware and workload by increasing conservative guardbanding that leads to operational inefficiency. Reducing this excessive guardband is highly desirable, but causes timing errors in synchronous circuits. We propose a methodology for supervised learning based models to predict timing errors at bit-level. We show that a logistic regression based model can effectively predict timing errors, for a given amount of guardband reduction. The proposed methodology enables a model-based rule method to reduce guardband subject to a required bit-level reliability specification. For predicting timing errors at bit-level, the proposed model generation automatically uses a binary classifier per output bit that captures the circuit path sensitization. We train and test our model on gate-level simulations with timing error information extracted from an ASIC flow that considers physical details of placed-and-routed single-precision pipelined floating-point units (FPUs) in 45nm TSMC technology. We further assess the robustness of our modeling methodology by considering various operating voltage and temperature corners. Our model predicts timing errors with an average accuracy of 95% for unseen input workload. 
This accuracy can be used to achieve a 0%-15% guardband reduction for FPUs, while satisfying the reliability specification for four error-tolerant applications.\",\"PeriodicalId\":404655,\"journal\":{\"name\":\"2015 IEEE 13th International New Circuits and Systems Conference (NEWCAS)\",\"volume\":\"24 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-06-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 IEEE 13th International New Circuits and Systems Conference (NEWCAS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NEWCAS.2015.7182029\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE 13th International New Circuits and Systems Conference (NEWCAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NEWCAS.2015.7182029","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 15

Abstract

Circuit designers typically combat variations in hardware and workload by increasing conservative guardbanding that leads to operational inefficiency. Reducing this excessive guardband is highly desirable, but causes timing errors in synchronous circuits. We propose a methodology for supervised learning based models to predict timing errors at bit-level. We show that a logistic regression based model can effectively predict timing errors, for a given amount of guardband reduction. The proposed methodology enables a model-based rule method to reduce guardband subject to a required bit-level reliability specification. For predicting timing errors at bit-level, the proposed model generation automatically uses a binary classifier per output bit that captures the circuit path sensitization. We train and test our model on gate-level simulations with timing error information extracted from an ASIC flow that considers physical details of placed-and-routed single-precision pipelined floating-point units (FPUs) in 45nm TSMC technology. We further assess the robustness of our modeling methodology by considering various operating voltage and temperature corners. Our model predicts timing errors with an average accuracy of 95% for unseen input workload. This accuracy can be used to achieve a 0%-15% guardband reduction for FPUs, while satisfying the reliability specification for four error-tolerant applications.
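The abstract describes one binary classifier per FPU output bit, trained on gate-level simulation traces, plus a model-based rule that selects a guardband reduction subject to a bit-level reliability specification. The sketch below is a minimal illustration of that idea, not the authors' implementation: it uses scikit-learn's LogisticRegression, and the data layout, the reliability_spec array, and the function names (train_bit_models, choose_guardband_reduction) are hypothetical placeholders.

```python
# Illustrative sketch only: per-output-bit logistic regression timing-error
# predictors, plus a reliability-constrained guardband-reduction rule.
# Assumption: gate-level simulations provide (input_bits, error_bits) pairs
# for each guardband-reduction level; all names here are placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression


def train_bit_models(X, Y):
    """Train one binary classifier per output bit.

    X: (n_samples, n_input_bits) 0/1 input operand bits from simulation.
    Y: (n_samples, n_output_bits) 0/1 labels, 1 = timing error on that bit.
    """
    models = []
    for b in range(Y.shape[1]):
        y = Y[:, b]
        if y.min() == y.max():
            # Bit is always correct (or always erroneous) in the training
            # data: fall back to a constant predictor.
            models.append(float(y[0]))
        else:
            clf = LogisticRegression(max_iter=1000)
            clf.fit(X, y)
            models.append(clf)
    return models


def predicted_bit_error_rates(models, X):
    """Mean predicted error probability per output bit on workload X."""
    rates = np.zeros(len(models))
    for b, clf in enumerate(models):
        if isinstance(clf, float):
            rates[b] = clf
        else:
            rates[b] = clf.predict_proba(X)[:, 1].mean()
    return rates


def choose_guardband_reduction(models_per_level, X_workload, reliability_spec):
    """Model-based rule: pick the largest guardband reduction whose predicted
    per-bit error rates all stay within the application's reliability spec.

    models_per_level: dict {reduction_percent: per-bit models for that level}
    reliability_spec: array of tolerable error rates, one per output bit.
    """
    best = 0
    for reduction in sorted(models_per_level):
        rates = predicted_bit_error_rates(models_per_level[reduction], X_workload)
        if np.all(rates <= reliability_spec):
            best = reduction
    return best
```

In the paper, the features and error labels come from gate-level simulations of placed-and-routed 45nm FPUs at several voltage and temperature corners; here they are just generic 0/1 arrays standing in for that data.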