{"title":"随机计算","authors":"J. Sartori, Rakesh Kumar","doi":"10.1561/1000000021","DOIUrl":null,"url":null,"abstract":"As device sizes shrink, manufacturing challenges at the device level are resulting in increased variability in physical circuit characteristics. Exponentially increasing circuit density has not only brought about concerns in the reliable manufacturing of circuits but also has exaggerated variations in dynamic circuit behavior. The resulting uncertainty in performance, power, and reliability imposed by compounding static and dynamic nondeterminism threatens the continuation of Moore's law, which has been arguably the primary driving force behind technology and innovation for decades. This situation is exacerbated by emerging computing applications, which exert considerable power and performance pressure on processors. Paradoxically, the problem is not nondeterminism, per se, but rather the approaches that designers have used to deal with it. The traditional response to variability has been to enforce determinism on an increasingly nondeterministic substrate through guardbands. As variability in circuit behavior increases, achieving deterministic behavior becomes increasingly expensive, as performance and energy penalties must be paid to ensure that all devices work correctly under all possible conditions. As such, the benefits of technology scaling are vanishing, due to the overheads of dealing with hardware variations through traditional means. Clearly, status quo cannot continue. \n \nDespite the above trends, the contract between hardware and software has, for the most part, remained unchanged. Software expects flawless results from hardware under all possible operating conditions. This rigid contract leaves potential performance gains and energy savings on the table, sacrificing efficiency in the common case in exchange for guaranteed correctness in all cases. However, as the marginal benefits of technology scaling continue to languish, a new vision for computing has begun to emerge. Rather than hiding variations under expensive guardbands, designers have begun to relax traditional correctness constraints and deliberately expose hardware variability to higher levels of the compute stack, thus tapping into potentially significant performance and energy benefits and also opening the potential for errors. Rather than paying the increasing price of hiding the true, stochastic nature of hardware, emerging stochastic computing techniques account for the inevitable variability and exploit it to increase efficiency. Stochastic computing techniques have been proposed at nearly all levels of the computing stack, including stochastic design optimizations, architecture frameworks, compiler optimizations, application transformations, programming language support, and testing techniques. In this monograph, we review work in the area of stochastic computing and discuss the promise and challenges of the field.","PeriodicalId":42137,"journal":{"name":"Foundations and Trends in Electronic Design Automation","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2011-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Stochastic Computing\",\"authors\":\"J. 
Sartori, Rakesh Kumar\",\"doi\":\"10.1561/1000000021\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As device sizes shrink, manufacturing challenges at the device level are resulting in increased variability in physical circuit characteristics. Exponentially increasing circuit density has not only brought about concerns in the reliable manufacturing of circuits but also has exaggerated variations in dynamic circuit behavior. The resulting uncertainty in performance, power, and reliability imposed by compounding static and dynamic nondeterminism threatens the continuation of Moore's law, which has been arguably the primary driving force behind technology and innovation for decades. This situation is exacerbated by emerging computing applications, which exert considerable power and performance pressure on processors. Paradoxically, the problem is not nondeterminism, per se, but rather the approaches that designers have used to deal with it. The traditional response to variability has been to enforce determinism on an increasingly nondeterministic substrate through guardbands. As variability in circuit behavior increases, achieving deterministic behavior becomes increasingly expensive, as performance and energy penalties must be paid to ensure that all devices work correctly under all possible conditions. As such, the benefits of technology scaling are vanishing, due to the overheads of dealing with hardware variations through traditional means. Clearly, status quo cannot continue. \\n \\nDespite the above trends, the contract between hardware and software has, for the most part, remained unchanged. Software expects flawless results from hardware under all possible operating conditions. This rigid contract leaves potential performance gains and energy savings on the table, sacrificing efficiency in the common case in exchange for guaranteed correctness in all cases. However, as the marginal benefits of technology scaling continue to languish, a new vision for computing has begun to emerge. Rather than hiding variations under expensive guardbands, designers have begun to relax traditional correctness constraints and deliberately expose hardware variability to higher levels of the compute stack, thus tapping into potentially significant performance and energy benefits and also opening the potential for errors. Rather than paying the increasing price of hiding the true, stochastic nature of hardware, emerging stochastic computing techniques account for the inevitable variability and exploit it to increase efficiency. Stochastic computing techniques have been proposed at nearly all levels of the computing stack, including stochastic design optimizations, architecture frameworks, compiler optimizations, application transformations, programming language support, and testing techniques. 
In this monograph, we review work in the area of stochastic computing and discuss the promise and challenges of the field.\",\"PeriodicalId\":42137,\"journal\":{\"name\":\"Foundations and Trends in Electronic Design Automation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-12-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Foundations and Trends in Electronic Design Automation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1561/1000000021\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations and Trends in Electronic Design Automation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1561/1000000021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Computer Science","Score":null,"Total":0}
As device sizes shrink, manufacturing challenges at the device level are resulting in increased variability in physical circuit characteristics. Exponentially increasing circuit density has not only raised concerns about the reliable manufacturing of circuits but has also exacerbated variations in dynamic circuit behavior. The resulting uncertainty in performance, power, and reliability imposed by compounding static and dynamic nondeterminism threatens the continuation of Moore's law, which has arguably been the primary driving force behind technology and innovation for decades. This situation is worsened by emerging computing applications, which exert considerable power and performance pressure on processors. Paradoxically, the problem is not nondeterminism per se, but rather the approaches that designers have used to deal with it. The traditional response to variability has been to enforce determinism on an increasingly nondeterministic substrate through guardbands. As variability in circuit behavior increases, achieving deterministic behavior becomes increasingly expensive: performance and energy penalties must be paid to ensure that all devices work correctly under all possible conditions. As such, the benefits of technology scaling are vanishing due to the overheads of dealing with hardware variations through traditional means. Clearly, the status quo cannot continue.
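To make the cost of guardbanding concrete, here is a minimal back-of-the-envelope sketch in Python, assuming dynamic switching energy scales roughly as E ≈ C·V². The nominal voltage, guardband margin, and capacitance below are illustrative assumptions, not figures from the monograph.

```python
# Back-of-the-envelope sketch of the energy cost of voltage guardbanding.
# All numbers are illustrative assumptions: dynamic energy is modeled as
# E ~ C_eff * V_dd^2, and the guardband is the extra supply voltage added
# so that even the slowest device meets timing under worst-case conditions.

def dynamic_energy(v_dd, c_eff=1.0):
    """Dynamic switching energy, E = C_eff * V_dd^2 (normalized units)."""
    return c_eff * v_dd ** 2

v_nominal = 0.9     # volts: sufficient for the typical device, typical conditions
v_guardband = 0.15  # volts: margin covering worst-case process/voltage/temperature

e_common_case = dynamic_energy(v_nominal)
e_worst_case = dynamic_energy(v_nominal + v_guardband)
overhead = (e_worst_case - e_common_case) / e_common_case

print(f"energy overhead of guardbanding: {overhead:.1%}")
# -> roughly 36% more energy per operation, paid on *every* operation,
#    even though worst-case conditions occur only rarely.
```

The quadratic dependence of energy on supply voltage is why even a modest worst-case margin, applied unconditionally, translates into a large common-case energy penalty.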
Despite the above trends, the contract between hardware and software has, for the most part, remained unchanged. Software expects flawless results from hardware under all possible operating conditions. This rigid contract leaves potential performance gains and energy savings on the table, sacrificing efficiency in the common case in exchange for guaranteed correctness in all cases. However, as the marginal benefits of technology scaling continue to languish, a new vision for computing has begun to emerge. Rather than hiding variations under expensive guardbands, designers have begun to relax traditional correctness constraints and deliberately expose hardware variability to higher levels of the computing stack, tapping into potentially significant performance and energy benefits while also opening the door to errors. Rather than paying the increasing price of hiding the true, stochastic nature of hardware, emerging stochastic computing techniques account for the inevitable variability and exploit it to increase efficiency. Stochastic computing techniques have been proposed at nearly all levels of the computing stack, including stochastic design optimizations, architecture frameworks, compiler optimizations, application transformations, programming language support, and testing techniques. In this monograph, we review work in the area of stochastic computing and discuss the promise and challenges of the field.
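The tradeoff described above can be sketched with a simple expected-energy model: remove the guardband, accept a small probability of timing errors, and pay a recovery (replay) cost when they occur, in the spirit of Razor-style error detection. This is a hypothetical illustration; the error rate, recovery cost, and voltages are assumed values, not results from the monograph.

```python
# Hypothetical sketch of the "better-than-worst-case" tradeoff: run without
# the guardband, detect rare timing errors, and pay a recovery penalty only
# when they occur. All parameters are assumed values for illustration.

def expected_energy(e_op, p_error, recovery_cost):
    """Expected energy per committed operation with error detection/recovery."""
    # Each error forces a replay, so on average 1/(1 - p) attempts per commit,
    # plus the fixed recovery energy charged per error.
    attempts = 1.0 / (1.0 - p_error)
    return attempts * e_op + p_error * attempts * recovery_cost

e_guardbanded = 1.05 ** 2      # normalized: nominal + guardband voltage
e_relaxed     = 0.90 ** 2      # normalized: guardband removed
p_error       = 0.01           # assumed: 1% of operations miss timing
recovery      = 5 * e_relaxed  # assumed: a replay/flush costs ~5 operations

print(f"guardbanded:        {e_guardbanded:.3f}")
print(f"relaxed + recovery: {expected_energy(e_relaxed, p_error, recovery):.3f}")
# With rare errors and a modest recovery cost, the relaxed design wins;
# as p_error grows, recovery overhead erodes and eventually reverses the gain.
```

The point of the sketch is the crossover: exposing variability pays off exactly when errors are rare and cheap to recover from, which is the regime stochastic computing techniques target.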
Journal introduction:
Foundations and Trends® in Electronic Design Automation publishes survey and tutorial articles on the following topics:

- System Level Design
- Behavioral Synthesis
- Logic Design
- Verification
- Test
- Physical Design
- Circuit Level Design
- Reconfigurable Systems
- Analog Design

Each issue of Foundations and Trends® in Electronic Design Automation comprises a 50-100 page monograph written by research leaders in the field.