{"title":"Design-Technology Interface: What will replace design rules for DDSM?","authors":"M. Lavin","doi":"10.1109/ICICDT.2006.220819","DOIUrl":null,"url":null,"abstract":"Summary form only given. It is now well-accepted that CMOS technology has entered a new era where the rapid, steady quantitative scaling of performance, density, etc. appears to be slowing, and is subject to new impediments (noise, static and dynamic power) as well as qualitative, sometimes disruptive change in processes, materials, and devices. One of the main contributors to this slowing and complication is the increasing impact of variability. What I want to discuss in this paper is how variability also impacts the interface between chip design and technology development. Historically, this interface was represented by design rules and device/wire models that scaled smoothly over time; this was reflected by the fact that IBM's processor technologies from the frac12 micron node down to the 130nm node used a (mostly) stable set of scalable design rules and circuit models in which there was a single \"NRN\" dimension of variability. Going forward, it is clear that the models and the design tools that use them have to capture a more complete understanding of systematic and random variability, and conventional design rules have to replaced by other means for representing to designers what the new technologies are (and are not) capable of. I spent most of my talk describing some potential replacements for conventional design rules","PeriodicalId":447050,"journal":{"name":"2006 IEEE International Conference on IC Design and Technology","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE International Conference on IC Design and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICICDT.2006.220819","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Summary form only given. It is now well accepted that CMOS technology has entered a new era in which the rapid, steady quantitative scaling of performance, density, etc. appears to be slowing, and is subject to new impediments (noise, static and dynamic power) as well as qualitative, sometimes disruptive changes in processes, materials, and devices. One of the main contributors to this slowing and complication is the increasing impact of variability. What I want to discuss in this paper is how variability also impacts the interface between chip design and technology development. Historically, this interface was represented by design rules and device/wire models that scaled smoothly over time; this was reflected by the fact that IBM's processor technologies from the ½-micron node down to the 130nm node used a (mostly) stable set of scalable design rules and circuit models in which there was a single "NRN" dimension of variability. Going forward, it is clear that the models and the design tools that use them have to capture a more complete understanding of systematic and random variability, and that conventional design rules have to be replaced by other means of representing to designers what the new technologies are (and are not) capable of. I spend most of my talk describing some potential replacements for conventional design rules.