Abstract
Summary form only given. It is now well accepted that CMOS technology has entered a new era in which the rapid, steady quantitative scaling of performance, density, etc. appears to be slowing, and is subject to new impediments (noise, static and dynamic power) as well as qualitative, sometimes disruptive changes in processes, materials, and devices. One of the main contributors to this slowing and complication is the increasing impact of variability. What I want to discuss in this paper is how variability also affects the interface between chip design and technology development. Historically, this interface was represented by design rules and device/wire models that scaled smoothly over time; this was reflected in the fact that IBM's processor technologies from the ½-micron node down to the 130 nm node used a (mostly) stable set of scalable design rules and circuit models with a single NRN dimension of variability. Going forward, it is clear that the models, and the design tools that use them, must capture a more complete understanding of systematic and random variability, and that conventional design rules must be replaced by other means of representing to designers what the new technologies are (and are not) capable of. I spent most of my talk describing some potential replacements for conventional design rules.
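The sketch below is not from the talk; it is a minimal illustration, under assumed numbers, of the distinction the abstract draws between a single-corner view of variability and a model that separates systematic (e.g., die-to-die) from random (within-die) components. The parameter (threshold voltage) and all sigma values are hypothetical.

```python
# Illustrative sketch only -- assumed values, not data from the talk.
# Models a device parameter (Vt) as nominal + systematic (per-die) shift
# + random (per-device) term, versus a single worst-case corner.
import numpy as np

rng = np.random.default_rng(0)

VT_NOMINAL = 0.30         # volts; assumed nominal threshold voltage
SIGMA_SYSTEMATIC = 0.015  # volts; assumed die-to-die (systematic) spread
SIGMA_RANDOM = 0.020      # volts; assumed within-die (random) spread

def sample_vt(n_dies: int, devices_per_die: int) -> np.ndarray:
    """Monte Carlo sample of Vt: one systematic shift per die,
    plus an independent random term for each device on that die."""
    systematic = rng.normal(0.0, SIGMA_SYSTEMATIC, size=(n_dies, 1))
    random_term = rng.normal(0.0, SIGMA_RANDOM, size=(n_dies, devices_per_die))
    return VT_NOMINAL + systematic + random_term

vt = sample_vt(n_dies=100, devices_per_die=1000)
print(f"mean Vt = {vt.mean():.3f} V, total sigma = {vt.std():.3f} V")

# A single-corner methodology would instead use one fixed worst-case value,
# e.g. VT_NOMINAL + 3 * np.sqrt(SIGMA_SYSTEMATIC**2 + SIGMA_RANDOM**2),
# discarding the distinction between systematic and random contributions.
```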