Interview: Dr. Bruce McGaughy Discusses Design For Yield

May 9, 2014
Technology Editor Bill Wong talks with ProPlus Design Solutions’ Dr. Bruce McGaughy about design for yield (DFY) and the company’s technology.

Yields typically improve over time but low yields are always an issue as designs move to finer geometries. Design for yield (DFY) helps improve this situation.

ProPlus Design Solutions is one company with tools that address DFY. I spoke with Dr. Bruce McGaughy, Chief Technology Officer and Senior Vice President of Engineering, about DFY trends and ProPlus’ DFY tools.

Wong: Please describe design for yield and why DFY tools are important now. Why do they need to be integrated?

McGaughy: The complexities of managing process variations (or DFY) at advanced technology nodes –– 28nm bulk CMOS and 16- and 14nm FinFET, for example –– can adversely affect chip yield. That means the variations need to be understood physically and modeled accurately in SPICE models.

These smaller nodes are forcing foundries to better characterize and model process variations, and pushing designers to use better DFY tools and methodologies to deal with increasing device and design complexities (Fig. 1).

Figure 1. There are a number of design and manufacturing challenges faced by foundries, fabless design houses and EDA vendors.

Circuit designers are concerned about increasing giga-scale circuit sizes. As CMOS technology has scaled down to the nanometer regime, DFY has become mandatory, compelling designers to re-evaluate how they design and verify their chips. Most are concerned about the challenges of good design –– modeling small transistors, putting multi-billion nano-scale transistors together and making the result functional.

Process variations create difficulties when accurately modeling nano-scale transistors because they create multi-dimensional uncertainties on device characteristics. Moving to 16- and 14-nm nodes, 3D FinFET structure adds in more modeling challenges due to its new structure and complicated parasitics. Circuit designers are required to understand the coverage, usage and limitations of foundry SPICE models. They also need to find a way to put a large number of elements together.

Using nano-scale elements to design giga-scale circuits presents its own challenges, mainly due to variability, a DFY issue. Packing a large number of extremely small elements –– nanometer-sized transistors –– tightly together is a variability nightmare because every variation can change the function, performance or yield of the whole product. This challenge grows as technology advances.

Such variation can be accounted for in the design phase. The questions are how to accurately model small variations, how to efficiently simulate a large circuit with variation on each small element, and, given those modeling and simulation capabilities, how to improve designs to achieve optimal performance and yield.
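The idea of simulating a design with per-device variation and counting passing samples is commonly illustrated with Monte Carlo analysis. The sketch below is a toy model under stated assumptions: the threshold-voltage statistics, the one-over-drive delay expression and the delay spec are all illustrative inventions, not ProPlus’ methodology or any foundry’s data.

```python
import random

def mc_yield(n_samples=10000, vt_nominal=0.35, vt_sigma=0.02, delay_spec=1.1):
    """Estimate yield by Monte Carlo sampling of threshold-voltage variation.

    Toy model: gate delay is taken to scale with 1/(Vdd - Vt); a sample
    passes when its delay, normalized to nominal, stays within the spec.
    All numbers are illustrative assumptions.
    """
    vdd = 1.0
    nominal_delay = 1.0 / (vdd - vt_nominal)
    passed = 0
    for _ in range(n_samples):
        vt = random.gauss(vt_nominal, vt_sigma)  # per-device random variation
        delay = 1.0 / (vdd - vt)
        if delay / nominal_delay <= delay_spec:
            passed += 1
    return passed / n_samples

print(f"Estimated yield: {mc_yield():.3f}")
```

Real DFY flows replace the toy delay expression with full SPICE runs over foundry-characterized variation models, which is why simulator capacity and speed dominate the discussion below.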

Wong: At what stage in the design flow do DFY tools become critical? What are design teams who don’t have DFY tools doing?

McGaughy: Designers need highly efficient and reliable yield analysis tools to handle different applications, with required yields ranging from 3σ to very high sigma (7σ) for FinFET designs. Otherwise, designers end up respinning or re-designing the chip or, worse yet, facing a low-yield wafer.
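To see why high-sigma requirements strain brute-force methods, consider the Gaussian tail probabilities involved. The short illustration below assumes a standard normal distribution of the performance metric and plain Monte Carlo sampling; dedicated high-sigma tools use smarter sampling precisely because these counts are impractical.

```python
from statistics import NormalDist

# Failure probability for a one-sided spec at k-sigma, and a rough count of
# plain Monte Carlo samples needed to observe ~10 failures (a common rule of
# thumb for a stable estimate; the "10" is an illustrative assumption).
for k in (3, 4, 5, 6, 7):
    p_fail = NormalDist().cdf(-k)   # tail probability beyond k sigma
    samples = 10 / p_fail
    print(f"{k} sigma: P(fail) = {p_fail:.2e}, samples needed = {samples:.1e}")
```

At 7σ the failure probability is on the order of 1e-12, so even billions of plain Monte Carlo simulations would rarely sample a single failure.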

Post-layout simulation is now commonplace for circuit verification, driving the need for giga-scale SPICE simulators. Accurate, integrated yield analysis and optimization DFY solutions can meet this growing need.

Design teams without a comprehensive and integrated DFY solution typically have a set of cobbled together software tools that are serviceable, but not efficient and may not be accurate.

Wong: What’s the importance of modeling tools? What about yield prediction software?

McGaughy: Yield prediction software was not that critical in the past because designers had large margins and process variation effects were not significant. As the industry moved to smaller nodes, process variation increased, its impact on chip yield became severe, and fabrication costs rose. Designers no longer have the luxury of taking risks. Yield prediction software is the only way designers can understand how their designs will behave on the target process platform. If simulation shows the chip is likely to fail or yield poorly, they can improve the design instead of risking production and wasting millions of dollars.

Several ProPlus engineers were part of a research team at the University of California, Berkeley, 20 years ago that invented the physics-based, accurate, scalable, robust and predictive MOSFET SPICE model called BSIM3. The availability of standardized compact models contributed to the success of the foundry-fabless business model. The BSIM3 model was used initially for circuit simulation and CMOS technology development. Since then, most ICs designed have used BSIM3 and other BSIM family models.

Wong: You mention giga-scale simulation. Please explain to Electronic Design readers what giga-scale simulation is.

McGaughy: A giga-scale circuit simulator lets designers simulate complex designs, such as memory blocks with 100 million or more elements or post-layout analog circuits, without sacrificing accuracy. Giga-scale simulators are high-capacity, high-performance parallel SPICE simulators able to deliver a 10-100X or greater speedup over traditional SPICE through core-engine and parallelization performance.

Traditionally, designers had to make trade-offs between accuracy and performance/capacity in their SPICE simulations. FastSPICE simulators boost speed and capacity but forgo accuracy. A giga-scale circuit simulator pushes pure SPICE accuracy into the FastSPICE application space, handling design sizes that other parallel SPICE simulators cannot.

Large designs and process complexity require more post-layout simulation, driving the need for giga-scale simulation. Tighter design margins at leading-edge process nodes mean high-accuracy simulation is important, because simulation error cannot be allowed to exceed the design margin.

Wong: Why is FinFET technology becoming so important?

McGaughy: Pushing planar devices into vertical structures has helped overcome fundamental device physics limitations. Three-dimensional FinFET transistors will enable the continuation of Moore’s Law with the potential to deliver greater levels of scalability.

Wong: What trends are you seeing?

McGaughy: The most obvious is the move to FinFET. The achievement of 16nm readiness across the design flow is a significant accomplishment, though experienced IC design teams recognize the potential and challenges that come with 3D FinFET technology. Many had the foresight to immediately adopt advanced DFY methodologies and tools.

The second trend is the widespread adoption of giga-scale simulation, replacing traditional SPICE simulators. After all, SPICE is 40 years old and showing its age. Circuit designers are moving to giga-scale simulation because of complexity, the increased simulation needed for post-layout designs and the large number of simulations required to design for variation effects.

About the Author

Bruce McGaughy | Senior Vice President of Engineering

Bruce is both Chief Technology Officer and Senior Vice President of Engineering, and passionate about linking silicon to design, creating a category of tools called design for yield (DFY). A ProPlus employee since 2008, he’s been responsible for rolling out a series of DFY products to offer a complete and integrated platform. They include modeling software, DFY tools to predict yield and better understand process variations, and a circuit simulator called NanoSpice.

Dr. McGaughy received the Bachelor of Science degree in Electrical Engineering from the University of Illinois at Urbana/Champaign and Master of Science and Ph.D. degrees in Electrical Engineering and Computer Science from the University of California at Berkeley, in 1994, 1995 and 1997, respectively. He has conducted and published research in the fields of circuit simulation, device physics, reliability, electronic design automation, computer architecture and fault-tolerant computing. Prior to his current assignment, he worked for Integrated Device Technology (IDT), Siemens, Intel, Berkeley Technology Associates, and Celestry. In 2003, Dr. McGaughy was the group director in charge of circuit simulation R&D at Cadence, including Spectre, SpectreRF and UltraSim. In 2006, Dr. McGaughy became the distinguished engineer and chief architect for Cadence’s simulation products, including Spectre, SpectreRF, UltraSim and AMS Designer. In 2008, he joined ProPlus Design Solutions as the senior vice president of Engineering and chief technology officer. He is in charge of all of ProPlus’ R&D efforts, including the BsimProPlus model extraction platform, the NanoSpice parallel SPICE simulator, and the NanoYield DFY platform.
