Until recently, statistical static timing analysis (SSTA) had been the darling of EDA-centric technical conferences and symposia for several years. At those conferences, SSTA was touted as "The Answer" to the most vexing problem facing the semiconductor industry: the inability of traditional corner-based signoff methodologies to keep up with the effects of process variability on yields and performance.
But, alas, the hype surrounding SSTA is beginning to die down. There's little room left for innovation in the fundamental algorithms for statistical analysis, so the technical conferences are moving on to other emerging technology topics. However, as 65-nm processes become the high-end mainstream and 45-nm lines are being shaken out, the questions surrounding process variation loom larger than ever.
Good old-fashioned static timing analysis (STA) based on process corners remains the bedrock methodology for design teams signing off on 65-nm projects. While no one is abandoning STA just yet, designers are starting to realize that SSTA can be a valuable addition to their signoff flow. And although the basic technology of SSTA is more or less stable, infrastructure support is shaping up.
So who's using SSTA, and how? What's going on with statistical models and cell characterization? Should design teams look at it for their next-generation designs? Consider this a follow-up to an earlier report on SSTA, in which readers can find more basic information (see "Timing Analysis Rounds The Corner To Statistics" at www.electronicdesign.com, ED Online 11664).
A use model emerges
SSTA's path from conference topic to practical technology has been slower than anticipated, perhaps painfully slow for the EDA vendors who have invested in it.
"SSTA has been typical of many emerging EDA technologies over the years," says Tom Ferry, vice president of business development for signoff products at Magma Design Automation. "At first, there's lots of hype. Then people look more closely and realize it's more difficult than they thought."
For Synopsys, which released its PrimeTime VX and Star-RCXT VX statistical signoff tools at DAC in 2006, adoption of statistical timing signoff has reflected this pattern.
"It's a matter of trust," says Robert Hoogenstryd, director of marketing in Synopsys' Implementation Group. "People aren't abandoning their traditional signoff flows. But people are beginning to invest in SSTA at 65 nm."
Further, says Hoogenstryd, many would-be adopters are still on the learning curve. "Some companies are using the 65-nm node as an experimental or test platform for SSTA and are really targeting deployment for production at 45 nm," he says.
The emerging model uses SSTA alongside corner-based STA. According to both Magma Design Automation and Synopsys, SSTA customers continue to sign off on their designs with traditional process corners and derating. They'll use SSTA in parallel with that flow to build confidence in it and get a feel for how statistical results correlate.
Given that few, if any, design teams will drop their existing signoff flows, SSTA will creep into design methodologies incrementally.
"Instead of going from plain-vanilla STA plus signal integrity to full-blown SSTA, you can take some incremental steps to minimize the disruption in your flow and adopt some value," says Magma's Ferry.
An incremental and limited adoption of SSTA can enable designers to reduce the pessimism imposed by traditional static-timing flows. In such flows, variation is represented in terms of process, voltage, and temperature (PVT) and gets lumped into a global parameter known as on-chip variation (OCV).
OCV, essentially a global pessimism factor, is used to add margin. Designers know that the temperature will vary from area to area on the chip, as will the power-supply voltage and the transistor channel length.
What OCV really amounts to is a blind approach to overdesign. Savvy designers find that by adding SSTA to their flow, they can reduce their margin assumptions and not leave as much performance on the table. "SSTA lets you do that by more intelligently dealing with variation," says Ferry. Instead of assuming some arbitrary value for margin, say 20%, a post-STA statistical run might reveal that it's safe to go with a 15% OCV value.
"The end result is that you get 5% better timing without changing your design," adds Ferry. Running SSTA in addition to STA provides more design-specific information about the actual margin.
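The margin arithmetic Ferry describes is simple to sketch. In the toy example below, the clock period, path delay, and derate values are all hypothetical, not drawn from any real flow:

```python
# Illustrative OCV derating arithmetic; clock period, path delay, and
# derate values are hypothetical, not from any real design.
def derated_slack(clock_period_ns, path_delay_ns, ocv_derate):
    """Setup slack after inflating the data path by a global OCV derate."""
    return clock_period_ns - path_delay_ns * (1.0 + ocv_derate)

period = 2.0         # 500-MHz clock
path_delay = 1.6     # longest path at the nominal corner

slack_blind = derated_slack(period, path_delay, 0.20)  # arbitrary 20% margin
slack_ssta = derated_slack(period, path_delay, 0.15)   # SSTA-justified 15%

# Trimming the derate from 20% to 15% returns 5% of the path delay
# (80 ps here) to the designer without touching the netlist.
recovered_ns = slack_ssta - slack_blind
```

The design itself never changes; only the pessimism applied to it does.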
Opinions differ on when SSTA might begin to edge its way more concretely into signoff flows. "For one thing, the corner signoff capabilities today are still pretty good, even down to most designs at 65 nm," says Tom Quan, director of EDA and design services marketing at TSMC. "For designs pushing the leading edge of 65 nm, teams start looking to statistical to augment their corner methodology."
Obviously, integrated device manufacturers (IDMs) such as Intel and IBM are further along in making full use of SSTA. For IDMs, according to Mustafa Celik, CEO of Extreme Design Automation, the most important application of SSTA is to look at their progress in design robustness against variation. "That doesn't require statistical signoff. You still sign off with corners," says Celik. "It's a parallel check with SSTA tools."
Besides the need to build confidence in SSTA, other factors have kept it from reaching the mainstream.
Statistical timing analysis requires statistical models, which have been sluggish in coming (see "Requirements For Successful SSTA Characterization"). Again, IDMs, with their unfettered access to real manufacturing data, have the upper hand in their ability to create statistical libraries. But foundries have begun supplying statistical Spice models to customers with an eye toward enabling statistical signoff.
"We see providing statistical models as getting more accurate data to designers," says TSMC's Quan. "Starting at the 65-nm node, we have been providing statistical Spice models with Gaussian distributions."
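Operationally, a Gaussian statistical model means each process parameter is drawn from a normal distribution rather than pinned at a worst-case corner. A minimal Monte Carlo sketch, with invented parameter names and sigma values (real foundry models define these per device):

```python
import random

# Hypothetical Gaussian process parameters: name -> (nominal, sigma).
# These names and values are invented for illustration.
PARAMS = {"vth_mv": (300.0, 15.0), "leff_nm": (65.0, 2.0)}

def sample_device(rng):
    """Draw one statistical sample: each parameter from its Gaussian."""
    return {name: rng.gauss(mu, sigma) for name, (mu, sigma) in PARAMS.items()}

rng = random.Random(42)
samples = [sample_device(rng) for _ in range(10_000)]

# A corner flow evaluates only mu +/- 3*sigma; a statistical flow sees
# how the whole population is distributed between those extremes.
mean_vth = sum(s["vth_mv"] for s in samples) / len(samples)
```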
UMC, another pure-play foundry, is also providing statistical models. According to Patrick T. Lin, UMC's chief SoC architect in the foundry's System and Architecture Support group, statistical "views" of UMC's cell libraries are available to select customers/partners for collaborative usage.
Some in the industry feel that the models foundries supply offer little visibility into the source and nature of variability.
"TSMC's models are well done, but they tell you absolutely nothing," says Isadore Katz, president and CEO of CLK Design Automation. "But even with a black-box variation setting, I think people doing ASICs in a worst-case design methodology will make better decisions."
UMC's Lin, like TSMC's Quan, reports that UMC's statistical libraries are not "completely black-box models." Rather, he says, users will be able to discern what parameters are used to characterize the variability in UMC's models.
Above and beyond the statistical models for devices, designers would be wise to consider more elaborate modeling of interconnects rather than the traditional RC extraction. TSMC's models provide facilities for the extraction tools to handle the interconnect.
"If you look at the 2D version of the interconnect, it has a lot of variation due to factors such as lithography changes in the width of the lines. But in 3D, it's the thickness of the line, which might change due to chemical-mechanical polishing (CMP)," says Quan. "The extraction tools from the EDA side need to extract the interconnect with sensitivities and statistical distribution. Combining that with the SSTA gives a true picture."
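Quan's point about the 3D view follows directly from wire geometry: resistance scales inversely with the cross-section W × t, so lithography-driven width variation and CMP-driven thickness variation compound. A toy sketch, with invented geometry and variation percentages:

```python
# Toy interconnect resistance: R = rho * L / (W * t). Geometry and
# variation percentages below are invented for illustration.
def wire_resistance(rho, length, width, thickness):
    return rho * length / (width * thickness)

RHO, LEN = 0.02, 100.0        # ohm-um resistivity, 100-um wire
R_nom = wire_resistance(RHO, LEN, 0.10, 0.20)

R_litho = wire_resistance(RHO, LEN, 0.09, 0.20)  # litho narrows W by 10%
R_cmp = wire_resistance(RHO, LEN, 0.10, 0.18)    # CMP thins t by 10%
R_both = wire_resistance(RHO, LEN, 0.09, 0.18)   # the effects compound

# A 2D (width-only) extraction sees only R_litho; the 3D view also
# captures the CMP thickness term, which is just as large here.
```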
Statistical library characterization had been a hole in the SSTA ecosystem, but it's been filled, at least in part, by the emergence of Altos Design Automation. Magma also provides statistical characterization within its flow.
Altos' Variety cell-characterization tool calculates nonlinear sensitivities for cells that account for both systematic and random variation in any set of correlated or non-correlated process parameters. The resulting libraries can be used to model both local (within cell and within die) variations and global die-to-die variations.
For random intra-cell variation, Variety uses a pre-analysis step that Altos calls the "inside view" (Fig. 1). "Typically, characterization tools provide a black-box .lib file," says Jim McCanny, Altos' CEO.
"We try to understand what we're characterizing and to optimize for simulation," says McCanny. "For certain types of data, you don't need to model everything because under certain conditions, some things won't be active. So we use predictive methods to come up with data that lets us reduce the runtime significantly."
Altos claims correlation to within a few percentage points of Monte Carlo Spice characterization.
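Sensitivity-based characterization can be sketched in miniature: compute first-order sensitivities of a delay function at the nominal point, then compare the linear model against Monte Carlo evaluations of the full function. Everything below is illustrative, including the invented nonlinear "Spice" delay; it is not Altos' actual method:

```python
import random

# Invented nonlinear "Spice" delay (ps) vs. two normalized process shifts.
def spice_delay(dvth, dleff):
    return 100.0 * (1.0 + 0.08 * dvth + 0.05 * dleff + 0.01 * dvth * dleff)

# Finite-difference sensitivities at the nominal point.
h = 1e-4
d0 = spice_delay(0.0, 0.0)
s_vth = (spice_delay(h, 0.0) - d0) / h
s_leff = (spice_delay(0.0, h) - d0) / h

def linear_delay(dvth, dleff):
    """First-order sensitivity model, as a statistical library would store it."""
    return d0 + s_vth * dvth + s_leff * dleff

# Monte Carlo comparison of the linear model against the full function.
rng = random.Random(1)
errs = []
for _ in range(5_000):
    dvth, dleff = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    full = spice_delay(dvth, dleff)
    errs.append(abs(linear_delay(dvth, dleff) - full) / full)

mean_err_pct = 100.0 * sum(errs) / len(errs)
```

In this toy case the only error source is the cross term the linear model drops, so the average miss stays well under a few percent.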
Extending statistical analysis
Statistical analysis has, to date, been used largely in the timing domain. There are potentially broader uses for statistical techniques, particularly in the signal-integrity (SI) realm.
The use model described earlier for SSTA, in which users of a classic corner-based signoff flow might use statistical analysis to shave margin, might be best employed by ASIC designers of 200- to 500-MHz digital parts fabricated on processes capable of well over 1 GHz. For such designers, who might only sign off using one or two corners, such a methodology is state of the art. At the other extreme is the bleeding edge, where designers look to extract every bit of speed from their fab.
"At the high end, the flow is mix and match," says CLK's Katz. "These designers can't treat the manufacturing process as one grey lump. There's a reason why people here look at many, many corners."
For these high-end designers, says Katz, the number-one source of chip failure isn't process variation but rather signal integrity. "SSTA does not work particularly well with SI," explains Katz. "This is a first-generation (SSTA) issue. For the high-end guys, if it doesn't have the statistical accuracy for SI, they're hosed."
Synopsys claims to have a flow that accounts for SI factors. "For many designers, signoff means crosstalk delay as well as timing," says Hoogenstryd of Synopsys. To that end, Synopsys built a signal-integrity capability into its PrimeTime and PrimeTime SI tools (Fig. 2).
"We have a couple of different approaches," says Hoogenstryd. "Users want to do a full statistical SI analysis in which the crosstalk delay portion has Gaussian distributions. While this isn't a challenge from an algorithmic standpoint, validation is almost impossible. To validate one lead aggressor on a net would take about a million Spice runs. We know our approach is safe, slightly pessimistic, but very safe."
Nanno Solutions' DFM-Aware sSTA flow is another option for variation-aware analysis (Fig. 3). This environment can improve accuracy and efficiency for designs at 90 nm and below. It can also be added to existing design flows and help reduce the number of engineering change orders (ECOs) during the pre-layout stage, narrowing the gap between the front and back ends of the design cycle by using the same models for both.
Going beyond analysis
Another strand in the discussion involves how statistical techniques might be extended beyond timing concerns to encompass optimization, particularly for power (see "Optimize For Power Before RTL Synthesis To Ease Timing Closure").
"It's one thing to find a problem, but can you fix it?" asks Dave Desharnais, product group director for Cadence Design Systems' Encounter line. "Leakage is a bigger issue than timing alone." One technique for dealing with leakage involves adding a placement swapping optimization step, in which high- and low-VT cells are substituted for each other as needed.
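The Vt-swapping step Desharnais mentions can be sketched as a simple slack-driven pass: cells with timing margin take the high-Vt (slower, lower-leakage) variant, while critical cells keep low-Vt. All cell names, slacks, and leakage numbers below are invented:

```python
# Toy leakage-recovery pass in the spirit of Vt swapping. All cell
# names, slacks, and leakage numbers are invented for illustration.
CELLS = [
    # (instance, slack_ns, hvt_delay_penalty_ns, hvt_leakage_saving_nw)
    ("u1", 0.30, 0.05, 12.0),
    ("u2", 0.02, 0.05, 12.0),   # too critical: swapping would violate timing
    ("u3", 0.50, 0.05, 12.0),
]

def vt_swap(cells, guard_band_ns=0.05):
    """Swap to high-Vt wherever the slower variant still meets timing."""
    swapped, saved_nw = [], 0.0
    for name, slack, penalty, saving in cells:
        if slack - penalty >= guard_band_ns:
            swapped.append(name)
            saved_nw += saving
    return swapped, saved_nw
```

Here only u1 and u3 have enough slack to absorb the high-Vt delay penalty; u2 stays low-Vt.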
The Silicon Integration Initiative's (Si2's) Open Modeling Coalition has been working on statistical extensions to the effective current-source modeling (ECSM) format. Soon to be released as an Si2 standard, the extensions build on the ECSM format for modeling timing, noise, and power.