Electronic Design

45 nm Is On The Way—Can EDA Methodologies Cope?

At 65 nm and below, approximations are out; detailed physical models are in.

With 65-nm IC processes up and running, designers are nervously eyeing the 45-nm node. The physical design challenges at 45 nm are dramatically greater than they are at 65 nm. Gate oxides are only a few atoms thick, and a variance of just a few atoms can mean a 20% difference in leakage current.
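To see why a few atoms matter so much, it helps to remember that gate-tunneling leakage falls off roughly exponentially with oxide thickness. The toy model below is purely illustrative: the sensitivity constant is a hypothetical value chosen so that a two-layer thickness variance produces the 20% leakage swing quoted above, not a parameter from any real process.

```python
import math

# Hypothetical exponential sensitivity, calibrated so that a variance of
# two atomic layers (~0.5 nm) changes leakage by ~20%, matching the
# figure cited in the article. Real sensitivities depend on the
# tunneling physics of the specific process.
ATOMIC_LAYER_NM = 0.25                      # ~one atomic layer of SiO2
ALPHA_PER_NM = math.log(1.2) / (2 * ATOMIC_LAYER_NM)

def relative_leakage(layers_thinner):
    """Leakage relative to nominal when the gate oxide is thinner by
    `layers_thinner` atomic layers (toy exponential model)."""
    return math.exp(ALPHA_PER_NM * ATOMIC_LAYER_NM * layers_thinner)

print(f"2 layers thinner -> {relative_leakage(2):.0%} of nominal leakage")
```

The point of the exponential form is that the same two-layer variance that is a rounding error at thicker oxides becomes a first-order effect at 45 nm, which is why tools need real physical models rather than margins.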

To respond to the growing predominance of leakage power and interconnect delays, the next generation of tools must deliver as much hard physical data as possible into the front end of the design process. The uncertainty of approximations must yield to solid physical data based on detailed models.

Meanwhile, in the front end, the verification crisis seems to worsen in perpetuity. Complexity breeds bugs, and verification flows must find them faster and sooner (see the figure).

Some of these changes are already under way. A harbinger arrived late in 2005 with the launch of an open current-source model. Synopsys' Composite Current Source modeling technology is an example of where verification must head to account for physical effects. Designers will rally to this standard, and other EDA vendors will be forced to support it.

The shifting emphasis in EDA tools and methodologies toward physical phenomena will ripple across the design flow. Already, the design-for-manufacturing (DFM) parade has stepped off, with emerging tools pushing true DFM to fruition.

In 2006, foundries will build their partnerships with EDA vendors as designers clamor for more foundry data on which to base their physical models. Watch for encryption standards to safeguard that data, coupled with tools and flows that abstract away nitty-gritty process details. That's okay by designers, who will want such details presented by the tools in terms of their effects on timing and signal integrity.

The design flow is shifting to a statistical paradigm. It's starting with statistical static timing analysis, which gives users a much better handle on how variations in process parameters will affect yields across the process window. Some vendors already refer to "variation-aware" tools and methodologies.
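The core idea behind statistical timing can be sketched with a Monte Carlo toy model: instead of assigning each gate a single worst-case delay, each gate's delay is drawn from a distribution over process variation, and timing yield is the fraction of samples that meet the clock constraint. All numbers below are invented for illustration; real statistical static timing analysis propagates delay distributions analytically and models correlated variation, which this sketch omits.

```python
import random
import statistics

random.seed(0)

NOMINAL_GATE_DELAY_PS = 20.0   # assumed nominal delay per gate
SIGMA_PS = 2.0                 # assumed 1-sigma process variation
PATH_LENGTH = 10               # gates on a hypothetical critical path
CLOCK_PERIOD_PS = 220.0        # timing constraint for the path

def sample_path_delay():
    """One Monte Carlo sample: sum of independently varying gate delays.
    (Real SSTA also handles correlated, die-to-die variation.)"""
    return sum(random.gauss(NOMINAL_GATE_DELAY_PS, SIGMA_PS)
               for _ in range(PATH_LENGTH))

samples = [sample_path_delay() for _ in range(10_000)]
timing_yield = sum(d <= CLOCK_PERIOD_PS for d in samples) / len(samples)

print(f"mean path delay: {statistics.mean(samples):.1f} ps")
print(f"estimated timing yield: {timing_yield:.1%}")
```

A single worst-case corner would flag this path only as pass or fail; the statistical view instead tells the designer what fraction of parts across the process window will make timing, which is exactly the "handle on yields" the paragraph above describes.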

Timing analysis is just the beginning, though. Statistical methods will eventually become pervasive, reaching out into parasitic extraction and library characterization.

Statistical approaches will creep into many aspects of design very soon. As they do, designers will gain far more insight into the vagaries of process technology. The trick, again, is for EDA flows to render statistics transparent.

In the front end of the design cycle, watch for emerging flows in which chip, package, and board design are executed concurrently. A more holistic approach can avoid iteration.

Test is being blended into the front end, with a smooth two-way flow of information between verification engines, virtual test equipment, and high-level models. The result will be shortened design cycles.

Additionally, the vast complexity of 65-nm SoC designs will fuel interest in electronic-system-level (ESL) methodologies. Tools and technologies that form a seamless flow from the algorithmic level through RTL will gain momentum. Meanwhile, designers will turn increasingly to SystemVerilog-based flows because of their ability to handle advanced functional-verification constructs.
