Electronic Design

Down Come The Walls Between Software And Hardware Design

It's becoming increasingly obvious: As 2007 begins, the prevailing paradigm in design is that software is king. Sure, state-of-the-art hardware is a must. But with market windows shrinking, there's no longer wiggle room to wait around for hardware prototypes to check out new firmware and drivers.

We're entering the era of platform design, in which hardware modeling must begin at levels of abstraction higher than the register-transfer level (RTL). There are strong signs that transaction-level modeling (TLM) is taking off.

Furthermore, the emergence of new standards from industry bodies such as the Open SystemC Initiative (OSCI) is helping to form the underpinnings for a modeling infrastructure that will result in greater model portability as well as tool interoperability.
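The core idea behind TLM is that communication between blocks is modeled as whole transactions (function calls) rather than cycle-by-cycle signal activity. The OSCI work is built on SystemC (C++); the sketch below is a hypothetical, language-agnostic illustration in Python of a loosely timed read/write transaction with annotated delay, not the actual OSCI API.

```python
# Illustrative transaction-level model: a bus access is one function call
# carrying the whole transaction, with timing annotated rather than
# simulated pin by pin. All names and latencies here are assumptions.

class Memory:
    """Target model: word-addressed memory with a loosely timed latency."""
    ACCESS_LATENCY_NS = 10  # assumed, annotated instead of cycle-accurate

    def __init__(self, size):
        self.data = [0] * size

    def read(self, addr):
        # One call = one transaction; returns (payload, elapsed time).
        return self.data[addr], self.ACCESS_LATENCY_NS

    def write(self, addr, value):
        self.data[addr] = value
        return self.ACCESS_LATENCY_NS


class Cpu:
    """Initiator model: issues transactions and accumulates annotated time."""
    def __init__(self, bus):
        self.bus = bus
        self.time_ns = 0

    def load(self, addr):
        value, delay = self.bus.read(addr)
        self.time_ns += delay
        return value


mem = Memory(1024)
mem.write(0x10, 42)
cpu = Cpu(mem)
print(cpu.load(0x10), cpu.time_ns)  # prints: 42 10
```

Because a transaction is a single call, a virtual prototype built this way can run firmware orders of magnitude faster than an RTL simulation, which is what makes the hardware/software co-development described above practical.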

With the complexity of both hardware and software reaching a crisis level, many large systems houses have begun developing both in tandem without enduring the interminable wait for even fully timed simulation models.

ESL's Year
2007 may be the year when the industry embraces electronic system-level (ESL) methodologies. The availability of more TLMs, and the growth of an intellectual-property (IP) infrastructure that supports abstraction levels higher than RTL, will make it more feasible for designers to model entire systems and use the resulting virtual prototype to make architectural tradeoffs, evaluate performance, and more.

SoC design is beginning to shift toward a multiprocessor-oriented phase, in which an ESL-based approach will become ever more critical. Look for the emergence of some "killer apps" for the multiprocessor SoC (MPSoC) design space.

All of the complexity of SoC design lands squarely on the shoulders of verification teams, who struggle with a disjointed verification flow rife with numerous discrete steps, various abstraction levels, and myriad tools. The verification effort will be aided in 2007 by more tools and technologies for ensuring that high-level models are consistent with hand-generated hardware models, or with models generated by ESL synthesis flows.

DFM's Role
At the back end of the design process is the market's confusing array of design-for-manufacturing (DFM) approaches. Technology to address parametric yield will be extremely important as the industry begins migrating to 45 nm.

Technologies that impact parametric yield include statistical static timing analysis, statistical leakage analysis, lithography-aware design optimization, and chemical-mechanical polishing (CMP) simulation to address interconnect variability. These technologies are sometimes called electrical DFM (see the figure).

To resolve parametric yield challenges, we'll need to adopt a complete statistical analysis methodology that extracts and leverages intelligence from process information.
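To make the statistical idea concrete, here is a toy Monte Carlo flavor of statistical timing, a sketch only: production SSTA engines typically propagate delay distributions analytically through the netlist rather than sampling. The path, delay values, and variation figure are all hypothetical.

```python
import random

random.seed(1)

# Each gate delay is modeled as Gaussian around its nominal value; the
# path delay is the sum, and parametric yield is the fraction of
# sampled chips that meet the clock constraint.

nominal_delays_ps = [120, 95, 140, 80]   # hypothetical 4-gate critical path
sigma_fraction = 0.10                    # assumed 10% process variation
clock_period_ps = 480

def sample_path_delay():
    return sum(random.gauss(d, d * sigma_fraction) for d in nominal_delays_ps)

samples = [sample_path_delay() for _ in range(100_000)]
mean = sum(samples) / len(samples)
yield_frac = sum(s <= clock_period_ps for s in samples) / len(samples)

print(f"mean path delay: {mean:.1f} ps, parametric yield: {yield_frac:.1%}")
```

The point of the exercise: a deterministic corner analysis would report a single pass/fail number, whereas the statistical view reports what fraction of manufactured parts will actually make timing, which is the quantity DFM tools need to optimize.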

Process data is moving up into the design realm, a trend that will accelerate in the coming year. Although physical flows have begun to incorporate "process awareness," it's unlikely that this is enough to solve the entire DFM issue. That's largely because the assessment of some process variabilities can be done only after the full-chip layout is completed.

Powerful Choices
Finally, 2007 will see a huge push toward low-power design in the front-end design flow. Decisions made in the highest strata of the ESL realm can determine up to 80% of a chip's total power budget. Thus, designers will see more power-aware ESL tools.

Also, power is more crucial than ever in the physical domain. Leakage power is already a major problem at 90 nm. One can only imagine how bad it will be at 65 nm if effective power-management techniques aren't used.
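A back-of-envelope calculation shows why early architectural decisions dominate the power budget: dynamic power follows P = αCV²f, so a voltage and frequency choice made at the ESL stage swamps later physical-level tweaks. The activity factor, capacitance, and operating points below are assumed illustrative numbers, not silicon data.

```python
# Dynamic power: P = activity * switched capacitance * Vdd^2 * frequency.
# All parameter values here are assumptions for illustration only.

def dynamic_power_w(activity, cap_f, vdd_v, freq_hz):
    return activity * cap_f * vdd_v**2 * freq_hz

# Hypothetical baseline operating point vs. a voltage/frequency-scaled one.
base = dynamic_power_w(activity=0.15, cap_f=2e-9, vdd_v=1.2, freq_hz=500e6)
scaled = dynamic_power_w(activity=0.15, cap_f=2e-9, vdd_v=1.0, freq_hz=400e6)

print(f"baseline: {base:.3f} W, scaled: {scaled:.3f} W "
      f"({1 - scaled / base:.0%} saved)")  # ~44% dynamic power saved
```

Because voltage enters quadratically, the modest drop from 1.2 V to 1.0 V accounts for most of the saving; this is the kind of tradeoff a power-aware ESL tool lets designers explore before RTL exists. (Leakage power, the 90-nm problem noted above, is a separate static term not captured by this formula.)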
