Electronic Design

Verifying The Entire System
Sponsored by: VERISITY CORP. (http://www.verisity.com)

A quick take on automating the verification process

Focus On Infrastructure
Given the escalating complexity of today's designs, an effective system-level verification strategy depends heavily on infrastructure, process, and methodology. HDL simulation technology has matured to the point where large performance jumps are unlikely, so verification processes must be directed toward extracting more value from the available simulation cycles. The best way to do that is with a coverage-driven methodology that focuses effort on design areas that haven't been adequately exercised. This beats a brute-force methodology that relies on the quantity of tests rather than their quality.
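The coverage-driven loop described above can be sketched in a few lines. This is a minimal illustration, not a real verification flow: the bin names, the stimulus fields, and the `bins_hit` monitor are all hypothetical stand-ins for the coverage model and DUT observation a real testbench would provide. The point is the control structure: keep generating stimulus until every coverage bin has been hit or the simulation budget runs out, rather than running a fixed pile of tests blindly.

```python
import random

# Illustrative coverage bins: design conditions we want exercised at
# least once (names invented for this sketch).
BINS = {"fifo_empty", "fifo_full", "back_to_back_write", "underflow_attempt"}

def bins_hit(stim):
    """Stand-in for a coverage monitor: maps a stimulus to the bins it
    exercises. A real flow would observe the DUT during simulation."""
    hit = set()
    if stim["writes"] == 0:
        hit.add("fifo_empty")
    if stim["writes"] >= stim["depth"]:
        hit.add("fifo_full")
    if stim["writes"] >= 2 and stim["gap"] == 0:
        hit.add("back_to_back_write")
    if stim["reads"] > stim["writes"]:
        hit.add("underflow_attempt")
    return hit

def run_coverage_driven(seed=0, budget=1000):
    """Generate random stimulus until all bins are covered or the
    simulation budget is exhausted; return (tests run, coverage)."""
    rng = random.Random(seed)
    covered, tests = set(), 0
    while covered != BINS and tests < budget:
        stim = {"depth": 4,
                "writes": rng.randint(0, 6),
                "reads": rng.randint(0, 6),
                "gap": rng.randint(0, 2)}
        covered |= bins_hit(stim)
        tests += 1
    return tests, len(covered) / len(BINS)

tests, cov = run_coverage_driven()
print(f"{cov:.0%} coverage after {tests} tests")
```

In a real coverage-driven environment, unhit bins would also feed back into the stimulus generator to bias it toward the uncovered areas; here the loop only measures and stops early.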

Challenges Abound
Verification at the system level raises challenges on several fronts. For one, adherence to the functional specification must be established at multiple levels of abstraction: high-level modeling, RTL, gate level, hardware prototype, and silicon. Each of these design representations calls for a different execution engine. These representations also span a multitude of languages, from verification languages to C/C++/SystemC to HDLs. Then there's the fragmented nature of the design process, with separate teams for architecture, hardware, and software. A divide-and-conquer approach to verification becomes a necessity.

Why Duplicate Efforts?
Using machine-readable executable specifications as a "golden reference" is the time-honored approach to verification, although not always the answer. An executable spec describes one possible implementation of the design, but only one. It provides no mechanism for confirming correctness, nor any means of debugging and visualizing its own execution or the causes of failures that arise in corner cases. It includes no definition of the different use scenarios that must be exercised to verify adherence to the specification. It contains no definition of coverage goals, which are needed to establish that a device under test is bug-free. Worst of all, it requires an enormous duplication of effort in creating testbenches at every level of abstraction and laboriously verifying consistency from level to level.
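What a golden reference does enable is a scoreboard-style check: replay the same transactions through the executable spec and the device under test, and flag any divergence. The sketch below is a hypothetical illustration of that pattern; the saturating-adder spec, the `dut` stand-in, and all names are invented for the example, not taken from the article.

```python
def ref_model(op, a, b):
    """Golden reference: an executable spec of a saturating 8-bit adder
    (a deliberately tiny example of one possible implementation)."""
    if op == "add":
        return min(a + b, 255)
    raise ValueError(f"unsupported op: {op}")

def dut_response(op, a, b):
    """Stand-in for the device under test; in a real flow this would be
    a simulation of the RTL, gate-level netlist, or silicon."""
    return min(a + b, 255)

def scoreboard(transactions):
    """Compare DUT responses against the golden reference transaction
    by transaction; return the list of mismatches (empty means clean)."""
    mismatches = []
    for op, a, b in transactions:
        expected = ref_model(op, a, b)
        actual = dut_response(op, a, b)
        if expected != actual:
            mismatches.append((op, a, b, expected, actual))
    return mismatches

txns = [("add", 200, 100), ("add", 1, 2), ("add", 255, 255)]
print("mismatches:", scoreboard(txns))
```

Note what the scoreboard does not give you, which is exactly the article's point: it says nothing about which scenarios were exercised or whether coverage goals were met; it only detects divergence on the transactions it happens to see.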
