Throughout the past two decades, the semiconductor industry has seen gate counts rise in accordance with Moore's Law. Only recently has there been a rethinking of this model, as the realities of nanoscale chip design and manufacturing have hit home.
While we used to keep score by numbers of transistors, it's becoming more apparent that profitability might be a better way to measure business success. And the surest way to achieve profitability is to reduce the cost of design and do everything possible to ensure that working silicon comes in on time.
This is where the electronic design automation (EDA) industry comes in. Design software and hardware together make up a nearly $4 billion business today, dedicated to streamlining the design process and verifying that what gets taped out can be reliably manufactured.
Unfortunately, it doesn't always work the way it should. Many developers building EDA tools and engineering teams using these same design tools get stuck in a rut. They're doing things the same way because it's easier not to change, or because that's the way they've always done it, or because the process works—somewhat.
The classic example is design verification, the most time-consuming and expensive part of a modern ASIC design flow. By consensus within the electronics industry, design verification burns 70% or more of the development cycle. And it's only getting worse.
As a result, the electronics industry should begin thinking of design verification as a new kind of "gate count." That is, how much per gate is your design team willing to spend to do a thorough verification?
It is not unusual to spend $100 to simulate 1 million cycles, and on larger designs of 10 million gates or more, that figure can rise tenfold to $1,000. Those same system-on-a-chip designs, mixing hardware and software components, are so complex that only simulating billions of cycles can build confidence in their correctness.
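The arithmetic behind this "dollar gate count" can be made concrete. The sketch below uses the article's round numbers ($100 per million cycles, tenfold for designs of 10 million gates or more); the function names and the gate-count breakpoint are illustrative assumptions, not industry data.

```python
# Hedged sketch of the "dollar gate count" arithmetic, using the
# article's round figures. Helper names and the 10M-gate breakpoint
# are illustrative assumptions.

def simulation_cost(cycles, gates):
    """Estimated simulator cost in dollars for a run of `cycles` cycles."""
    # ~$100 per million cycles for a typical design; roughly tenfold
    # (~$1,000 per million cycles) at 10 million gates or more.
    cost_per_million = 1000.0 if gates >= 10_000_000 else 100.0
    return cycles / 1_000_000 * cost_per_million

def dollar_gate_count(cycles, gates):
    """Verification dollars spent per gate -- the 'new gate count'."""
    return simulation_cost(cycles, gates) / gates

# A 10-million-gate SoC needing a billion simulated cycles:
total = simulation_cost(1_000_000_000, 10_000_000)
per_gate = dollar_gate_count(1_000_000_000, 10_000_000)
print(f"${total:,.0f} total, ${per_gate:.2f} per gate")
```

Under these assumed figures, a billion cycles on a 10-million-gate design costs about $1 million in simulation alone, or roughly a dime of verification spend per gate, which is the kind of number the "dollar gate count" framing asks design teams to confront.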
Booting a real-time operating system to the point where peripherals can be accessed and tested typically requires hundreds of thousands of cycles. Costs quickly add up to hundreds of thousands of dollars in Verilog simulator licenses alone. Worse, all that expense still falls short: Verilog simulators are orders of magnitude too slow to run any kind of embedded software code.
One solution is to raise the level of abstraction and use C models during system-level simulation. But this solution has its own limitations: models need to be available, accurate, and easily plugged together. Rather than adopting a new language, and because Verilog is the de facto standard for everything digital, wouldn't it be nice if one could instead simulate any register-transfer level (RTL) code as fast as those behavioral models? Fortunately, help that could have a dramatic impact on the design-verification "dollar gate count" is on the way.
New ways of attacking the problem blend emulation solutions—hardware platforms used to accelerate the verification process of complex application-specific integrated circuits (ASICs)—with simulation. This combination promises to slash the cost of debugging a million cycles to just pennies. Design teams are quickly adopting this methodology as a way to accelerate the execution speed of test suites, accurately verify interactions between software and hardware, and reduce the time allocated for design verification.
With Moore's Law under siege and the design community pressed for time, these new solutions are a welcome answer to the vexing design-verification problem, and a way to drive the dollar gate count down from dollars to pennies.