Looking back at the history of commercial EDA, specifically verification technology, I think of the old saying: "The more things change, the more they stay the same." Yes, chip design challenges have become bigger and more complex, and the tools we use keep improving. But in general, the EDA industry continues to chase the complexity curve, trying to close the gap even a little bit. This is especially true in the area of verification, where I've been involved with three different companies.
When I began in what was then called the CAD industry, there were very few commercial tools available. In fact, I cofounded ECAD Systems in 1982 after heading the internal CAD tool efforts at National Semiconductor and Gould Systems. The challenges at both companies—and the opportunity we saw when starting ECAD—were driven by an important milestone in chip design: the 10-kgate threshold.
That level of complexity sounds small today, but we recognized that at that point, the human eye becomes basically obsolete. Remember, in those days, verification was being performed with a piece of paper on a light table! So we developed a product to automate the verification task of comparing a final layout to a schematic and checking it for correctness. Called Dracula, it was one of the first big changes in the methodology of verifying chips.
If you look back over time, there are points where just changing tools wasn't enough. A whole new approach was needed. In the early 1980s at 10 kgates, that methodology change was about automating the verification process in ways that ran counter to conventional wisdom and the views of some leading thinkers. (Those were also the days of silicon compilers.) Back then, people talked a lot about correct by construction. But when we reached a certain level of complexity, with smaller die sizes and higher performance requirements, that idea became infeasible. We simply needed a different way to address the problem.
I like to think that we proved ourselves partly through our very focused mission. But other factors mattered too: good timing, common sense, and a little luck. A big factor in the success of ECAD (and later Cadence) was our realization that EDA needed to be a software business, not a turnkey workstation business, which was the strategy taken by some of the early EDA companies.
Meanwhile, complexity has continued to race with Moore's Law. In 1985 we talked about 1 million transistors; now we talk about 1 billion. That's 1000× over 15 years. Imagine the verification challenges of the future! I've always believed that design is the easy part, but making sure the design works is the hard part. Design or implementation is very linear in terms of its evolution, but verification is exponential.
In addition, verification challenges aren't only compounded by size and complexity, but by the fact that you must verify at every step and every level of design. That's why it remains so appealing to me. There are so many problems to solve!
When it comes to verification, we tend to focus on the things that we know and the things that we know we don't know. Traditional simulation is good for verifying things we know. My current EDA company, Novas Software, has become very proficient at helping designers uncover the reasons behind the things they know they don't know. But I always ask, "What about the things we don't know that we don't know?"
From a verification standpoint, system-on-a-chip (SoC) design introduces a lot of things we don't know that we don't know. It's the next major point where existing methodologies break down. I'm a strong believer in the EDA industry's ability to continue helping to solve tough verification challenges. Will we ever close that mythical "design gap"? Probably not, but if we don't keep trying, while challenging the status quo, we'll find ourselves with today's equivalent of paper and a light table.