Is An Emulation Hockey Stick Coming?

Sept. 16, 2011
SoCs are no longer part of the system; they are the system. The explosion in SoC operating modes points toward a verification crunch. Emulation is a technology whose time in the spotlight may have finally arrived. EVE's Lauro Rizzatti explains why.

Growth resembling a hockey stick isn’t out of the question for emulation. Note that the numbers for 2011 include only one quarter’s worth of data. (courtesy of EDAC)

It’s what every marketing manager predicts for a new product at the end of a five-year projection.

It’s what every entrepreneur promises and what every venture capitalist expects.

It’s the sign of success and riches ahead.

It’s the hockey stick.

There are a few unfortunate products or categories that never catch on and fail pretty much immediately. Much more common are the products that tease with promises of great success, but amble along at a constant level or a modest constant growth rate, with that much-looked-for knee never materializing. No one plans for that. It’s go big or go home.

Everyone has to promise a hockey stick. If one weren’t going to happen, why would you embark on the venture? But in those private wakeful moments in the wee hours of the morning, when you can risk being completely honest about the numbers, you can rehash whether you really believe it will happen and revalidate the reasons why.

Game On?

If you look at the history of the emulation market, it has grown, sometimes in fits and starts (see the figure). To date, though, it hasn’t shown evidence of that coveted inflection point. Frankly, the overall compound annual growth rate (CAGR) has been a modest 1.5% or so; even if you discount the go-go times in 2000 and start the clock at 2002, it rises only to about 5%. So, given this chart, why on earth would one even think in terms of a hockey stick?

There are a number of reasons to be optimistic. They can be found in the fundamental reasons why emulation is used and how that’s changing, in evidence within some of the numbers, and in commitments to the technology being made by dominant players besides EVE.

Some of the fundamental drivers are incremental, but some represent a discontinuity. On the incremental side are the familiar issues of complexity and cost. Chips aren’t just getting bigger; they’re substantially more complicated than they used to be.

More Of The Same

The increasing complexity is driven partly by the increasing cohabitation of large chunks of circuitry that march to a different drummer: the so-called “modes” that are proliferating like crazy. And modes may no longer be mutually exclusive across the whole chip. There may be “regions” within the chip that each have exclusive modes, but the modes of two regions may co-exist, such as when a phone call and Internet browsing are allowed concurrently on a cell-phone chip. Now you have combinations and permutations of local modes to take into account.

All of this creates a verification challenge that can theoretically be solved through simulation. But a robust verification plan requires a set of regression tests that are repeated regularly, which means they need to execute quickly enough that you can realistically rerun the suite nightly or weekly. An emulator interacting with a virtual environment via high-speed transactions gets through the tests much more quickly, allowing for more thorough vetting and greater confidence in your system.
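To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python. The suite size, the cycles per test, and the 1-MHz effective emulation clock are assumptions invented for illustration; the 10-Hz effective simulator clock is the figure cited later in this article.

```python
# Back-of-the-envelope check: does a regression suite fit a nightly window?
# All figures are illustrative assumptions, not measured data.

SIM_HZ = 10           # effective simulator clock (figure cited later in the article)
EMU_HZ = 1_000_000    # assumed effective clock for transaction-based emulation

NUM_TESTS = 200                 # assumed size of the regression suite
CYCLES_PER_TEST = 50_000_000    # assumed cycles per test
suite_cycles = NUM_TESTS * CYCLES_PER_TEST

for name, hz in (("simulation", SIM_HZ), ("emulation", EMU_HZ)):
    hours = suite_cycles / hz / 3600
    verdict = "fits" if hours <= 12 else "does not fit"
    print(f"{name}: {hours:,.1f} hours -> {verdict} in a 12-hour nightly window")
```

Under these assumptions the emulator finishes the suite in under three hours, while the simulator would need more than a quarter-million hours, which is the whole argument in two lines of output.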

The other contributor to complexity is the need to reduce power. Beyond the usage-derived modes, you end up with power islands and modes that are “artificially” derived, since they represent opportunities for power savings that are orthogonal to the usage modes. This makes the chip even harder to test: power-up and power-down of different segments may or may not directly track the usage modes, making it harder to be confident about the coverage of the test suite. That, again, makes emulation attractive, since you can put the system through real-world paces.
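A short sketch makes the combinatorial explosion concrete. The regions, usage modes, and power states below are invented for illustration, but even three regions with a handful of states each yield hundreds of combined chip states before any are ruled out as illegal.

```python
from itertools import product

# Invented regions, usage modes, and power states; purely illustrative.
regions = {
    "modem": {"modes": ("idle", "voice", "data"),   "power": ("on", "retention", "off")},
    "apps":  {"modes": ("idle", "browse", "video"), "power": ("on", "off")},
    "gpu":   {"modes": ("idle", "2d", "3d"),        "power": ("on", "off")},
}

# A global chip state picks one (usage mode, power state) pair per region.
per_region = [list(product(r["modes"], r["power"])) for r in regions.values()]
global_states = list(product(*per_region))
print(len(global_states))  # 9 * 6 * 6 = 324 combined states, before pruning illegal ones
```

Every one of those states, and many of the transitions between them, is a candidate for coverage, which is exactly the kind of volume that outruns simulation.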

Of course, all of this matters only because the cost of making a mistake keeps rising. Just as it gets easier to screw up the chip, it is also becoming significantly more costly when that happens. Probabilities of failure are up, and the consequences are up with them, making the risk that much higher.

None of these trends is new. We could probably have made much of this argument (minus the focus on modes) five years ago or more. It’s just more of the same, with the stakes continually rising. But there is one new driving factor that is much more of a game changer, and it can be expressed in one word: software.

A Dramatic New Consideration

Today’s systems-on-a-chip (SoCs) are no longer just one component in the assembly of an embedded system. They are the embedded system. They no longer assist in the overall execution of software. They host the software execution. This adds an enormous degree of freedom to the overall test stimulus set, since now you have to be able to prove that the software works.

The problem is moderately mitigated in an embedded system, since there is typically a specialized, fixed body of software intimately tied to the system. It’s not as though you have to prove that any random program will execute. Even that’s changing, though, with the advent of mobile-phone apps that may be written by anyone, anywhere in the world.

Still, validating the set of captive, dedicated software destined to run on the system is challenging enough. And it’s not just about confirming whether the software works. It’s also about validating that there are no major performance bottlenecks and that power consumption will remain within the expected range. That means running and rerunning the software under a variety of conditions.

While systems can be simulated, software can’t; it simply takes too long. A simulator executes software at the rough equivalent of a 10-Hz CPU clock, on the order of 100 million times slower than your PC’s CPU clock. There’s just no way to exercise a program of any significant size using a simulator. It gets even worse if the program has to run over Linux, since Linux itself has to boot, which could literally take years on a simulator.
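The arithmetic is easy to check. Assuming, purely for illustration, that booting Linux takes on the order of 10 billion cycles, the 10-Hz simulator figure translates into decades, while an assumed effective emulation clock of around 10 MHz lands in the same ballpark as the 15-minute figure cited below.

```python
# Boot-time arithmetic. The 10-Hz simulator figure comes from this article;
# the boot cycle count and the emulator clock rate are rough assumptions.

BOOT_CYCLES = 10_000_000_000  # assume ~10 billion cycles to boot Linux

for name, hz in (("RTL simulator", 10), ("emulator (assumed)", 10_000_000)):
    seconds = BOOT_CYCLES / hz
    if seconds >= 365 * 86_400:
        print(f"{name}: about {seconds / (365 * 86_400):.0f} years")
    else:
        print(f"{name}: about {seconds / 60:.0f} minutes")
```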

If all of that weren’t enough, the advent of multicore adds an entirely new set of things to worry about. Now a given program will be distributed across multiple cores, and they all have to play nicely together to make sure that nothing falls through the cracks.

Using the simplest, most common multicore operating-system arrangement (symmetric multiprocessing, or SMP), the scheduling of software on the cores may not even be deterministic. That means you may need to run the programs with many different schedules to make sure that no corner scheduling case causes an unexpected burp, multiplying the testing burden.
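A quick combinatorial sketch shows how fast that schedule space grows. Two threads of n ordered steps each can interleave in C(2n, n) distinct ways; the step counts below are arbitrary.

```python
from math import comb

# Interleavings of two threads with n steps each, preserving each thread's
# internal order: C(2n, n) of them. Step counts are arbitrary illustrations.
for n in (2, 4, 8, 16):
    print(f"{n} steps per thread -> {comb(2 * n, n):,} possible schedules")
# 2 -> 6, 4 -> 70, 8 -> 12,870, 16 -> 601,080,390
```

No test plan can enumerate schedules like that exhaustively, so the practical answer is to run the real software many times under varied conditions, which again demands emulation-class speed.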

All of this demands a verification methodology that’s faster than simulation, and that’s where emulation comes in. For example, the years-long process of booting Linux on a simulator can be managed in 15 minutes on an emulator.

The incorporation of software into what is likely to be a multicore SoC makes emulation a must-have technology to provide the kind of confidence needed to go off and buy a mask set and start building the chip.

Any Evidence?

As compelling as the reasons to use emulation may be, are there any signs in the numbers that any of this is playing out? That’s a bit tricky to assess, because the revenue numbers reflect a combination of units sold and average selling price. And the price has been coming down as lower-cost alternatives to some of the traditional big-box systems become available.

Looking at the market in terms of units instead of revenue may shed more light. Unfortunately, industry unit numbers aren’t available. But if EVE’s new ZeBu-Server is any indication of the overall market, units are up. This platform has outperformed all of its predecessors in terms of both new customers and installed units. Its first 18 months yielded 22 customers, of which 11 were new design teams, and brought the number of installations to 65. Over its history, EVE has shipped the equivalent of 20 billion ASIC gates of capacity. If the other players are seeing a similar trend, then units are definitely up, suggesting greater uptake than revenues alone would indicate.

Finally, a new focus on emulation can be found in the quarterly conference call notes of a major EDA player that could choose among many different elements of its business to put in the spotlight: Cadence. Emulation has the attention of its CEO, Lip-Bu Tan, who sees its emulation offering as “very, very needed and is essential for any complex chip design anything below 40 nanometer.”1

This is echoed by CFO Geoff Ribar, who says that “It’s obviously a secular trend. Emulation is becoming more and more important.”1

And, should anything be left uncertain, again: “if you look closely into [Cadence’s] Development Suite, it’s really built around a whole emulations (sic),” Tan said.1 In other words, at the CEO level, a major EDA player has emulation at the center of its tools strategy. This reflects new thinking and brings to center stage what had before been a bit player.

Most recently, even Mentor Graphics, another diverse company, pointed to emulation in its earnings report, indicating a “flurry of new orders in the last two days” of its second quarter.2

For all of these reasons (the secular changes in the industry, clues in the numbers, and renewed commitment by diverse companies that could spend their energy on any number of technologies), we see a hockey stick in the offing. The exact timing may be tough to nail down given some of the larger economic issues at play during the summer of 2011. But EVE’s continued success gives us no reason to doubt that the emulation game is changing.

And the new game is hockey.

References

1. Cadence Design Systems Q2 2011 Earnings Conference Call Transcript

2. Mentor Graphics Corporation Q2 2011 Earnings Conference Call Transcript
