Electronic Design
Dealing With The Pains Of Technology Adoption

Technologies for system-level design and their adoption have been a topic of debate in EDA for well over a decade. Users and providers of system-level design technologies alike are watching the state of adoption, and like children on a road trip, they’re asking, “Are we there yet?”

So where are we? And what can we learn from regular consumer product adoption? Product managers in Silicon Valley often refer to Geoffrey Moore’s Crossing the Chasm, which is full of examples of how technology adoption works and how it fails. One of the key elements for adoption is finding the capability a user cannot live without.

In my discussions with other product managers, we often categorize capabilities as vitamins, painkillers, and cures for cancer. Vitamins are features that are nice to have but don’t drive adoption. Cures for cancer are features users must have, will pay real money for, and will do pretty much anything to adopt. Painkiller features fall somewhere in between. Whether they drive adoption depends on how big the pain of users’ existing legacy technologies really is.

Unfortunately, capabilities that users can’t live without aren’t easy to come by. Once found, though, users will do pretty much anything needed to adopt them. In EDA, the core capabilities of logic synthesis from RTL and its simulation come to mind. Users reached a point at which designing, simulating, and verifying at the gate level became simply impossible.

It became too time-consuming and too error-prone. It also was too risky not to adopt the new technologies, as design teams would miss time-to-market and watch competitors using the new technologies eat their lunch. Once that point was reached, the appropriate investments, both monetary and in methodology changes, were made willingly.

Today’s Painkillers

In contrast, most cases of system-level design technology adoption seem to be in the middle domain, the painkillers. Putting up money for a tool purchase to adopt technology is one thing. But for designers to change the way they are currently developing, to change existing proven approaches, is often more difficult. It can be a pivotal moment in a project manager’s career, after all, especially when it goes wrong.

The ratio between pain and the effort to change plays a crucial role in determining technology adoption. Only if the pain is big enough are users willing to invest both the budget to purchase tools and the effort of actual adoption. The dilemma for product managers is that we can only control the latter. We judge the pain when we drive decisions to invest in technology development, but we cannot control the pain or the desire for adoption. What we can control, though, is the ease of adoption for a new technology.

A good example from the consumer space is how we as consumers enjoy media at home. We dropped video rentals a long time ago for services like Netflix. The pain of driving to the store to rent a video wasn’t really that bad, but the Netflix adoption was so easy, and its queue concept so smart in letting us keep a couple of DVDs at home, that it won anyway. Voila: the adoption was made so easy that even the limited pain of driving to the rental store was overcome.

But what about the wait, and having to keep the Netflix queue updated? These days the connectivity of our homes has made us impatient. I now stream online or download rentals when I want them and have dropped Netflix DVD delivery altogether. It is very telling that I have the actual technology capabilities in almost 10 devices I own, but I have adopted them only on the two devices for which adoption was easiest.

Apple has mastered ease of adoption through its brilliant user interfaces and wizards, and as a result my Apple TVs are in full use for downloading and streaming. The second device is my home PC, for which the download and setup of video streaming was similarly easy. Now I have all of the seasons and episodes of Battlestar Galactica at my fingertips. All the other devices, ranging from the cable box to the Blu-ray player to the TV, also run the Netflix apps, but their setup wasn’t easy enough or their handling was clunkier.

A Look At The System Level

So what can we learn from the Netflix example for system-level design technologies? The good news is that customers recognize the need for change and the need for system-level design adoption, driven especially by software requirements. SemiCo recently reported that total software design cost increases 42.5% at the 28-nm node and estimates that cost to show a compound annual growth rate (CAGR) of 138.9% through the 14-nm node. If that isn’t enough pain for users, then I don’t know what is.
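To put that growth rate in perspective, here is a minimal sketch of how a 138.9% CAGR compounds. The rate is the figure quoted above; the normalized starting cost and the three-period horizon are hypothetical, for illustration only:

```python
# Sketch: compounding a software design cost at the quoted 138.9% CAGR.
# Starting cost is normalized to 1.0; the three periods are hypothetical.
cagr = 1.389  # 138.9% compound annual growth rate
cost = 1.0    # normalized software design cost at the starting node

for period in range(1, 4):
    cost *= (1 + cagr)
    print(f"period {period}: {cost:.2f}x")

# After three periods at this rate, the cost has grown more than 13-fold.
```

Even under these toy assumptions, the compounding makes the point: a pain growing at that rate does not stay ignorable for long.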

As a result, technologies like virtual system platforms—enabling software development before silicon is available and enabling more productive software development even after silicon is back—are in high demand. Most importantly, in addition to the pain driving adoption having become big enough, we have also made dramatic improvements over the last decade to make virtual platform technologies adoptable.

Looking back 10 years, one of the trailblazer system-level technologies—Cadence Virtual Component Co-Design—had just been released. We were addressing hardware and software designers with a combined methodology, had developed techniques to abstract hardware (processors, buses, peripherals, etc.), to abstract software (real-time operating systems, drivers, and the actual functionality), and we could even create the implementations from the complete executable system-level specification.

We demonstrated a flow in which the customer told us whether a task should run in hardware or software, and we then automatically rebuilt the design and showed the resulting hardware layout together with the implemented software image. The results were awesome: much faster development times and more robust designs that met the customer’s specifications.

It’s too bad that software and hardware developers had to completely change the way they were doing things. The complete ecosystem of processor, intellectual property (IP), real-time operating system, and middleware providers also had to participate by providing models to make it all work. Of course the technology, while technically great, wasn’t adopted. The lesson is that the ability to adopt a new design technology is inversely proportional to the number of changes it requires.

So where are we today? We have learned from the past. The industry has made virtual system platforms for embedded software development much more adoptable. IP providers like ARM are providing models of their processors, as are peripheral IP providers.

Most importantly, we only ask for the hardware to be abstracted. The real software runs on virtual system platforms, and as a result software developers no longer have to know whether they are running on a simulation, an FPGA prototype, or an emulator. They don’t have to change anything! With the need for system-level design technologies like virtual system platforms having arrived, and those technologies becoming easier and easier to adopt, interesting times lie ahead!
