For at least the last decade, we have been hearing that the worlds of embedded software and hardware are growing closer, but there has been very little measurable evidence to back up those claims. During a panel at the Embedded Systems Conference, Michael “Mac” MacNamara, general manager of the C to Silicon group at Cadence Design Systems, drew a good analogy.
MacNamara compared hardware and software teams to two camps on different sides of a river full of sharks. That’s quite a good visual of what’s really happening. When I visit larger customers, it’s not uncommon for the software team members to introduce themselves to the hardware team members with business cards during the meeting even though they work at the same company. They simply don’t interact as much as you would think.
Still, I’m cautiously optimistic that we are starting to see a shift toward true co-development of hardware and software, or at least toward their co-debug. My optimism is driven by the results of a couple of user surveys Synopsys conducted last year. Much to my liking (I can’t really deny my German genes here), Synopsys measures everything.
At the Synopsys Users Group (SNUG) event in San Jose last year, we began surveying attendees on the percentage of their total project effort that was being put toward software. We repeated this same question throughout the other regional user group meetings that took place. Finally, we asked the same question again at DVCon in February 2009, as part of a survey inviting users to the conference.
The results confirmed what market researchers had predicted: the percentage of users claiming that their overall software effort was above 50% of their total project effort grew from 16% in March 2008 at SNUG SJ to 35% in February 2009 at DVCon. What is truly remarkable is that these numbers are coming from a traditionally hardware-centric audience. Both the Synopsys User Groups and DVCon audiences actually talk very little about software during their respective conferences.
Another statistic, measured in the context of DVCon, was even more astonishing. We asked users about the usage of embedded processors for verification. Specifically, we asked whether developers are using the embedded processors in their designs for hardware verification, i.e., running testbenches on the embedded processor to verify the surrounding hardware.
Much to my surprise, 54% of the surveyed audience answered “yes.” An additional 6% answered that they used embedded software on the processors for post-silicon validation. Overall, only 9% of the respondents stated that their design did not have processors in it, which is an interesting fact by itself.
These results, supported not only by market research forecasts but also by the surveys we conducted with developers using EDA tools, put me in a quite optimistic mood that we are facing a fundamental shift towards more hardware and software interaction. Specifically, at the interface between hardware and software, chip architects have to consider the effects of software and, in turn, software architects gain more influence over the overall chip architecture.
In general, this drives requirements for more productive software development. Hardware and software have to meet as early as possible to make sure that they work correctly in the context of each other and also, as the second survey indicated, to help with verification. We have long heard the statistic in the hardware world that 70% of the effort is spent on verification. At least some of that effort now seems to be applied to efficient mixed software-hardware use models.
Hardware-centric testbenches utilizing specific language extensions for verification are now augmented with embedded directed tests running on software in the embedded processors of the design. The drivers, firmware, and operating systems running on embedded processors become an essential part of the testbench.
As a result, prototyping techniques gain much more importance. Virtual platforms now offer software-based solutions for verification as well. They can be augmented with FPGA prototypes, which offer more fidelity at RTL execution speed. Both platforms for early and productive software development can be combined with classical RTL simulation to achieve optimal insight into, and execution control of, hardware and software for debug.
They have the potential to form the bridges between the two camps. It remains to be seen, however, whether the two sides of the river are also moving closer to each other. In any case, productive hardware-software bridges offer effective protection against the sharks, even though it has been pointed out to me that there aren’t any sharks in any rivers. It must be a school of killer trout.