As the software content in today's 2.5G and 3G phones rapidly increases, timely software development is becoming critical for product success. The traditional development flow—in which software design isn't started until after hardware design is complete or nearly complete—simply breaks down. The resulting design cycle is too long for the competitive wireless market. In addition, dealing with the hardware-software interaction after most of the hardware has been defined can yield a less than optimal solution.
Today's wireless design flow often relies on FPGA-based hardware prototyping to start the software-development tasks in earnest. Clearly, this approach has one indisputable advantage: The prototype is created from the same Verilog or VHDL hardware model that will be synthesized into silicon. This methodology requires a single Verilog or VHDL source for both the FPGA and system-on-a-chip (SoC) implementations. By using the same model as the source for both, the developers have a high level of confidence that the software developed on the prototype will also work on the final hardware.
The downside is that the software engineers have to delay the design and testing of their software until after the register transfer level (RTL) is available. This delay lengthens the overall development cycle. Competitive pressure to "get the product out" can lead to situations in which a product is placed on the market even though it is architected in a sub-optimal fashion or exhibits a variety of hardware and/or software problems. The only way to address this dilemma is to find a way to start software development much earlier than usual in today's wireless projects.
The ideal solution must permit the development team to perform the software tasks before the hardware or a physical prototype is available. Instead of a real hardware-based prototype, the team needs a software-based prototype or virtual prototype on which to perform the various software-development tasks. This virtual prototype must be a simulatable, fully functional software model of the target system. The virtual prototype must be available many months prior to a hardware prototype—even before the architecture is fully frozen.
At the same time that the software team is working on the virtual prototype, the hardware designers can be completing their hardware model and synthesizing the chip implementation. Thus, a virtual prototype enables true hardware-software co-design and co-verification. The shift toward a concurrent hardware- and software-development flow fundamentally addresses the problems that are created by the increasing software content in today's wireless products.
This article explores a mixed-level modeling methodology and the tools required to build a virtual prototype. Such a methodology provides critical data to the designers up front in the development process. It helps them make key decisions about the architecture early on. This article also presents a verification solution that enables designers to catch potential functional differences between the virtual prototype and the actual hardware while that hardware is being developed. That verification maximizes the confidence that the software developed on the virtual prototype will also run on the hardware model and on the final hardware itself.
Today's typical advanced wireless product utilizes a wide variety of intellectual-property (IP) blocks to provide the functionality demanded by the market (FIG. 1). Such blocks include digital signal processors (DSPs), processor cores, complete subsystems, modems, and multimedia blocks. These products also incorporate a slew of interfaces ranging from USB to Bluetooth and Wi-Fi. On the software side, standard operating systems need to be extended to incorporate new hardware capabilities. Meanwhile, new applications need to be developed and validated.
Building a virtual prototype entails building a software model of the target product and its corresponding high-level testbench. Typically, high-level functional blocks, which are related to the various IP blocks, are used along with models for their interconnect fabric. In the virtual prototype, the structure of the interconnect models is important for simulation performance.
From a software perspective, real-time requirements need to be merged with application needs under the control of a real-time operating system (RTOS). In order to be able to answer critical questions concerning the performance of the architecture, the virtual prototype must model the detailed timing behavior of the interconnect fabric (typically on a cycle-by-cycle basis). Very fast and functionally correct models may strip away too much of the timing behavior to provide such answers. On the other hand, accurate cycle-by-cycle modeling of the interconnect fabric may have a severe impact on the overall simulation performance.
Essentially, designers need a solution that lets them choose the level of detail that will be used to model the interconnect fabric. The required details will depend on the answers that they hope to uncover by simulation. If there are questions about the interaction between the application processor and the modem subsystem, for example, that part of the chip needs to be modeled at a detailed cycle-accurate level. Yet other parts can remain at a functional level.
From a software perspective, all of the chip's functionality is available in the virtual prototype regardless of which blocks use detailed models and which use functional models. Users can maximize simulation speed by keeping most of the models at an abstract functional level, while the detailed information needed to answer key questions about a particular interaction remains available.
The mixed-level modeling methodology and supporting tools allow designers to build a virtual prototype of the target design. They can connect it to the stimulus/response environment (testbench), load a set of software executables (one for each microprocessor and DSP), and simulate at very high speeds. The tools should allow the virtual prototype to be configured easily for different types of measurements and development needs.
A software developer, for instance, usually wants to maximize performance. He or she would therefore model the interconnect fabric purely on a functional level. In contrast, a system architect might want to measure bus utilization and other aspects of the architecture's performance. Such measurements might require modeling the interaction between the application processor and DSP in a cycle-accurate fashion.
One of the keys to mixed-level modeling is that the same stimulus environment (testbench) and software can be used to simulate the design—regardless of the configuration of the virtual prototype and the abstraction levels that are used to model the target system. In the solution shown in FIG. 2, the individual blocks of the virtual prototype would typically be modeled on an abstract functional level in C or C++. They would then be connected through SystemC (FIG. 3). The external interfaces are modeled on an abstract functional level to communicate with the testbench.
This modeling style works for most designs because at the system level, designers aren't interested in the details of individual IP blocks. These blocks are typically reused from previous projects or acquired from an IP provider. Instead, designers are interested in creating functionally accurate data streams in the testbench. They want to explore how these data streams are handled by the virtual prototype while looking at what impact they have on a common bus or other resources. For example, the details of the USB block aren't critical. But getting pictures downloaded through the USB interface and measuring performance is.
Modeling the external interfaces like USB at a high level of abstraction ensures the speed required by software developers. It also allows them to incorporate real-world data streams. Designers can actually "borrow" the host workstation hardware, such as the USB interface, and connect it to the USB port in the virtual prototype. Real-world data sources and sinks, such as displays, can then be connected to the virtual prototype. In addition, they can be used during software development or architectural analysis.
At the heart of this mixed-level modeling technology is the connection of functional models to cycle-accurate models of the interconnect fabric. For instance, Synopsys and Virtio have developed a proprietary wrapping technology that allows a high-level function call to be annotated with cycle-count information. That function call can then be mapped to a series of calls to the SystemC application programming interface (API) of the cycle-accurate interconnect model. Simulations using IP blocks that are modeled in this manner are cycle approximate rather than cycle accurate. Yet, when such blocks are connected to a cycle-accurate bus, they yield information specific enough to support architectural decisions.
Such solutions also address another key requirement for software development: the ability to interface and work with common software debuggers. Because software developers have different debugger preferences, a mixed-level verification environment has to provide interfaces to multiple software debuggers. In addition, these interfaces have to operate together with the debuggers in the hardware and verification domains.
Once a virtual prototype has been defined and handed off to a software-development team, it is of key importance that the actual hardware being developed remains consistent with this virtual prototype. Clearly, a lot of software work could be wasted if the hardware diverged from the prototype.
Together, the software and the testbench form a "golden" verification environment to test user-definable configurations of the target system's mixed-level model. This model then gets refined and gradually replaced by RTL representations of the various blocks. At this point, the same verification environment can and should be used to ensure consistency between the high-level model and the RTL model of the target system.
For performance reasons, it's likely that not all software can be run on the RTL model. However, the verification team can carefully craft a set of tests covering all of the functionality that should run and pass on the following: a high-level functional model, a more detailed architectural model, and the design's RTL model. Such a set of tests constitutes a regression suite. This suite will spot any divergence between the RTL model and the virtual prototype used by the software developers to validate and develop their applications.
The approach described in this article establishes a true concurrent hardware- and software-development flow. It ensures that by the time a hardware prototype can be created, the software is available to be downloaded and tested for integration. All of the critical components have been analyzed up front. In addition, the consistency between the hardware and software has been monitored throughout the process. As a result, the final integration phase becomes merely a validation step with a very high probability of first-time success.
The rapidly increasing software content in today's wireless devices requires the availability of a virtual prototype early in the project timeline. The benefits are straightforward:
- The team can start the software-development tasks well before real hardware is available, shaving many months off the project-development cycle.
- Real system and application software can be leveraged in making the key architectural decisions. This aspect ensures that performance and power goals are met.
- A virtual prototype is a lot cheaper and easier than a hardware prototype to duplicate and distribute to a large number of software developers.
The challenges that are faced with the virtual-prototype methodology are twofold. First, modeling technology must strike the right balance between speed and accuracy. In addition, verification technology must ensure that during the course of the project, the hardware implementation does not functionally diverge from the virtual prototype.
This article has shown how these challenges can be overcome through a mixed-level modeling methodology that's tightly integrated with an advanced verification solution. The verification solution enables designers to catch potential functional inconsistencies between the hardware and the virtual prototype. It therefore enables accurate hardware-software tradeoffs early in the design cycle. It also allows software developers to use a virtual prototype with the confidence that the software will be compatible with the actual hardware.