Transaction-based verification isn't a new concept. It has been practiced for several years, driven mostly by the need to generate comprehensive test sequences that cope with the unrelenting increase in SoC complexity.
Conceptually, the idea is simple. Tests are written at a high level of abstraction, and the conversion from high-level commands to bit-level signals is moved into a dedicated entity called a transactor. In this way, a designer can focus on creating complex testbenches and ignore the tedious bit-level details. Once created, transactors can be reused. Typical examples include transactors for communication protocols such as Ethernet, USB and PCI/PCI-X, as well as for memory accesses, JTAG ports, and even digital cameras and LCDs.
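To make the division of labor concrete, here is a minimal sketch of the idea in C++. The class and signal names are hypothetical, invented for illustration: the test issues one high-level command (a memory write burst), and the transactor expands it into the per-cycle bus values that would otherwise have to be hand-coded in the testbench.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical bit-level bus signals the transactor drives each cycle.
struct BusSignals {
    bool     valid;  // command strobe for this cycle
    uint32_t addr;   // byte address on the bus
    uint32_t data;   // write data for this cycle
};

// Sketch of a transactor: one high-level call expands into a
// cycle-by-cycle sequence of bit-level signal values.
class MemWriteTransactor {
public:
    // High-level command: "write these words as a burst starting at addr".
    std::vector<BusSignals> writeBurst(uint32_t addr,
                                       const std::vector<uint32_t>& words) {
        std::vector<BusSignals> cycles;
        for (uint32_t w : words) {
            cycles.push_back({true, addr, w});  // one bus cycle per word
            addr += 4;                          // next word address
        }
        cycles.push_back({false, 0, 0});        // idle cycle closes the burst
        return cycles;
    }
};
```

The testbench author sees only `writeBurst`; the cycle-accurate expansion, and any protocol rules it must obey, live inside the transactor and can be reused across designs.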
It all sounds good, but there's a problem: the approach generates massive numbers of test cycles that bog down the simulator. After all, the simulator still executes the transactor's conversion from high-level commands into bit-level signals. It's not uncommon to run a profiler and discover that the testbench consumes more than 50% of the execution time, with the remainder taken by the design under test (DUT).
Hardware-assisted verification platforms such as emulation systems or fast prototyping platforms can come to the rescue, making it possible to reach execution speeds of several megahertz. This is done by mapping the back-end section of the transactor (the part that converts high-level commands into bit-level signals) onto the hardware platform. Testbenches are written in C++ to drive the front-end section of the transactor. This approach offers a speedup of five or six orders of magnitude over simulation-based verification.
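The front-end/back-end split can be sketched as follows. This is an illustrative model only, with invented names; in a real flow the channel would be a co-emulation transport (such as an SCE-MI-style message port) to the emulator, and the back end would run in hardware. The point is that the C++ testbench exchanges only compact messages, so no bit-level activity burdens the workstation side.

```cpp
#include <cassert>
#include <cstdint>
#include <deque>

// Compact message that crosses to the hardware platform.
struct Message {
    uint8_t  opcode;  // which transaction type
    uint32_t addr;
    uint32_t data;
};

// Stand-in for the untimed link to the emulator; a real flow would use
// a co-emulation transport here, not an in-process queue.
class Channel {
    std::deque<Message> q;
public:
    void send(const Message& m) { q.push_back(m); }
    bool receive(Message& m) {
        if (q.empty()) return false;
        m = q.front();
        q.pop_front();
        return true;
    }
};

// Front-end proxy called by the C++ testbench. It only packs and ships
// messages; the bit-level expansion happens in the hardware-mapped
// back end on the other side of the channel.
class PciProxy {
    Channel& ch;
public:
    explicit PciProxy(Channel& c) : ch(c) {}
    void configWrite(uint32_t addr, uint32_t data) {
        ch.send({0x1, addr, data});  // 0x1 = hypothetical config-write opcode
    }
};
```

A testbench simply constructs a proxy over the channel and calls `configWrite`; everything cycle-accurate stays on the hardware side.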
The corollary is that it's now possible to build a virtual test environment instead of an in-circuit emulation (ICE) test bed, accomplished by replacing a set of speed bridges (Ethernet, PCI or USB) with an equivalent set of transactors. In general, an ICE test bed serves one specific design and isn't reusable on another; transactors, in contrast, can be reused on any design.
In the ICE approach, designers must contend with the whims of real hardware behavior, sacrificing repeatability. Conversely, a virtual test environment based on transactors lets designers reproduce a problem quickly. Furthermore, freed from connecting the DUT to a physical test bed, designers can drive the DUT from remote locations.
Equally important, because the design clocks are no longer sourced by a hardware test bed, the designer gains full control over them, which makes debugging easier and more efficient. By controlling the clock, it's possible to stop the model, read memory contents, force registers or dump waveforms. Debugging in an ICE setup, by contrast, requires hardware logic analyzers. More important, no DRAM or LCD screen refresh constrains the virtual environment, so the clocks can be stopped at will.
With a PCI transactor, it's possible to hook a PCI software driver up to an emulated design that includes a PCI interface, much as a hardware speed bridge does in an ICE setup. Likewise, a software debugger can be connected to the emulator via a JTAG transactor and run in step-by-step mode, something that isn't possible over a physical JTAG connection.
Once designers experience transaction-based verification on a hardware-assisted platform, their whole verification perspective changes. Being able to quickly set up a powerful test environment unfettered by cumbersome ICE hardware means easier and more effective debugging. The goal is the same, better design in less time, but getting there is now far less challenging.