As shown in the figure, although system visibility has dropped steadily over time, integration continues to increase. Only recently has visibility rebounded to some extent. The first major shift in product development came when microprocessor-based systems began to replace systems built from discrete components. As microprocessors invaded the electronics landscape, it was no longer possible to examine every signal of interest by probing board traces with a scope or logic analyzer. Instead, designers turned to the in-circuit emulator (ICE), an instrument built around a bond-out chip: a special version of the processor that provided access to signal nodes not visible in production chips. (For simpler processors, it was sometimes possible to surround a production device with a sea of logic and perform the same function.) With debugger software and additional hardware in the emulator, developers could monitor the activity of on-chip buses, watch instructions being executed, and set breakpoints that halted the processor so they could examine the contents of registers and memories.
ICE-based tools gave designers adequate visibility until four trends converged to change the tools landscape. First, increasing integration, and the architectural complexity it enabled, drove the number of nodes required on the bond-out chip to the point where building the chip and connecting it to a system became unmanageable.
Second, the number of different chips multiplied as specialized processors emerged, increasing the number of bond-out chips needed. This was an expensive proposition for chip suppliers, because every bug had to be fixed in two places: in the regular production chip and in its bond-out counterpart.
Third, logic added in the ICE to share the bond-out chip's buses with system logic started to significantly alter chip performance characteristics. As clock rates climbed, gate delays in series with bond-out signals caused timing to exceed production-chip specifications. It wasn't uncommon for a system to work fine with the production version of the chip but fail when the engineer inserted the bond-out chip or ICE connection.
Fourth, and very important, was packaging technology. As pin counts doubled and packaging transitioned from dual-in-line to higher-pin plastic-leaded-chip-carrier (PLCC) and pin-grid-array (PGA) packages, the emulator's target connection became both expensive and unreliable. This combination of timing and packaging issues meant the development tools themselves introduced electrical problems, and their bulky physical connections prevented their use in systems with fully populated card racks.
When the ICE solution became ineffective, development tools made their second big leap by placing emulation control logic on production chips. The bond-out chip and traditional emulator pod were replaced with a combination of a production chip and an external emulation controller. Using serial-scan methods and dedicated I/O pins, the controller managed program execution and read and wrote chip memory and registers. The cost of this approach was low enough that every production chip could carry the necessary pins and gates, eliminating the need for complex emulation connections.
When we pioneered this technology in 1988, it was viewed as a bold step: real-time visibility was traded for production-chip-compatible timing and freedom from interconnection woes. The rest of the industry recognized the advantages of the new approach (some faster than others) and broadly adopted it. Chip vendors were thrilled that the era of bond-out chips had passed. Developers were disappointed that visibility was reduced, but relieved that the development tools were no longer a source of system problems. This on-chip debug logic could run or halt a program with breakpoints or watchpoints, step through code, perform memory reads and writes, handle real-time interrupts, and support some basic software profiling. Developers controlled these actions with a PC-based debugger that communicated with the on-chip debug logic through a JTAG port (an industry-standard test access port). Logic analyzers monitored system activity at the chip pins in an attempt to restore some of the visibility lost with this generation of tools.
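To make the serial-scan control concrete, the sketch below models the standard IEEE 1149.1 (JTAG) test access port (TAP) controller state machine. The debugger advances the TAP one state per clock by driving the TMS pin; reaching the Shift-DR or Shift-IR state lets it clock command and data bits serially into and out of the chip. This is an illustrative simulation only, not any particular vendor's debug implementation.

```python
# Model of the IEEE 1149.1 (JTAG) TAP controller state machine.
# Each entry maps a state to (next state if TMS=0, next state if TMS=1).
TAP_TRANSITIONS = {
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR",    "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR",      "Exit1-DR"),
    "Shift-DR":         ("Shift-DR",      "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR",      "Update-DR"),
    "Pause-DR":         ("Pause-DR",      "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR",      "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR",    "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR",      "Exit1-IR"),
    "Shift-IR":         ("Shift-IR",      "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR",      "Update-IR"),
    "Pause-IR":         ("Pause-IR",      "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR",      "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def step_tap(state, tms_bits):
    """Clock the TAP through a sequence of TMS values, one bit per TCK."""
    for tms in tms_bits:
        state = TAP_TRANSITIONS[state][tms]
    return state

# Five clocks with TMS high reach Test-Logic-Reset from any state.
state = step_tap("Shift-DR", [1, 1, 1, 1, 1])
assert state == "Test-Logic-Reset"

# From reset, TMS = 0,1,0,0 lands in Shift-DR, ready to scan data bits.
assert step_tap(state, [0, 1, 0, 0]) == "Shift-DR"
```

The key property the emulation controller exploits is that this entire protocol needs only a handful of dedicated pins (TCK, TMS, TDI, TDO), which is why every production chip could afford to carry it.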
We’re now reaching the limits of the third-generation tools, so another major leap in debug technology is a must. The complexity of systems on a chip, coupled with the risk of a killer bug that takes months to find, is forcing the reintroduction of real-time visibility into the software developer’s portfolio. Thanks to dramatic gains in gate density and huge improvements in packaging technology, visibility logic can again be accommodated on production devices. Hence, integration may turn out to be an ally in the short term.
Chip suppliers, recognizing their customers’ problems, have risen to the occasion, understanding that third-generation solutions simply don’t have the horsepower to nail today’s really tough system problems. Some have abandoned simple third-generation aids, such as the shallow program-counter discontinuity stacks deployed as early as the late 1980s. Instead, these suppliers now provide continuous program-counter and timing trace at full processor clock rates, battling the laws of physics along the way. Memory-reference trace has generally been tackled at the same time.
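The reason discontinuity records are so attractive for trace compression is that between taken branches, execution is sequential and can be reconstructed offline. The sketch below shows the idea under simplifying assumptions: the function name, the (source, target) record format, and the fixed 4-byte instruction size are illustrative choices, not any vendor's actual trace format.

```python
# Hypothetical sketch: rebuilding a full program-counter trace from
# compressed discontinuity records. The trace port emits only the
# (source PC, target PC) pairs of taken branches; between records,
# execution is assumed to be straight-line.
def reconstruct_trace(start_pc, end_pc, discontinuities, step=4, limit=10000):
    """Expand discontinuity records into the full sequence of executed PCs."""
    trace = []
    pc = start_pc
    pending = list(discontinuities)  # ordered (branch source, branch target)
    while limit:
        limit -= 1                   # guard against malformed records
        trace.append(pc)
        if pending and pc == pending[0][0]:
            pc = pending.pop(0)[1]   # taken branch: jump to recorded target
        elif pc == end_pc:
            break
        else:
            pc += step               # straight-line execution
    return trace

# Two taken branches: 0x108 -> 0x200, then 0x204 -> 0x110.
trace = reconstruct_trace(0x100, 0x114, [(0x108, 0x200), (0x204, 0x110)])
assert trace == [0x100, 0x104, 0x108, 0x200, 0x204, 0x110, 0x114]
```

A shallow on-chip stack of such records overflows quickly on branch-heavy code, which is exactly why the continuous, full-clock-rate trace ports described above became necessary.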
MCU suppliers are leading the charge in restoring visibility. The visibility-hungry, control-oriented nature of their applications created a compelling business case for visibility-enabled tools earlier than in the DSP arena. Some DSP chip suppliers are hotly pursuing this capability, while others have yet to accept the challenge. DSP suppliers’ willingness to deploy these leading-edge visibility technologies will help separate the industry’s leaders and winners from the followers.