An expert viewpoint brought to Electronic Design by Agilent Technologies, Inc.
I’ve been around long enough to remember when a good set of hand tools was all you needed to repair virtually any car. Today, without a diagnostic computer you often can’t even identify problems, much less fix them. I’ve also been writing about the electronics industry long enough to remember when a scope and logic analyzer allowed you to debug virtually any electronic circuit, even those with CPUs, by watching activity on a processor’s control and data buses. As with cars, though, we’ve lost visibility into today’s complex systems. Luckily, some new tools are appearing to address these issues. Pointing out such significant trends in the test business, those that will have a major impact on the way you work and the types of problems you can tackle, is the purpose of this monthly column.
To start off, I’d like to address issues with a device that has brought untold advantages to circuit designers: the FPGA, which has made the system-on-a-chip concept available to low-volume designs. In fact, roughly two-thirds of today’s sophisticated designs, primarily where engineers seek product differentiation through hardware, use FPGAs. Further, these devices have taken a leadership role along with memories in terms of process technology; some of them have feature sizes of 90 nm and are getting quite large, today reaching 500 million transistors. The obvious downside of this high integration is lost visibility. How can you see what’s going on when a circuit isn’t functioning?
Clearly, you can probe only a limited number of nodes within an FPGA. A key question is how many pins you can spare as “probe leads.” Also, don’t forget the time you must spend determining which nodes will reveal the best debug information. Then you must design the logic so those nodes come out to external pins, attach a logic analyzer, and hope that this design change doesn’t alter any critical timing paths that might affect overall system operation. Oh, and don’t forget to label each trace on the analyzer so you can remember what you’re looking at.
This debug process is iterative, and it’s often based on intuition more than anything else. Do you really expect, on the first try, to find those exact signals that isolate what’s not working? Heck, no! Depending on the circuit, the chip, and your skills, it could take many iterations of modifying the device to uncover the problem—and watch out that those critical timing paths don’t change along the way.
One way to increase accessibility is by inserting multiplexers in the device by hand, a concept smart designers have picked up on. From a single pin they can look at multiple internal nodes before a redesign is necessary. Also, for the past several years, you’ve been able to embed rudimentary logic-analyzer cores in a chip. It’s a viable concept when no pins are available for real-time debug or when you want to get detailed information at one specific point.
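The hand-inserted mux idea can be sketched in a few lines. This is a hypothetical behavioral model, not real HDL or any vendor’s core: the node names and the 4-to-1 fan-in are illustrative assumptions. Select lines steer one of several internal signals onto a single spare debug pin, so the pin can be “re-pointed” at a different node without rerouting the design.

```python
# Behavioral sketch of a hand-inserted debug multiplexer (illustrative only).
# In hardware, `select` would be driven by spare pins or a configuration
# register, and `nodes` would be internal FPGA signals tapped for debug.

def debug_mux(select, nodes):
    """Model an N-to-1 debug mux: route one internal node to the debug pin.

    select -- integer value of the mux select lines (0 .. len(nodes)-1)
    nodes  -- current logic values of the tapped internal nodes (0 or 1)
    """
    return nodes[select]

# Example: four internal nodes (say, state bits and handshake lines)
# share one debug pin; select line value 2 picks the third node.
internal_nodes = [0, 1, 1, 0]
pin_value = debug_mux(2, internal_nodes)   # the analyzer sees node 2
```

The payoff is exactly what the column describes: changing which node you observe becomes a matter of toggling select lines rather than re-synthesizing the design.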
But wouldn’t it be great to have all the resources of a desktop analyzer yet be able to probe all around an FPGA? We’re getting real close thanks to products such as the FPGA Dynamic Probe. It expands on the by-hand multiplexer concept: the debug core lets you access up to 64 internal signals through each debug pin. Further, you can change probe points from the logic-analyzer interface without making any changes to the FPGA design, and signals and buses are labeled automatically from the chip into the benchtop analyzer. With this virtual probing technology, there’s less risk in selecting nodes and perhaps no need to change a design at all.
One realistic concern is how many FPGA resources such a mux core eats up, and it’s surprisingly little. Brent Przybus, Product Marketing Manager at Xilinx, tells me that each signal to probe adds roughly one slice, approximately four LUTs, and uses no block RAM. Another limitation is that this technology is available today only on selected devices, but look for more on the horizon. In fact, Brent says that in addition to being optimized for Virtex-II Pro, it’s also compatible with Xilinx’s brand-new Spartan-3 technology.
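To put that cost in perspective, here’s a back-of-the-envelope estimate using the figure quoted above (roughly one slice per probed signal). The 13,000-slice device size is an assumed round number for a mid-size FPGA of that era, not a vendor specification.

```python
# Rough debug-core overhead estimate (assumptions noted in comments).

signals_probed = 64        # one fully loaded debug pin, per the column
slices_per_signal = 1      # the figure quoted by Xilinx above
device_slices = 13_000     # ASSUMED size of a mid-range device

overhead = (signals_probed * slices_per_signal) / device_slices
print(f"Debug-core overhead: {overhead:.2%}")
```

Under these assumptions the core consumes well under one percent of the device, which is why leaving it in place through debug is a reasonable trade.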
When designing an FPGA, engineers typically spend half their time iterating through the design cycle to fix bugs. Nobody can predict how much time dynamic probing will slash from the total, but it will be considerable. I also must agree with Galen Wampler, an industry analyst at Prime Data, who notes, “Traditionally, semiconductor technology has driven the electronics industry, and there’s been a delay between devices and test tools for them. This is the first serious parallel collaboration between semiconductor and test companies, and it’s only the start of a major trend.”
For a free on-line video demonstration on dynamic probing technology, visit