Tracking down bugs in complex digital systems poses mighty challenges for design engineers, especially considering the rocketing speeds of microprocessors, buses, and other elements in these systems. To meet these challenges, manufacturers of logic analyzers are constantly improving the hardware and software of these sophisticated instruments.
Hardware improvements tend to make logic analyzers faster, wider, and deeper. Faster refers to an increase in state and timing speeds. Wider refers to an increase in the number of channels. Deeper refers to an increase in memory depth. Software improvements often enhance the instrument's usability.
We checked in with the two behemoths of the logic-analyzer domain, Tektronix Inc. and Agilent Technologies Inc., to find out how these instruments are improving, changing, and otherwise evolving to meet the demands of powerful new technologies. Pentium 4, PCI-X, double-data-rate (DDR) RAM, and Infiniband are just a few of these technologies that come to mind.
The key aspect of high-end logic-analysis systems, like the Tektronix TLA 700 series and the Agilent 16700 series, is their modularity (Fig. 1). Designers add modules to their systems whenever they need more channels or require the functionality that comes from adding an oscilloscope or pattern generator to the system. If more slots are necessary, an expansion chassis can be added to the base system. As you might guess, these logic analyzers come with a high price tag.
Why would a designer add more channels to the system? Extra channels are one way to get an instrument with a 200-MHz state speed to analyze, for example, a bus running at 266 MHz or faster. Logic analyzers handle high-speed buses through a combination of state speed and channel count. In fact, if you want to know how much data an analyzer can handle, just multiply its state speed by its number of channels, a specification usually referred to as data bandwidth.
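The arithmetic behind these two ideas can be sketched in a few lines. This is an illustration only: the function names and the 136-channel figure are invented for the example, not taken from either vendor's specifications.

```python
# Illustrative sketch of "data bandwidth" and of trading channels for speed.
# Names and the 136-channel figure are hypothetical, chosen for the example.

def data_bandwidth(state_speed_mhz: float, channels: int) -> float:
    """Aggregate capture rate in Msamples/s: state speed times channel count."""
    return state_speed_mhz * channels

# A hypothetical 200-MHz, 136-channel module:
print(data_bandwidth(200, 136))  # → 27200.0 Msamples/s

# Demultiplexing: dedicating two analyzer channels to each bus signal halves
# the rate each channel must sustain, so a 200-MHz analyzer can follow a
# 400-MHz bus at the cost of half its channel count.
bus_speed_mhz = 400
channels_per_signal = 2
print(bus_speed_mhz / channels_per_signal)  # → 200.0 MHz per channel
```

The second calculation is the "electronic magic in the front end" mentioned below: the front end fans each fast signal out across several slower acquisition channels.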
There's more to it, of course. Usually, the logic analyzer performs some electronic magic in the front end, such as multiplexing, to make everything work correctly. The overall trend, though, is the slow but steady march of state speed and channel capacity in logic analyzers. Last May, for instance, Agilent announced three new state- and timing-analysis modules for its 16700 series that have a state speed of 400 MHz. On the other hand, Tektronix introduced its TLA7XM expansion mainframe about a year ago (Fig. 2).
Adding this single expansion unit to the TLA 700 increases the number of logic-analyzer channels of the combined system to 2176, while the maximum number of channels possible is a whopping 8160. This large number of channels is especially needed when acquiring and analyzing data from multiple processors and buses in a single system, and it addresses the complex verification challenges associated with these high-end systems.
Riding hot on the heels of faster and wider is the feature known as deeper, meaning greater memory depth. Both Agilent and Tektronix claimed gains in this area last year. Agilent's new 16752A state and timing module, for example, offers a memory depth of 32 Msamples, while Tektronix's recently announced TLA7Q2/4 acquisition modules bring the memory depth of the TLA 700 Series up to 64 Msamples from only 16 Msamples (Fig. 3).
Greater memory depth is valuable, for instance, when working with communications protocols. The larger the memory, the better the analyzer can capture entire frames or packets of data.
There's another reason why manufacturers are constantly improving the memory depth of logic analyzers. The advent of simulators, which enable a lot more work to be done prior to the actual prototype, together with the process improvements in the levels of integration—meaning smaller and smaller transistors and larger and larger chips—are producing more complex designs. As Colin Shepard, general manager of the logic analyzer product line at Tektronix, puts it, "The good news is that we can even contemplate doing that complex of a design because we have better simulation tools. But the reality is, the problems that show up on the prototype are that much harder to find."
Due to the high levels of integration, when a problem occurs, its cause actually took place a while ago. This is why designers require greater and greater memory depth. "It just seems like there's an insatiable appetite for that. The greater the memory depth, the further backward in time designers can look from the trigger on that glitch to see what the real cause was, and see what sequence of events led up to the problem," Shepard remarks.
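How much further back a deeper buffer lets a designer look is simple arithmetic: capture window equals memory depth divided by sample rate. The figures below reuse numbers from this article; the function name is invented for the illustration.

```python
# How far back in time a capture buffer reaches: depth / sample rate.
# Function name is hypothetical; the figures come from the modules above.

def capture_window_s(depth_samples: float, sample_rate_hz: float) -> float:
    """Seconds of activity one channel records before the buffer fills."""
    return depth_samples / sample_rate_hz

# 64 Msamples captured at a 400-MHz state speed:
print(capture_window_s(64e6, 400e6))  # → 0.16, i.e. 160 ms of history

# The older 16-Msample depth at the same rate gives only 40 ms:
print(capture_window_s(16e6, 400e6))  # → 0.04
```

Quadrupling the depth quadruples the window, which is why the appetite for memory seems insatiable: the root cause of a glitch may sit far upstream of the trigger.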
Another lively area of innovation in logic analyzers is triggering. The traditional logic analyzer trigger is kind of an "if-then-else" type. Unfortunately, this method of triggering doesn't work for modern packetized buses. For these buses, it's not an "if-then-else," but more like, "if source destination equals x, and timeout equals y, then trigger."
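The contrast can be made concrete with a small sketch. All names here (the `Packet` fields, the trigger function) are invented for illustration; real analyzers implement this in hardware on decoded protocol fields rather than in software.

```python
# Hypothetical sketch of a protocol-style trigger condition:
# "if source/destination equals x and timeout equals y, then trigger."
# Field and function names are invented for this illustration.

from dataclasses import dataclass

@dataclass
class Packet:
    source: int
    destination: int
    timeout: int

def protocol_trigger(pkt: Packet, dst: int, timeout: int) -> bool:
    """Fire on decoded packet fields, not on a raw sequence of bus states."""
    return pkt.destination == dst and pkt.timeout == timeout

stream = [Packet(1, 5, 0), Packet(2, 7, 3), Packet(1, 7, 3)]
hits = [p for p in stream if protocol_trigger(p, dst=7, timeout=3)]
print(len(hits))  # → 2
```

The designer states the condition in the protocol's own vocabulary; the instrument, not the user, translates it into sequence levels.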
As Greg Peters, marketing manager for oscilloscopes and logic analyzers at Agilent Technologies, points out, "That's how customers want to debug their problems. We have to provide the interface that works in that way. It used to be, and still is for lots of customers, that the measure of triggering was how many levels do you have. And what customers thought was, 'If I want to look at this special protocol that I have, it looks like I will need 16 levels.' Or more commonly, they thought, the more levels, the better. It's still important, but customers want to express their triggers differently now. They don't want to express them in sequence levels; they want to express them in the protocol language that they're working with. At Agilent, we call that protocol-aware logic analysis."
Another interesting challenge related to triggers is the tendency of digital signals to look increasingly like analog signals as frequencies surge ever higher. Tektronix's Shepard points to the Rambus challenges that designers have in the industry. "A lot of new analog issues came up with this technology," he says. "Yet, they were building a digital system. Those digital signals are really analog signals. Designers have to look at them in both domains, meaning good correlation is necessary with typical tools that capture analog signals, like oscilloscopes."
Shepard notes, "Our user interface can correlate analog and digital data. And we've put a cross-triggering mode into our scopes and our logic analyzers so that the instruments can cross trigger each other and capture the same information. The engineer can look at what's on the scope screen and on the logic analyzer screen and know that it's the same information, the same period of time. That kind of functionality is important now and will be even more important in the future."
A Demand For Simplicity
Usability is another important issue with logic analyzers and one that's continually being addressed. "What we're seeing is an interesting blend of customers needing incredible simplicity for some tasks," Peters says. "They literally don't want to learn the instrument. I think this is something that all instrument vendors are learning. What we find is that customers say, 'I don't have any time to learn this. I just need to use it. I'm going to find the instrument that looks easiest to use.' On the other hand, of course, if you're buying high-end performance that you will be using on a day-to-day basis, then you want something that has depth to it."
The challenge for companies like Agilent and Tektronix is to put in the right hooks to meet the application needs of their customers. Peters says that Agilent is doing more customization of interfaces, kind of akin to a protocol analyzer. "You want the instrument to look like your application as opposed to looking like a general-purpose test tool," he says. He calls this an increasing trend.
An interesting usability feature recently incorporated into Agilent logic analyzers is a technology called "eye-finder." To make reliable state measurements, a logic analyzer must sample stable data. As bus speeds increase, the time when data is stable decreases, so adjusting the setup and hold window of the analyzer becomes more critical. Jitter, skew, and pattern-dependent intersymbol interference also make the window smaller. Eye-finder technology solves this problem by automatically adjusting the setup and hold window on each logic-analyzer channel.
Another area where logic analyzers are making great strides, but still have far to go, is in their ability to integrate with other vendors' software development tools. Typically, software engineers first purchase a software-development environment. Then they pick a vendor and sort of hitch their development team to that wagon. All the software engineers on the team use the same tools. After they develop their software, and it's time to turn on the prototype, they start thinking about logic analyzers. They don't want to change their development environment, but rather want the logic analyzer to work with it.
Tektronix's Shepard says, "It's important for logic-analyzer suppliers to make sure that they can support a number of software-development environments—vendors' tools. The tools must be able to take data out of the logic analyzer and into that software-development environment so engineers can look at it in the environment they're comfortable with." Shepard sees this as a future trend, because today's logic analyzers aren't doing this quite as well as they could to benefit customers. "But we're moving in that direction, and taking action to do that," he says. "Tektronix's intent is to support all of the major software tools vendors."
Using their normal development tools, software engineers would be able to look at certain data from the logic analyzer and have the tool turn it into source code. They would be using their normal debuggers just as they did when they were developing the software—before the prototype was available. They could "step" and insert break points in the debugger, for example. But some of the data would be coming from the logic analyzer, being disassembled and connected to and correlated with source code.
"Tektronix is engaging proactively with multiple software tool vendors," Shepard says. "And we're doing collaborative engineering with their engineering teams and making sure that we can get our tools to work together better over time."
Beyond what happens with the hardware and software within a logic analyzer, there's the very important external piece—the probe. Development of specialized probes mirrors the development of new technologies in the computer and communications industries. For instance, Tektronix recently announced a new support package for the Pentium 4 processor. The package includes a hardware "interposer" board, software setup files, and clocking instructions, as well as software disassembly and analysis tools.
FuturePlus Systems, a third-party manufacturer of probes for Agilent logic analyzers, has many products that essentially turn the logic analyzer into a specific type of bus analyzer. The FS2330 DDR probe is an example (see "Probing Problems With DDR SDRAM," p. 74).
"There's a real collaboration there," explains Peters of Agilent. "What FuturePlus or other vendors provide is really that application-specific physical layer and sometimes the software layer that goes with it. We have to provide a general-purpose probing connection for them to connect up to. We design that. We pioneered a lot of the work on passive probing some time ago and we continue to refine passive probing."
Looking to the future, Peters thinks bus speeds will pose enormous challenges for designers and instrument manufacturers. "There's this rocket going up on speed right now," he remarks. "Infiniband comes to mind as one that's running at a 2.5-GHz clock rate. Probing needs for Infiniband are just out of this world, and the speed is expected to double or quadruple soon. Infiniband is kind of a point-to-point bus internal to computer architectures. I imagine that it might bubble over into switch architectures. So bus speeds essentially went from PCI 133 to 2.5 GHz, with nothing in between. Following this progression has been a real challenge for us, and we see that continuing."
Peters feels fortunate that Agilent is already using RF techniques, and that the logic-analyzer group can continue to leverage RF and other high-speed methods from other organizations within Agilent to help make those measurements. "We're really starting to look at the whole business as an RF-type of measurement," he says. "Not an RF-type of measurement in terms of frequency, but in the design of the front end of the instrument. It's looking much more like a network or spectrum analyzer. Almost by design, it has to move in that direction. If you walk back to our lab, our guys are using network analyzers for signal-path analysis on pc boards. We have to do that to stay ahead of our customers who are, to some extent, doing that as well. I expect that more of this will have to happen in the future given the speeds that we're operating at."
Of course, logic analysis isn't just for designers of high-end, high-performance computing and communications systems. Nor is it only for companies that can afford high-end logic-analysis systems. A fair number of digital designers are working on embedded applications for the transportation, automotive, consumer electronics, industrial, and other industries. While high-end designers are trying to wring the most out of technology in terms of performance, the others are just trying to get something done with the electronics. So, it's more about functionality and price.
Logic-analyzer manufacturers develop instrument technology for the high end. But over time, they migrate that down to the lower end, which Tektronix calls the single-application designer. "They don't necessarily need slower equipment," Shepard says. "They're running fast. If you look at setup and hold times of devices, they're all in the 5-ns range now. So anybody who's doing digital design has those kinds of setup and hold times with which they need to contend. They all need subnanosecond timing resolution and good signal-integrity analysis whether they're working on a single application or multiple applications. We have to offer high-resolution timing and signal-integrity analysis at all price points."
Tektronix delivered subnanosecond timing at a moderate price when it introduced its TLA 600 about a year ago. Agilent offers its 1670 series benchtop instruments to address the requirements of designers who need this combination of performance and price.
Because embedded systems encompass a wide range of performance possibilities, there's room for one other kind of logic-analysis tool, the PC-hosted or PC-based logic analyzer. These tools often target budget-minded designers. For instance, Agilent's LogicWave is about half the price of its benchtop instrument. Agilent doesn't compete against Tektronix in this realm, but rather against other companies, found in the Manufacturers Of Logic Analyzers listing in this article.
Paolo Xausa of SofTec Microsystems describes his company's PC-based analyzers as "far from being high end, especially where sampling speed is concerned. But they're inexpensive and offer—built-in—a number of serial analysis features, including the RS-232, RS-485, SPI, and the I2C-Bus, which can considerably speed up the debugging process of embedded hardware/software applications."
An attraction of these kinds of systems is their ease of use. "The LogicWave has what we call single-screen operation," says Agilent's Peters. "With our other logic analyzers, designers have to go through setup, trigger screens, and so on. In LogicWave, everything is on one screen. It's a breakthrough in ease of use, which is what customers really prefer about that product."
Manufacturers Of Logic Analyzers
Agilent Technologies Inc.
Borge Instruments Ltd.
Hitachi Denshi America Ltd.
Link Instruments Inc.