Looking Toward The Testbench Of The Future

Dec. 21, 2010
The test bench is evolving toward greater integration, more bandwidth, higher sampling rates, and modularity.

Figures: Bandwidth requirements; MSO5000 oscilloscope; 53200A series; Probing technology; Compliance testing

If you’re a typical user of test gear, you’ve most likely been looking at the same old equipment for a few years now. In these tough economic times, it’s not surprising that upgrades of potentially big-ticket items like oscilloscopes, spectrum and network analyzers, and other testbench staples have been put on the back burner. But while you’re hunkered down with your old warhorse scope, the industry’s test-equipment vendors are readying themselves for when you will need to upgrade.

And whether you like it or not, you will need to upgrade eventually. If your current design project doesn’t outstrip your equipment’s bandwidth or frequency range (Fig. 1), your next one probably will.

In the not-too-distant future, your testbench will be populated with equipment with greater bandwidths and broader frequency response. It will be a tightly integrated testbench, either because you’ll have instruments that pack multiple forms of functionality within one box, or because the standalone instruments will be able to share data and measurement setups.

And, it’s not unlikely that you’ll at least consider a modular setup that will allow you to swap functionality in and out, enabling quick on-the-fly changeovers for application-specific testing requirements.

Due to the economic crunch, many test engineering groups are operating under tight budgetary constraints. Thus, when they look to upgrade the bench, they will want to add new measurement capabilities and not simply replace what they already have. What will a new instrument do for them that the previous generation couldn’t?

The challenge for the industry's test and measurement vendors will be to add new capabilities and increase the performance of their offerings without piling on bells and whistles that bring little value. Moreover, test engineers want these new capabilities without the instrument becoming difficult to use, or difficult to relearn after it has sat idle for a while.

According to Scott Stever, general-purpose instrument planner for Agilent Technologies’ Systems Products Division, there are two broad groups among test and measurement customers. One is the “classic” customer, whose environment is reasonably stable and doesn’t change much over time. The other group is on a fast technology track and is trying to keep up with the end-product complexity curve.

“In the digital world, this may be driven by data rates and clock speeds. In the communications world it might be dealing with new standards,” says Stever.

Much of the world is still evolving from traditional analog communications to more current digital methods, with the cellular handset on the bleeding edge of that evolution. But in any communications realm, designs are moving from simple analog, scalar modulation methods to more complex digital and vector modulation techniques.


As a result, design teams are wrestling with these revolutions in their end products, which can be fairly expensive to implement even though the end product becomes more cost-effective for the consumer.

Not too long ago, however, the measurement tasks associated with a changeover to digital communications were the domain of extremely expensive, highly specialized instruments. Today, these challenging measurements can be made using general-purpose instruments.

Revolution on the testbench
Gina Bonini, technical marketing manager at Tektronix, says the time is right for a revolution on the testbench. "Our customer research shows us that not much investment has been made in the bench instrument area for a while. A lot of instruments we saw on customers' benches were fairly out of date with old connectivity. Their scopes were missing some of the features you take for granted today," she says.

A major trend that will accelerate in 2011 and beyond is integration of the testbench, and it will take place in several senses: tighter connectivity between instruments on the one hand, and physical integration of multiple instruments in the same box on the other. This shift toward integration encompasses both the hardware and software sides of the equation.

One ongoing concern for test engineers is data logging. How do you connect the test instruments to the PC and transfer data? Traditionally, designers have relied on Microsoft Office applications such as Excel to deal with such issues, especially when it comes to the collection and organization of data and the creation of reports.

Improved instrument connectivity will result in more automation of these kinds of tasks. Further, connectivity makes it much simpler to perform jobs that require multiple instruments, such as filter characterization or the checking of transistor tolerances.

Both of these scenarios will be increasingly addressed using software such as National Instruments’ LabVIEW SignalExpress, which enables users to automate test sequences that span multiple instruments. Once the sequence is captured, it can be repeated easily, so the mundane aspects of characterizing those filters are programmed away and the likelihood of errors that hide in such repetition is vastly reduced.
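The same kind of sequence can be sketched in code. Here is a minimal illustration, assuming Python with the PyVISA library, made-up instrument addresses, and generic SCPI commands (APPLy, FREQ, MEAS:VOLT:AC?) standing in for whatever the actual generator and DMM accept:

import csv
import time

import pyvisa

# Minimal sketch: sweep a function generator and log a DMM's readings to CSV.
# VISA addresses and exact SCPI commands are illustrative; check each
# instrument's programming guide.
rm = pyvisa.ResourceManager()
fgen = rm.open_resource("USB0::0x0957::0x0407::MY44012345::INSTR")  # hypothetical address
dmm = rm.open_resource("TCPIP0::192.168.1.20::inst0::INSTR")        # hypothetical address

fgen.write("APPL:SIN 1000, 1.0")  # 1-kHz, 1-Vpp sine to drive the filter under test

with open("filter_response.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frequency_hz", "vrms_out"])
    for freq in (100, 200, 500, 1000, 2000, 5000, 10000, 20000):
        fgen.write(f"FREQ {freq}")                 # step the stimulus frequency
        time.sleep(0.2)                            # let the filter output settle
        vrms = float(dmm.query("MEAS:VOLT:AC?"))   # measure the filter's output level
        writer.writerow([freq, vrms])

fgen.close()
dmm.close()

Once captured, whether as a script like this or as a SignalExpress sequence, the sweep runs identically every time, which is exactly what removes the tedium and the transcription errors from the job.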

Consider a standards-compliance plugfest, where the same test sequence must be repeated with great precision many times. Without automation software, the setup for such events is extremely tedious. Performing the test sequences themselves can take hours. Going forward, test engineers can expect such situations to go much more smoothly.

Connectivity: out with the old
The connectivity landscape is changing as well, with the tried-and-true general-purpose interface bus (GPIB) slowly receding into the background as it is replaced by USB and/or LXI. The older, classic computer interfaces like GPIB served the industry well, but the future is in faster serial interfaces.


Having been developed in the early 1970s, GPIB now carries all the baggage of a 35-year-old interface. It’s well-known and understood, but it’s also very slow. Moreover, it cannot deliver modern connectivity solutions such as a Web interface.

“Most new applications in automated test have migrated to LAN (local-area network) technology as the preferred interface and away from GPIB,” says Agilent’s Stever. Now that it’s on the slippery slope toward obsolescence and eventual oblivion, GPIB has been relegated to the status of an optional interface.

Where and why this changeover occurs in instrumentation depends on the use case, though. “If you’re looking for fast production testing, that’s where things like PXI and modular instruments come in handy,” says Tektronix’s Bonini. On the other hand, in specialized test bays such as those in government/military applications, test engineers will be less likely to scuttle a certified setup using GPIB.

But in benchtop automation, LXI or USB comes in very handy. USB has the benefit of being a personal-area network, enabling you to plug instruments directly into a laptop without having to deal with IP addresses as with Ethernet (Fig. 2).

In the future, test engineers will have to weigh their own use case: it may be easier to set up the bench as a personal-area network over USB, or the existing LAN may lend itself to the use of LXI. Most test vendors now provide LXI compliance and Ethernet connectivity across their lines, while virtually all instruments come with USB connectivity, which for the foreseeable future will remain the easiest immediate connection option.
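Whichever interface a given bench settles on, the software side looks much the same beneath a VISA layer; only the resource string changes. A minimal sketch, assuming Python with PyVISA and made-up addresses:

import pyvisa

rm = pyvisa.ResourceManager()

# USB: plug-and-play on a laptop, no IP address to manage (made-up address)
scope_usb = rm.open_resource("USB0::0x0699::0x0401::C000123::INSTR")

# LAN/LXI: reachable across the network, including from a remote site (made-up address)
scope_lan = rm.open_resource("TCPIP0::192.168.1.42::inst0::INSTR")

for inst in (scope_usb, scope_lan):
    print(inst.query("*IDN?"))  # *IDN? is common to IEEE 488.2/SCPI instruments
    inst.close()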

Once you have a LAN interface, it opens up possibilities for instruments to have Web pages to provide a different user interface. With so much design being done on a global basis at physically disparate design centers, this will prove extremely valuable going forward. Through instruments’ Web interfaces, users will have access not only to images of the front panel, but also to the ability to remotely set up the instrument for measurements using a dialog-box-driven interface.

However, the use of such Web-interface capabilities now can be a challenge in most R&D environments. This is because many test groups’ IP infrastructure still puts up roadblocks to using LAN instrument connectivity in everyday benchtop use. Many engineers still fall back on the USB interface for R&D benchtop use, while LAN is more prevalent in automated test applications.

Test hardware is also moving to more modular instrumentation platforms such as PXI. With such platforms, users can create customized systems that directly address specific measurement requirements. Modular setups from vendors such as National Instruments and Agilent Technologies can be tightly integrated with close synchronization of timing between modules.

With PXI, it's relatively simple to generate signals and sample data simultaneously, with triggering set up so that each operation occurs at precisely the right time. The data can then be analyzed and processed after the fact.
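As a rough illustration of that kind of synchronization, the sketch below assumes the NI-DAQmx Python bindings (nidaqmx), hypothetical PXI module names, and a standard start-trigger terminal; the specifics depend entirely on the chassis and modules actually installed:

import math

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 1_000_000  # samples per second
N = 10_000        # samples per channel

# Hypothetical module names; substitute whatever the chassis reports.
with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("PXI1Slot2/ao0")
    ai.ai_channels.add_ai_voltage_chan("PXI1Slot3/ai0")

    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)

    # Start the acquisition off the generator's start trigger so both
    # modules run from the same event.
    ai.triggers.start_trigger.cfg_dig_edge_start_trig("/PXI1Slot2/ao/StartTrigger")

    stimulus = [math.sin(2 * math.pi * 1000 * i / RATE) for i in range(N)]
    ao.write(stimulus)                       # load the output buffer
    ai.start()                               # armed, waiting for the trigger
    ao.start()                               # generation begins and fires the trigger
    data = ai.read(number_of_samples_per_channel=N)
    # ...analyze and process 'data' after the fact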

Also on the rise is the use of software-defined instrumentation systems, which are moving toward higher sampling rates. Again, the trend is one in which multiple instruments are being integrated.


A better look at the data
Instrument displays and user interfaces will continue to evolve as well, with more instruments featuring colorful displays that go beyond basic readouts. With the exception of oscilloscopes, most instruments in the past have had simple numeric displays, which provide only a small window into the measurements being performed. From an ease-of-use perspective, users had to navigate more or less blind menu paths to set up their instruments for measurements.

Today's trend, and one that will only accelerate, is toward colorful graphical displays. This began several years ago with function generators gaining displays that allowed easier setup. The latest generation of such instruments features much larger graphical displays that give users a more familiar navigation model and more intuitive operation.

“Where possible you want to leverage operational models that are familiar, such as Windows-like drop-down menus that give users visibility into where they are and what their options are,” says Agilent’s Scott Stever.

Once an instrument has a graphical display, there are options for other types of data presentation. An example of the direction in which the industry is heading is Agilent’s 53200A series of frequency counters (Fig. 3). Rather than the traditional frequency-counter display of a simple frequency readout, these instruments have a chameleon-like graphical display that not only enables easier instrument setup and operation, but also presents data in various formats and styles that give users much greater insight into the meaning of measurement data.

The 53200A series counters can display trend charts that allow a view into measurement data over time. They also can bring up histograms of the data to highlight variation and jitter on signals. This is an example of how instruments will evolve to deliver much more insight into how circuits are behaving.
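Some of that insight can be approximated even with repeated single readings. The sketch below assumes Python with PyVISA, a made-up address, and CONFigure/READ? SCPI of the kind the 53200A family documents (verify against the programming guide); it gathers 200 frequency readings and summarizes their spread:

import statistics

import pyvisa

rm = pyvisa.ResourceManager()
counter = rm.open_resource("USB0::0x0957::0x1707::MY50001234::INSTR")  # hypothetical address

counter.write("CONF:FREQ 10E6")  # expect a nominal 10-MHz input
readings = [float(counter.query("READ?")) for _ in range(200)]
counter.close()

print(f"mean  = {statistics.mean(readings):.3f} Hz")
print(f"stdev = {statistics.stdev(readings):.3f} Hz")  # spread hints at jitter or drift

# Crude text histogram of the variation across the readings
lo, hi = min(readings), max(readings)
bins = [0] * 10
for r in readings:
    bins[min(int((r - lo) / (hi - lo + 1e-12) * 10), 9)] += 1
for i, count in enumerate(bins):
    print(f"bin {i}: {'#' * count}")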

Integration: one box, many functions
A major trend in test and measurement that shows no sign of tapering off is multiple measurement capabilities in a single box. The basic digital multimeter (DMM) has long been able to measure temperature, capacitance, and frequency in addition to voltage, current, and resistance. But DMMs are just the tip of the iceberg in terms of integration.

Oscilloscopes aren’t just scopes anymore. They’re logic and mixed-signal devices. Scopes have sprouted numerous add-on functions, such as protocol and jitter analysis. They have become the aggregation point to absorb some less frequently needed functions, a trend that increases the instruments’ value to the user in numerous ways, not the least of which is the reduction of testbench clutter.

The challenge that comes with this for the instrument manufacturers is that even as their products add more functionality, they must continue to be easy to use and operate. “Our customers’ job isn’t instrumentation,” says Agilent’s Stever. “It’s design and debug. The instrument is a tool to get that job done.”


The imperative for instrument manufacturers to maintain ease of use in their products is due to the reality of how they’re used. A given piece of test equipment isn’t used equally in all phases of a design/debug cycle. A protocol analyzer might be used quite often at certain points and then left unused for a stretch of time. The equipment’s maker doesn’t want users to have to pull out a manual when it’s powered up again. Hence the trend is toward instruments with a lot more built-in help to give users cues on what to do next.

In applications involving energy management, where dynamic measurements of power usage must be made, the trend is toward the use of source/measure units (SMUs), where tight integration is required between the source and the measurement unit itself.

The trend is also exemplified by today's arbitrary function generators, which now include pulse and arbitrary-waveform capabilities as well as other functions that in the past were found in digital instruments. Consequently, low-end pulse generators are essentially extinct as standalone instruments. Instrument makers can provide multiple functions within a single instrument thanks to smaller component sizes and the proliferation of specialized ASICs.

Look for the trend toward instrument integration within a single box to continue, with more introductions along these lines early in 2011 and throughout the year. One forthcoming introduction will integrate as many as four standalone instruments’ worth of functionality within a single chassis, yet another example of the ongoing trend toward integration.

The aforementioned 53200A series frequency counters from Agilent provide an example of how instruments are combining capabilities. Under the counters' hood is a measurement engine that only a few years ago would have been found in a modulation domain analyzer, a much more costly instrument. Along with that engine comes the ability to capture data at 10 times the resolution and 100 times the memory depth of Agilent's previous generation of benchtop modulation domain analyzers, at comparable sample rates.

Not long ago, Agilent also introduced the concept of an instrument along the lines of a dc power analyzer with elements of traditional multi-output power supplies but also with programmable arbitrary output amplitude. This enables users to program startup and shutdown sequences as well as steps between voltage levels. Data is displayed on screen either in scope-like views or in curve-tracer-type views, showing current or voltage versus time.

Thus, users gain an instrument that’s tuned for a particular set of applications without having to lash together disparate equipment and write a script to control them. It’s becoming more common for test manufacturers to shoulder the burden of this kind of integration.
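For comparison, here is a minimal sketch of that kind of startup sequencing done the old way, scripting a generic SCPI power supply from Python with PyVISA. The address, rail voltages, and dwell times are illustrative; an integrated dc power analyzer runs such sequences natively:

import time

import pyvisa

rm = pyvisa.ResourceManager()
psu = rm.open_resource("TCPIP0::192.168.1.30::inst0::INSTR")  # hypothetical address

psu.write("VOLT 0")
psu.write("OUTP ON")
for level in (0.9, 1.8, 2.5, 3.3):            # illustrative intermediate rails
    psu.write(f"VOLT {level}")                # step the output up
    time.sleep(0.05)                          # dwell 50 ms at each step
    amps = float(psu.query("MEAS:CURR?"))     # log the current drawn at each level
    print(f"{level} V -> {amps:.3f} A")
psu.write("OUTP OFF")
psu.close()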

One other sense in which integration will continue is in the physical domain, with instrument manufacturers making more of their product lines alike in terms of form factor. It’s understood that the test engineer’s workbench space is likely to be lacking, so more instruments will be built in a way that lends itself to stacking them vertically on the benchtop.

Advancing probing technology
Probing technology must evolve to keep pace with oscilloscope bandwidths. This applies to the amount of loading that probes impose on the circuits being tested as well as to their ability to physically make contact with ever-finer test points and IC pads.


Measurement accuracy begins with the probe, and its connection options affect its usability. One example of recent probing technology that points the way toward the future is Tektronix's latest passive voltage probes, which offer very low capacitive loading (Fig. 4).

When probing circuits, communication between the probe and the scope will be an increasingly important part of the test setup. For example, connecting a current probe will automatically cause the scope to display measurements in amperes. If the probe has an attenuation factor, the scope will automatically compensate for that scale.
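What that compensation amounts to is simple scaling. The sketch below assumes Python with PyVISA, a made-up address, and a probe-attenuation query typical of many scopes (command sets vary by vendor):

import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("USB0::0x0957::0x1799::MY53001234::INSTR")  # hypothetical address

atten = float(scope.query(":CHAN1:PROB?"))  # probe attenuation factor, e.g. 10.0 for a 10:1 probe
raw_volts = 0.33                            # hypothetical level seen at the scope input
print(f"Reported at the probe tip: {raw_volts * atten:.2f} V")
scope.close()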

“Much more of probing technology is being driven on the scope side of the equation,” says Agilent’s Scott Stever. “SMT (surface mount) technology keeps shrinking, and it’s harder to probe unless the circuit board has been designed with dedicated test points. So you have to plan ahead a lot.”

Expect to see further proliferation of automatic scope setup as well through what some designers have come to know as the “easy button.” Test engineers are being forced to deal with more domains of expertise, such as compliance test for various protocols.

On any given day, they may have to troubleshoot a USB 2.0 bus, an Ethernet interface, or an I2C or serial peripheral interface (SPI) bus. More instruments will feature "easy buttons" (Fig. 5), and a single press will automatically configure the instrument for the task at hand.

With interfaces such as USB 2.0 and Ethernet becoming ubiquitous, test engineers must be prepared on a daily basis to perform such specialized measurements. As a result, more instruments will offer built-in help screens and menus to explain the intricacies of these tasks and help the user choose the correct option so the measurement is made properly.
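In software terms, an "easy button" is essentially a canned setup routine. A minimal sketch, assuming Python with PyVISA and illustrative, vendor-flavored SCPI for I2C decode (real command sets vary by scope and should be checked against the manual):

import pyvisa

def setup_i2c_decode(scope, sda_chan=1, scl_chan=2):
    """One 'button press': configure serial-bus decode for I2C."""
    scope.write(":SBUS1:MODE IIC")                           # select I2C decoding
    scope.write(f":SBUS1:IIC:SOURce:DATA CHAN{sda_chan}")    # SDA source channel
    scope.write(f":SBUS1:IIC:SOURce:CLOCk CHAN{scl_chan}")   # SCL source channel
    scope.write(":SBUS1:DISPlay ON")                         # show the decode lane

rm = pyvisa.ResourceManager()
scope = rm.open_resource("TCPIP0::192.168.1.50::inst0::INSTR")  # hypothetical address
setup_i2c_decode(scope)
scope.close()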

Automating the testbench
Going a step beyond help screens and menus, automated test is becoming more prevalent on the benchtop as well. Automation in and of itself isn’t new, but what’s coming is a trend toward automation of system-level test.

“Devices are now systems and they don’t have direct test points on them anymore,” says Matthew Friedman, senior product manager for automated test at National Instruments. There are more wireless systems that require testing, such as wireless LANs and RFID systems, and a growing array of wireless protocols that must be debugged.

In addition, systems-on-a-chip (SoCs) have such dense packaging and so many intermediate signals between blocks that there is often no practical way to probe them. With the design community trending toward components that pack multiple levels of functionality, verification takes on new levels of complexity. It's no longer as simple as testing, say, a baseband chip for a wireless LAN; the links from that chip to other chips and buses must also be tested.

The future of the benchtop is moving to higher levels of abstraction. The idea is to model entire systems and build test systems that can exercise them using multiple flavors of mixed signals, whether analog, digital, or RF.

“Examples of this can be seen in hardware-in-the-loop testing and in protocol-aware testing,” says Friedman. “With the latter, our test instrumentation has to be able to perform handshaking with native protocols. From the software standpoint, you’d have to create an embedded system to do that kind of communication within your own test system.”

Along the same lines, a growing trend is to program FPGAs to excite the device under test (DUT). For example, to test a cellular handset, one can create an FPGA-based program to simulate the cell phone’s processor, which is used in turn to excite the TDMA chip inside the DUT. Similarly, from the software standpoint, expect to see higher levels of abstraction in programming those FPGAs for controlling test routines.

About the Author

David Maliniak | MWRF Executive Editor

In his long career in the B2B electronics-industry media, David Maliniak has held editorial roles as both generalist and specialist. As Components Editor and, later, as Editor in Chief of EE Product News, David gained breadth of experience in covering the industry at large. In serving as EDA/Test and Measurement Technology Editor at Electronic Design, he developed deep insight into those complex areas of technology. Most recently, David worked in technical marketing communications at Teledyne LeCroy. David earned a B.A. in journalism at New York University.
