Software-Defined Test Finds Broader Adoption

April 6, 2010
Thanks to technological developments in bus bandwidths, FPGA technology, and multicore processing, software-defined test is finding its way into scores of new applications in automated RF testing.

Throughout its steady climb into the mainstream, software-defined instrumentation has represented a paradigm shift away from traditional box-style instruments, which are defined largely by their hardware and the inaccessible software they run, toward instruments defined less by their hardware and much more by software that is firmly under user control.

The traditional box instrument, designed and purpose-built for a specific measurement task, includes highly specialized hardware that’s optimized for the task at hand. In a software-defined instrument, the hardware is fairly generic, performing some class of I/O; the measurement functionality comes from the algorithm that processes the raw data streaming from that I/O.
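
To make the distinction concrete, here is a minimal sketch of the idea in Python: the I/O is generic (simulated here with synthetic samples standing in for a digitizer fetch), and two different “instruments” are nothing more than two different algorithms applied to the same raw record. The sample rate and test signal are illustrative assumptions, not tied to any particular hardware.

```python
# Illustrative sketch (not vendor code): the "instrument" is just raw samples
# plus whichever algorithm we choose to run on them. A real system would pull
# the samples from a digitizer driver; here we synthesize them.
import numpy as np

SAMPLE_RATE = 100e6          # 100 Msamples/s, assumed digitizer rate
N = 4096

def acquire_raw_samples():
    """Stand-in for a digitizer fetch; returns one record of raw data."""
    t = np.arange(N) / SAMPLE_RATE
    return 0.5 * np.sin(2 * np.pi * 10e6 * t) + 0.01 * np.random.randn(N)

def measure_rms(samples):
    return np.sqrt(np.mean(samples ** 2))

def measure_peak_frequency(samples):
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(N)))
    return np.fft.rfftfreq(N, d=1 / SAMPLE_RATE)[np.argmax(spectrum)]

raw = acquire_raw_samples()
# Same raw I/O, two different "instruments" defined purely in software:
print(f"RMS voltage:    {measure_rms(raw):.4f} V")
print(f"Peak frequency: {measure_peak_frequency(raw) / 1e6:.2f} MHz")
```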

“Essentially, the function of the instrument is defined in the firmware,” says Richard McDonell, automated test senior group manager at National Instruments. Further, with the advent of FPGAs, users can embed their own firmware in their software-defined instrument. Thus, they can define the instrument at the hardware level rather than at the software level.

In this article, we’ll look into how software-defined instruments are being used today and examine some of the growth areas. But first, consider what they cannot do. While they’re becoming more valuable, software-defined instruments will never displace the traditional benchtop test instrument. In general, when you need a measurement quickly, the traditional bench instrument is still the best way to go.

“We don’t say software-defined instruments are the perfect solution for everything. For design and validation, the traditional instrument is still a very valuable asset on the bench,” says McDonell.

When you must meet the fast-changing needs of an automated test scenario, though, the flexibility of software-defined instrumentation is a tremendous benefit. Overall, software-defined instrumentation is aimed at the automated test market.

THE BIG PICTURE

Looking at software-defined instruments from a broad perspective, one would immediately note that they’re not a new idea. Major players, such as National Instruments, have promoted the concept since the late 1970s. Early on, the products were typically associated with general-purpose data-acquisition tasks.

Yet by the late 1990s, the PXI standard brought a new focus on automated test to the software-defined instrumentation market. In going beyond data acquisition, software-defined instrumentation spread across the spectrum of digital multimeters, digitizers, oscilloscopes, arbitrary waveform generators, and other instruments. It eventually encompassed all of the elements one would find in a complete automated test system.

More than 1500 PXI instruments are on the market, and PXI remains the predominant platform for software-defined instrumentation. Systems can be built around any of the various flavors of the PXI standard.

BREAKOUT AREAS

Some of the biggest breakthroughs in software-defined instruments of late are in the RF arena. In the early days of software-defined instruments, the size and signaling requirements of an automated system militated against their use in RF testing scenarios. But today, PXI is having a major impact, and a fundamental transformation is taking place in automated RF test.

Software-defined systems are being used in production test for standards such as WiMAX, GSM/EDGE, WCDMA, Long-Term Evolution (LTE), multiple-input multiple-output (MIMO), and 802.11n. They’re also finding use in many other RF applications that aren’t necessarily standards-based but are nonetheless important, such as spectral monitoring. Others include particularly challenging test scenarios involving MIMO systems (see “Addressing Multi-Channel RF Test Challenges”).

The sweet spot for RF applications is at frequencies up to the 6-GHz range, which encompasses most common applications. A good deal of work is going on now at higher frequencies for radar and similar applications. For example, National Instruments is collaborating with Phase Matrix on a 26.5-GHz instrument in a 3U PXI form factor that’s set to be released later this year.

RF in production test is seeing a major transformation toward an automated approach. Test engineers are moving away from call processing and other time-intensive tests for the physical layer, which, they are finding, often does not require an expensive call-processing box.

Rather than using such equipment to make an actual physical call with the final-production version of a handset, for instance, they can use what some term “just-enough” testing with automated equipment to ensure that the physical layer works properly. This saves not only time but also a good deal of money compared to the cost of the call-processing box.
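
As a rough illustration of what a “just-enough” physical-layer check might look like, the sketch below captures one transmit burst (simulated here) and verifies a single scalar metric against limits. The capture routine, reference level, and limit values are all hypothetical placeholders.

```python
# Hedged sketch of a "just-enough" physical-layer check: instead of placing a
# real call, capture one transmit burst and verify a scalar metric against
# limits. The capture routine and limit values are hypothetical.
import numpy as np

SAMPLE_RATE = 20e6               # assumed IQ sample rate of the analyzer
POWER_LIMIT_DBM = (12.0, 20.0)   # hypothetical pass band for burst power

def capture_iq_burst():
    """Stand-in for a vector signal analyzer capture; returns baseband IQ."""
    n = 8192
    tone = np.exp(1j * 2 * np.pi * 1e6 * np.arange(n) / SAMPLE_RATE)
    return 2.0 * tone + 0.02 * (np.random.randn(n) + 1j * np.random.randn(n))

def burst_power_dbm(iq, full_scale_dbm=10.0):
    # Mean power relative to an assumed full-scale reference level.
    return full_scale_dbm + 10 * np.log10(np.mean(np.abs(iq) ** 2))

iq = capture_iq_burst()
power = burst_power_dbm(iq)
passed = POWER_LIMIT_DBM[0] <= power <= POWER_LIMIT_DBM[1]
print(f"Burst power {power:.2f} dBm -> {'PASS' if passed else 'FAIL'}")
```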

Another big growth area for software-defined instruments is semiconductor test, where PXI has had huge wins. Some of these have been in production, but most have been in the validation, verification, and chip-characterization phases.

“Where we’re seeing PXI and software-defined instrumentation in semiconductor production test is in mixed-signal parts, such as Analog Devices and its MEMS devices,” says McDonell. These and other devices are not purely digital. In addition, systems-on-a-chip (SoCs) have increasing amounts of RF circuitry onboard that cannot be tested by traditional digital test methods.

On the semiconductor characterization side, PXI-based automated systems are finding broader usage. Engineers have been able to use software-defined systems to shorten characterization cycles from 10 weeks to a matter of hours by employing more automation.
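
A hedged sketch of what such an automated characterization sweep can look like appears below. The driver calls are placeholders for whatever PXI instrument API the bench actually uses, and the corners and measured quantity are invented for illustration.

```python
# Minimal sketch of an automated characterization sweep. The instrument calls
# (set_supply_voltage, set_chamber_temperature, measure_gain) are hypothetical
# stand-ins for real driver calls; the measurement here is simulated.
import itertools, csv, random

VOLTAGES = [1.1, 1.2, 1.3]           # supply corners, volts
TEMPERATURES = [-40, 25, 85]         # chamber setpoints, deg C
FREQUENCIES = [0.9e9, 1.8e9, 2.4e9]  # test frequencies, Hz

def set_supply_voltage(v):      pass     # placeholder driver call
def set_chamber_temperature(t): pass     # placeholder driver call
def measure_gain(freq_hz):      return 15.0 + random.uniform(-0.5, 0.5)

with open("characterization.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["voltage_V", "temp_C", "freq_Hz", "gain_dB"])
    for v, t, freq in itertools.product(VOLTAGES, TEMPERATURES, FREQUENCIES):
        set_supply_voltage(v)
        set_chamber_temperature(t)
        writer.writerow([v, t, freq, measure_gain(freq)])
```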

Hardware-in-the-loop applications, in which embedded software is tested in the context of the complex devices on which it runs, are yet another area in which software-defined instrumentation is growing. Many systems and subsystems have grown in complexity and are packed with embedded firmware: engine control units, medical devices, consumer electronics, and much more. Also, watch for emerging applications for testing in green-energy/smart-grid niches such as solar panels and wind turbines.
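
For a flavor of hardware-in-the-loop testing, the following sketch has the test system play the role of the plant: it drives a simulated sensor value into the controller and checks the commanded output each cycle. The I/O calls and the pass criterion are hypothetical stand-ins, not any real automotive interface.

```python
# Rough hardware-in-the-loop sketch: feed simulated sensor values to the
# controller under test and check its commands each cycle.
# write_sensor/read_actuator are hypothetical I/O stubs.
import time

def write_sensor(rpm):                 # placeholder analog-output call
    write_sensor.last = rpm

def read_actuator():                   # placeholder analog-input call
    # Pretend the controller cuts fuel above 6000 rpm.
    return 0.0 if getattr(write_sensor, "last", 0) > 6000 else 1.0

failures = 0
for rpm in range(1000, 8001, 500):     # sweep simulated engine speed
    write_sensor(rpm)
    time.sleep(0.01)                   # give the controller one cycle
    fuel_cmd = read_actuator()
    expected = 0.0 if rpm > 6000 else 1.0
    if abs(fuel_cmd - expected) > 1e-6:
        failures += 1
print("HIL sweep:", "PASS" if failures == 0 else f"{failures} FAILURES")
```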

A final growth area is video test. Software-defined instrumentation is finding applications in the testing of new video standards, including high-definition video. After decades of little change and a strictly analog bent, the world of video has been churned up by the advent of digital technologies. These digital video standards demand test setups that are more flexible, provide greater performance, and are optimized for automation.

COLLABORATION IN THE WORKS

Software-defined instrumentation is now a large enough market that it’s become attractive to at least one of the larger makers of traditional test equipment. To that end, National Instruments and Tektronix are collaborating on development of what is expected to be the industry’s fastest PCI Express (PCIe) digitizer. This instrument will leverage the analog-to-digital front end from Tektronix’s high-end oscilloscopes. Specifications will include bandwidth of more than 3 GHz, sample rates of over 10 Gsamples/s, and data throughput of greater than 600 Mbytes/s.

“This will be a fairly high-performance digitizer leveraging the Tek technology in a small form factor,” said NI’s McDonell. By duplicating the capability of a traditional scope in a PXI form factor, the instrument will give users a highly capable digitizer that’s optimized for the automated test market.

PCIe’s IMPACT ON PERFORMANCE

Even though software-defined instrumentation has been around for some time, it continues to evolve and enjoy the fruits of technical innovations. Some of these developments include modern instrumentation-optimized bus technologies, FPGA-based instrumentation, and multicore technologies. Each of these technologies brings something to the party, but when combined, they enable higher-performance testing through parallelism (Fig. 1). We’ll take a deeper look at each of these innovations in turn.

One of the more recent developments that has had a huge impact on instrument performance is the advent of the PCI Express standard. Bandwidth and latency are critical from a performance standpoint for automated test setups. The emergence of PCI Express as a high-bandwidth data bus with low latency has been a boon for integrators of software-defined instrumentation.

“If you don’t have best-in-class bus performance, test performance suffers,” says McDonell. “That’s why PCI was such a good bus and why PCI Express is even better.” Not only does PCI Express afford a huge boost in dedicated bandwidth, it also removes PCI’s limitation of being a shared bus.

All of that has been integrated into the PCI eXtensions for Instrumentation (PXI) family through PXI Express, which offers both PCI and PCI Express signaling on the backplane. Consequently, a PXI Express chassis can host both PCI-based and PCI Express-based instruments on a single backplane.

Some vendors, such as National Instruments, already have products that are ready for the second generation of PCI Express, which was announced by the PCI Special Interest Group (PCI-SIG) in 2007. Whereas the first generation of PCI Express, dating to 2004, supported data rates of up to 250 Mbytes/s per lane, PCI Express 2.0 doubles that, supporting rates up to 500 Mbytes/s per lane.

As a result, a 32-lane PCI Express connector (x32) can support aggregate throughput of up to 16 Gbytes/s. Expect an announcement in the second quarter of 2010 on final specifications for PCI Express 3.0, which will carry a bit rate of up to 8 Gtransfers/s (with per-lane data rates from 750 to 800 Mbytes/s). That will represent a nice performance boost over PCI Express 2.0, which sports a bit rate of 5 Gtransfers/s.
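
Those per-lane and aggregate figures are easy to sanity-check. The short calculation below assumes the 8b/10b encoding used by the first two PCI Express generations:

```python
# Quick back-of-the-envelope check of the per-lane and x32 numbers above.
# Gen1/Gen2 use 8b/10b encoding, so usable bytes/s = line_rate * (8/10) / 8.
GENERATIONS = {          # line rate in transfers per second
    "PCIe 1.x": 2.5e9,
    "PCIe 2.0": 5.0e9,
}

for name, line_rate in GENERATIONS.items():
    per_lane = line_rate * (8 / 10) / 8          # bytes per second per lane
    x32 = per_lane * 32                          # 32-lane aggregate
    print(f"{name}: {per_lane / 1e6:.0f} Mbytes/s per lane, "
          f"{x32 / 1e9:.0f} Gbytes/s for x32")
# PCIe 1.x: 250 Mbytes/s per lane,  8 Gbytes/s for x32
# PCIe 2.0: 500 Mbytes/s per lane, 16 Gbytes/s for x32
```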

The PCI Express standard’s support for multiple generations of technology means that instruments remain hardware- and software-compatible across generations. Users won’t have to redesign software or change hardware designs in their test setups, other than implementing support for the faster clock rate, when successive generations arrive.

MULTICORE MAKES ITS ENTRANCE

Another technical innovation that is making a big difference in the performance of software-defined instrumentation is the advent of multicore processors. Such processors are essentially standard in all sorts of test equipment today. National Instruments offers PXI embedded controllers with a quad-core processor on one chip. These controllers can route individual tasks to different cores, which enables interesting applications with virtualization: half of the cores can run a virtual operating system (OS) while the others support a standard Windows user interface.
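
The throughput benefit comes from parallelizing work across those cores. The sketch below uses Python’s multiprocessing module as a generic stand-in for however the test software actually distributes per-record analysis; the “analysis” itself is just a placeholder FFT on simulated captures.

```python
# Simple sketch of spreading measurement analysis across cores; a multicore
# PXI controller benefits the same way when the test executive parallelizes
# per-record or per-site work.
import numpy as np
from multiprocessing import Pool

def analyze_record(record):
    """CPU-heavy per-record analysis (FFT-based spur search as a stand-in)."""
    spectrum = np.abs(np.fft.rfft(record))
    return float(spectrum[1:].max())

if __name__ == "__main__":
    records = [np.random.randn(1 << 16) for _ in range(32)]   # fake captures
    with Pool() as pool:                  # one worker per available core
        worst_spurs = pool.map(analyze_record, records)
    print(f"Worst spur across {len(records)} records: {max(worst_spurs):.1f}")
```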

A final innovation that has lifted the performance of software-defined instruments is multicomputing or, as it’s sometimes known, peer-to-peer computing (Fig. 2). One of the features of PCI Express as a bus technology is its ability to perform point-to-point data transfers.

With the original PCI standard, transfers had to be routed back through the host processor, which imposed additional overhead and delays. PCI Express allows one to, for example, route data directly from a digitizer to an FPGA module in the adjacent slot. Or, one can initiate a peer-to-peer transfer from one instrument to another while taking advantage of the bus’s high data rates and low latency.

Thus, the point-to-point transfer capabilities of the PCI Express bus open up a whole world of co-processing that enables large FPGAs to be dedicated to certain instruments. The test engineer can program those FPGAs with whatever algorithm they need at a given time. “This is the nirvana of software-defined instrumentation, not only allowing custom software at the application level on the PC but also at the data level on the instrument,” says NI’s McDonell.

In a multicomputing scenario, a system may consist of multiple processors either within the same system or dispersed across multiple systems communicating over a high-speed PCI Express link. This enables two PCI Express systems, each with its own quad-core processor, to communicate with each other over a cabled PCI Express link.
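
The sketch below is purely conceptual and uses made-up stand-in classes rather than any real driver API. It only contrasts the two data paths: a host-mediated copy in the PCI style versus a configured peer-to-peer stream from a digitizer to an FPGA coprocessor.

```python
# Conceptual sketch only: the calls below (create_p2p_stream, etc.) are
# hypothetical stand-ins, not a vendor API. The point is the data path:
# host-mediated copies versus a direct digitizer-to-FPGA stream.
class Digitizer:
    def read(self, n):               # host-mediated path: data lands in host RAM
        return bytes(n)

class FpgaCoprocessor:
    def write(self, data):           # host then pushes it back out to the FPGA
        pass

def host_mediated_transfer(digitizer, fpga, n):
    data = digitizer.read(n)         # PCI-style: two trips across the bus
    fpga.write(data)

def create_p2p_stream(source, sink):
    """Hypothetical peer-to-peer setup: after this, samples flow from the
    digitizer endpoint straight to the FPGA endpoint over PCI Express,
    without touching host memory."""
    return ("p2p-stream", source, sink)

digitizer, fpga = Digitizer(), FpgaCoprocessor()
host_mediated_transfer(digitizer, fpga, 4096)     # legacy pattern
stream = create_p2p_stream(digitizer, fpga)       # PCIe point-to-point pattern
print("Configured:", stream[0])
```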

THE DATA BOTTLENECK

Now that you have all that data streaming in, how do you process it in a timely way? The bottleneck has been how to transfer that data between computing nodes and how to divide up the processing. In the automated test world, National Instruments’ LabVIEW software is one solution to this vexing problem. With LabVIEW, test engineers can write code much as they would when targeting a data-acquisition board, and LabVIEW translates that code into the FPGA’s native format. Otherwise, the engineer would have to learn how to program in VHDL or Verilog to access the FPGAs.
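
One general pattern for keeping up with the stream, independent of any particular vendor software, is to decouple acquisition from analysis so the two overlap. A minimal producer/consumer sketch, with acquisition simulated by random data, looks like this:

```python
# One common answer to the bottleneck: a producer/consumer pipeline keeps the
# bus and the CPU cores busy at the same time. Sketch only; acquisition is
# simulated with random data.
import queue, threading
import numpy as np

records = queue.Queue(maxsize=8)
N_RECORDS = 50

def producer():                      # acquisition loop (stand-in for a fetch)
    for _ in range(N_RECORDS):
        records.put(np.random.randn(1 << 14))
    records.put(None)                # sentinel: acquisition finished

def consumer():                      # analysis loop runs concurrently
    processed = 0
    while (rec := records.get()) is not None:
        np.abs(np.fft.rfft(rec))     # placeholder for the real measurement
        processed += 1
    print(f"Processed {processed} records")

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads: t.start()
for t in threads: t.join()
```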

NI’s reconfigurable instruments, dubbed FlexRIO (Fig. 3), further enhance flexibility. These FPGA-based instruments extend the software-defined approach, enabling users to define the software onboard the instrument itself. These instruments give users a standard means of interfacing to a PCI Express backplane on the chassis.

Onboard are a large Xilinx FPGA and a generic digital front end, which consists of digital data lines routed directly to the front panel. This lets either NI or end users attach various adapter modules to these digital pins into the FPGA. Users then can define any function they want, be it an analog-to-digital converter, a high-speed digitizer, or whatever their application calls for. NI also offers an adapter-module developer’s kit with the schematics, tools, and files required to design custom adapter modules.
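
As an example of “any function they want,” the snippet below is a host-side numpy model of a digital down-converter (mix to baseband, low-pass, decimate), the kind of algorithm a user might ultimately push into the FPGA itself. The rates and decimation factor are arbitrary illustrative choices.

```python
# Host-side model of a function a user might define on the FPGA: a simple
# digital down-converter. In a real FPGA flow this logic would be compiled to
# hardware; here it is just an illustrative numpy sketch.
import numpy as np

FS = 100e6            # assumed ADC sample rate
F_CARRIER = 20e6      # signal of interest
DECIMATION = 10

def ddc(samples, fs=FS, f_lo=F_CARRIER, decim=DECIMATION):
    n = np.arange(len(samples))
    baseband = samples * np.exp(-1j * 2 * np.pi * f_lo / fs * n)   # mix down
    kernel = np.ones(decim) / decim                                # crude low-pass
    filtered = np.convolve(baseband, kernel, mode="same")
    return filtered[::decim]                                       # decimate

raw = np.cos(2 * np.pi * F_CARRIER / FS * np.arange(100000))
iq = ddc(raw)
print(f"{len(raw)} raw samples -> {len(iq)} baseband IQ samples")
```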

Complementing the FlexRIO instruments is LabVIEW FPGA, which engineers use to program the Xilinx FPGA. “In FPGAs, we saw a technology that would enable us to provide this kind of flexibility. But the number-one barrier to getting it adopted was that the majority of customers weren’t digital designers,” says McDonell. “There had to be some level of abstraction for targeting the FPGAs. This is a huge benefit for the designers.” The software also provides a way to import homebrewed intellectual property (IP) cores developed in VHDL onto the FPGA.

USING LEGACY EQUIPMENT

Even if a test-engineering lab has moved over entirely to a software-defined instrumentation setup, there inevitably will be some pieces that must be shoehorned into the overall solution. While the PXI standard covers some 80% of users’ requirements, some items just aren’t yet available in a PXI-compatible version. Or, a test team may have existing legacy equipment that it would like to put to use alongside the software-defined elements. It’s not uncommon to see systems pieced together from equipment that spans various bus architectures, including PCI, PXI, PCI Express, and older GPIB-based instruments. These hybrid systems can be fashioned to be more than serviceable in most applications.
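
A hedged sketch of such a hybrid setup is shown below: it uses PyVISA to send SCPI commands to a legacy GPIB multimeter while a placeholder function stands in for the modular digitizer’s driver call. The GPIB address and commands are examples only.

```python
# Sketch of mixing a legacy GPIB box into an otherwise modular setup using
# PyVISA. The resource address and SCPI commands are examples; the modular
# digitizer call is a placeholder for whatever PXI driver the system uses.
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")       # legacy benchtop DMM (example address)
print("Legacy instrument:", dmm.query("*IDN?").strip())

def fetch_pxi_waveform():
    """Placeholder for the software-defined digitizer fetch in the same test."""
    return [0.0] * 1024

dc_level = float(dmm.query("MEAS:VOLT:DC?"))      # SCPI query to the GPIB box
waveform = fetch_pxi_waveform()                   # modular instrument in the same sequence
print(f"DC level {dc_level:.3f} V, captured {len(waveform)} samples")
dmm.close()
```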

DON’T SWEAT LONGEVITY

Test engineers like to know that the approach they’ve committed to will be supported long into the future. There is a tendency in some quarters to look askance at software-defined approaches because of their foundation in commercial off-the-shelf (COTS) technology, which can be perceived as more subject to change than traditional instruments.

One thing to note is that most of the technologies on which software-defined instrumentation is based are not proprietary. The bus technologies such as PCI Express and PXI are standards-based and have a community of developers continuing their evolution. The buses are all mainstream technologies and have been in the marketplace for a good number of years. Multicore technology is not proprietary to any particular microprocessor house, nor is FPGA technology. So, there is little risk of building a test system around a proprietary technology or standard and having it orphaned somewhere down the road.

Furthermore, the basis of software-defined systems in COTS technology can be seen as beneficial, especially in terms of modularity and upgradeability. Any component can be upgraded at any time in a simple swapout. The same can be said for the processors that are the basis for the instruments. For instance, the Pentium II-based controller that shipped with a given system can easily be swapped out for a quad-core controller when performance requirements dictate.
