Advances in test and measurement equipment and IC devices keep marching on in lock-step with each other—one can't do without the other. Look into a just-developed piece of test gear, and you're bound to find devices made on the latest process technologies. These include silicon-germanium (SiGe) logic ICs, the highest-resolution, highest-speed analog-to-digital converters (ADCs), ever-more powerful microprocessors, and state-of-the-art digital signal processing (DSP). On the other hand, these devices could not exist without advanced instruments to test and characterize them, turning raw performance into data-sheet numbers.
Oscilloscopes certainly are familiar test and measurement tools. But high-speed and broader-bandwidth communications applications now require equipment with greater dynamic range and less distortion. This is the arena where spectrum analyzers continue to dominate. Such analyzers now come with bandwidths up to 50 GHz, performing high-speed spurious-signal sweeps in just a few seconds. Increased automation and use of the latest ADCs and DSP chips are making spectrum analyzers even more accurate and easier to use.
Driven by growing demands for wireless communications, spectrum analyzers continue to shrink in size and weight for field-service applications. One particular example is the 100-kHz to 3-GHz MS2711B from Anritsu, which comes in a 4.9-lb package that measures a mere 10 by 7 by 2.4 in. Even handheld units like the 100-kHz to 3-GHz FSH-3 from Rohde & Schwarz (available from Tektronix in North America) are making their presence felt.
Although these instruments tend to be used for scalar measurements, recent developments have brought forth vector measurements. Vector analyzers, albeit with smaller bandwidths than spectrum analyzers, are aimed at applications that require rapid data acquisition and lengthy storage of digitized time-domain data. Handling both phase and magnitude signal information, they allow the complex analysis of signals that are digitally modulated in accordance with specific formats like QAM-64 (64-level quadrature-amplitude modulation).
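The distinction between scalar and vector measurement can be made concrete with a small sketch. The snippet below is a hypothetical illustration, not any instrument's API: it builds the 64-point QAM constellation and shows why a magnitude-only (scalar) measurement cannot distinguish symbols that a phase-aware (vector) measurement resolves.

```python
# Hypothetical sketch: why vector (magnitude + phase) analysis matters
# for 64-QAM. Each symbol carries 6 bits in its combined amplitude and
# phase; magnitude alone is ambiguous for many symbol pairs.
import cmath

# Build the 64-point constellation: I and Q each take 8 levels.
levels = [-7, -5, -3, -1, 1, 3, 5, 7]
constellation = [complex(i, q) for i in levels for q in levels]

def vector_measurement(symbol):
    """Return (magnitude, phase in radians), as a vector analyzer would."""
    return abs(symbol), cmath.phase(symbol)

# Two distinct symbols with identical magnitude: a scalar measurement
# confuses them; the phase term resolves the ambiguity.
a, b = complex(1, 7), complex(7, 1)
mag_a, ph_a = vector_measurement(a)
mag_b, ph_b = vector_measurement(b)
assert abs(mag_a - mag_b) < 1e-12   # same magnitude...
assert abs(ph_a - ph_b) > 1e-6      # ...different phase
print(len(constellation))           # 64 symbols -> 6 bits per symbol
```

A real vector analyzer captures long records of such complex baseband samples, which is why rapid acquisition and deep storage of time-domain data matter for these instruments.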
Increasing IC device complexities and the need to keep a lid on testing costs have irrevocably moved test and measurement instruments from the component level to a subsystem and systems level. It's no longer feasible to use test equipment to analyze complex ICs like systems-on-a-chip (SoCs) and complex programmable logic devices (CPLDs), which are becoming systems in their own right. The role of testing SoCs and CPLDs has now fallen to built-in self-test (BIST) and Design-for-Test (DFT) circuits, as well as large and expensive automatic test equipment (ATE).
BIST and DFT architectures have helped to cut down costly and lengthy testing, but they're not the total answer. Instruments aimed specifically at core BIST and DFT testing of SoCs also are under development to address the time and cost issues. The Validator 500 from Teseda Corp. is a good example of the SoC test equipment that will eventually prevail (see Electronic Design, "SoC Test System Speeds Design Verification," Sept. 30, 2002, p. 43). It allows the verification of DFT structures in an SoC, thus guaranteeing that the IC's testability features are correct.
As a result, test and measurement equipment is being increasingly called upon to interface with electronic-design automation (EDA) tools, as test gear undergoes a shift from a largely hardware-centric role to a software-centric role. In fact, traditional instruments may some day become limited in their capabilities, with EDA tools taking over much of this task.
One nagging question is when the largely hardware-centric test and measurement industry will pay more attention to developing software logic analyzers. Software has replaced hardware as the dominant quality variable, yet no major test and measurement manufacturer or emulator vendor has produced a good software-analysis tool to meet this challenge. Newer instruments like protocol analyzers, with deep memories and the ability to reconfigure triggering and filtering without prior knowledge of how bus signals are behaving, are one step in this software-centric direction.
A noticeable trend is the expanding market for test and measurement instrumentation well beyond the laboratory environment, which had been the mainstay of instruments for decades. Sectors like automotive, as well as optical and wireless communications, are now developing application-specific instruments. Many of them, though, leave a lot to be desired, particularly in the optical-testing arena, where designers are forced to rig their own equipment from basic in-house-developed test modules. The challenge of merging electrical engineering with optical engineering still remains.
The line between the instrument and the PC is blurring as more emphasis is placed on integrating test instruments with PCs. Instrument manufacturers are adapting to data-acquisition PC technology that runs under popular operating systems like Windows, connected through Ethernet and the Universal Serial Bus (USB). Some instruments can rightly be called specialized test computers, while some PCs are taking on analysis roles formerly reserved for test equipment. The development of application software like LabVIEW and other packages has allowed designers to substitute software-based "virtual" instruments running on standard low-cost PCs for expensive standalone test equipment.
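The virtual-instrument idea can be sketched in a few lines of code. The following is a minimal, illustrative example, not any vendor's product: a PC standing in for a spectrum analyzer by acquiring samples and computing a discrete Fourier transform in software. (A real implementation would use an optimized FFT library rather than this naive DFT.)

```python
# Minimal sketch of a software "virtual instrument": a PC acting as a
# rudimentary spectrum analyzer. Illustrative only; real virtual
# instruments (e.g., LabVIEW-based) use optimized FFTs and live hardware.
import cmath
import math

def dft_magnitudes(samples):
    """Naive O(n^2) DFT; returns per-bin magnitudes, normalized by n."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]

# Simulated acquisition: a pure tone centered on bin 5 of a 64-sample record.
n, tone_bin = 64, 5
signal = [math.cos(2 * math.pi * tone_bin * t / n) for t in range(n)]

spectrum = dft_magnitudes(signal)
peak_bin = max(range(n // 2), key=lambda k: spectrum[k])
print(peak_bin)  # the software "analyzer" locates the tone at bin 5
```

Swap the simulated signal for samples from a low-cost USB or Ethernet acquisition front end, and the PC's processor does the analysis work that once required a dedicated instrument.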
Where this trend will lead is anyone's guess. But the computational power of the computer and the analysis capabilities of an instrument are bound to create powerful test systems for the future. One thing is clear, however. Reprogrammable hardware devices like FPGAs are beginning to expand the range of programmable instruments of all kinds into virtual instruments powered by PCs.