Electronic Design

Throughput, Bandwidth, Usability, Cost Drive Instrumentation

While the test and measurement (T&M) sector of the electronics industry doesn’t see the rapid-fire advances of other technologies, it delivers solid improvements year after year. With its steady upgrades, T&M manages to keep pace with the latest electronic processes and standards. If that weren’t the case, engineers would be hamstrung in creating the latest and greatest products.

Major Trends in Test
Competitive factors force instrument manufacturers to improve accuracy and precision, enhance usability, add differentiating features, and address the latest test needs. As a result, most companies seem to be following several key trends:

The quest for improved performance: The demand for instruments to operate at higher frequencies never ends. Each year, we see an influx of higher sampling rates, wider bandwidth, and faster internal processors. High-speed serial-data buses, microwave wireless products, and larger/faster video demand more instrument performance than ever. And nearly all of them have a mixed-signal flavor, since more new products consist of a sophisticated mix of analog, digital, and RF circuitry.

An embedded PC base: At the heart of most of today’s instruments lies a built-in PC surrounded by a high-speed sampling front end and a hard drive on which to store captured measurements. In fact, what instrument today isn’t an embedded processor surrounded by the I/O needed to make specific measurements and display them? It’s a great platform that’s familiar to most users. An entire array of useful peripherals like printers, RAID drives, and the like can be easily added as well.

Eric Starkloff, director of product marketing at National Instruments, indicates that most new instruments that incorporate an embedded PC use at least a dual-core processor. By rewriting the signal-processing software with new algorithms that exploit the multicore architecture, performance jumps significantly, which helps meet the demands of new technologies. And by combining a PXI test chassis with a four- or eight-core server via a PCI Express (PCIe) cable, performance gains can be even more significant.
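The performance gain Starkloff describes comes from splitting signal-processing work across cores. As a minimal sketch (not NI’s actual implementation), a captured waveform can be divided into segments and handed to a pool of workers; a thread pool is shown here for simplicity, though CPU-bound measurement code would typically run in separate processes or native threads:

```python
from concurrent.futures import ThreadPoolExecutor

def rms(chunk):
    """Compute the RMS value of one waveform segment."""
    return (sum(s * s for s in chunk) / len(chunk)) ** 0.5

# Simulated capture: a flat 0.5 V signal split into four segments,
# one per worker (on a real instrument, roughly one per CPU core).
waveform = [0.5] * 4000
chunks = [waveform[i:i + 1000] for i in range(0, len(waveform), 1000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(rms, chunks))

print(results)  # each segment's RMS is 0.5
```

The key design point is that each segment is independent, so the work scales with the number of cores available.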

More software: Since the instrument is a PC, the software implements the test functions. Starting with a Windows 2000 or XP operating system, vendors add their own test software to provide basic instrument functions (Fig. 1). The increased use of DSP then makes possible all sorts of sophisticated signal processing.

FFTs for frequency-domain analysis, jitter analysis, and other advanced features greatly extend the usefulness of the instrument. Furthermore, with a PC base, the instrument can incorporate other software like LabVIEW, Mathcad, MATLAB, and Mathematica for advanced analysis of captured measurements. Special add-on software lets users customize the instrument to specific standards such as high-speed serial buses, video formats, or RF wireless standards.
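As a rough illustration of what FFT-based frequency-domain analysis does, the sketch below runs a naive DFT (real instruments use heavily optimized FFT libraries) over a sampled sine wave and picks out the dominant frequency bin; all names here are illustrative:

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive DFT returning the magnitude of each frequency bin."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# 64 samples of a sine wave at 5 cycles per capture record.
n = 64
signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]

mags = dft_magnitudes(signal)
# Search only the first half of the spectrum (the rest mirrors it).
peak_bin = max(range(n // 2), key=lambda k: mags[k])
print(peak_bin)  # the energy concentrates in bin 5
```

The same principle, scaled up to millions of points per second, is what turns a digitized waveform into the spectrum, jitter, and modulation views these instruments offer.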

Besides continuous improvements in the embedded instrument software, more add-in software products are making existing instruments even better. Mike Williams, president of Amherst Systems Associates, recently announced version 5 of the company’s M1 OT software, which runs on most Agilent, LeCroy, Tektronix, or Yokogawa scopes.

This software brings an entirely new patented artificial intelligence feature called the Hidden Anomaly Location (HAL), which finds signal-integrity problems faster than anything else available. The automated features of the package speed testing and troubleshooting, especially in the high-frequency designs that dominate new products.

Another form of software that’s significantly improving test is the iTest Team from the Fanfare Group. It’s an integrated test environment that’s designed to automate the test process, document the result, and provide reports. Using scripting languages to write test cases, iTest Team helps simplify the testing process of very complex products. It additionally provides a real boost in time-to-market while reducing test time.

Software forms the heart of most test instruments today. And that provides one big advantage: the ability to change, add to, and upgrade the features and performance of the hardware faster than ever to respond to the relentless technology changes.

Networking: Despite its age and low speed, GPIB is still number one in test networking. It lets instruments talk to one another and to a computer for automating tests, especially in a manufacturing environment. But today, Ethernet and USB are getting more play.

Most current instruments include an Ethernet port and one or more USB ports thanks to the PC-based architecture. Thus, an engineer can network the instruments into a system with USB on a benchtop, or connect the instrument to a LAN or the Internet, for data storage or sharing via Ethernet. Better still is the availability of LXI on standard or specialized instruments.

The PXI/LXI movement: More vendors are turning to modular PXI instruments of the virtual variety (Fig. 2). With a standard PXI chassis or rack built around an internal PCI or PCIe bus and anchored by an embedded PC controller, engineers can buy just the modules needed for specific applications.

It’s great for production testing as well as OEM and system integrators with very targeted needs. The PXI trend lowers test costs in many applications. PXI is clearly the fastest growing sector of the test industry, with recent growth rates of better than 20% annually compared to the overall test growth of 4% to 5% per year.

LXI (LAN eXtensions for Instrumentation) instruments are a whole new class of instrument that replaces the GPIB with Ethernet. Instruments become inherently networkable into test systems. All feature a built-in Web server, enabling the instrument to be controlled or monitored from a standard browser over the LAN or the Internet. Growth has been exceptional in the past few years as more vendors build LXI instruments or add LXI capability to traditional bench or PXI instruments.
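Conceptually, controlling an LXI-style instrument over the LAN amounts to exchanging SCPI text commands over a TCP socket. The sketch below uses a mock server in place of a real instrument (its address and identification string are made up); `*IDN?` is the standard SCPI identification query:

```python
import socket
import socketserver
import threading

class MockInstrument(socketserver.StreamRequestHandler):
    """Stands in for an LXI instrument's raw-socket SCPI port."""
    def handle(self):
        cmd = self.rfile.readline().strip()
        if cmd == b"*IDN?":  # standard SCPI identification query
            # Hypothetical vendor, model, serial, firmware fields.
            self.wfile.write(b"Acme,Model100,1234,1.0\n")

# Bind the mock instrument to an ephemeral local port.
server = socketserver.TCPServer(("127.0.0.1", 0), MockInstrument)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: what a test program anywhere on the LAN would do.
with socket.create_connection(server.server_address) as sock:
    sock.sendall(b"*IDN?\n")
    reply = sock.makefile().readline().strip()

server.shutdown()
print(reply)
```

A real setup would replace the mock server’s address with the instrument’s IP and port, but the command/response pattern is the same.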

The ATE crisis: Maybe crisis is too severe a word. Nonetheless, automatic-test-equipment (ATE) systems vendors are under pressure to build systems that meet the higher frequencies and serial-data speeds of modern products and deliver ways to test increasingly complex system-level chips and boards at a much higher rate. Increased integration at the chip level is the driving force behind these pressures.

Thus, ATE companies are turning to higher pin-density test heads with improved signal integrity and faster switching. In addition, new updated boundary-scan standards and products assist in the quest for speed. Finally, software in the form of test scripting and synthetic instrumentation is being developed to achieve a whole new generation of ATE systems.

ATE systems are no longer doing basic parametric tests but instead full-function tests on SoC ICs as well as boards and final products. The goal is to speed testing to the point where IC or board physical handling is the limiting factor. Speed is paramount, and software to automate and manage the whole process has become mandatory. Prognosticators say the future lies in protocol-aware ATE that tests signal integrity and emulates the outside world to provide fully functional testing of the protocols in use.

For more, see “Conflicting Needs Require Compromises In Test” at Drill Deeper, article ID 17904.
