Electronic Design

Instrumentation 2.0: How Software-Defined Instrumentation Is Changing T&M

Coined by National Instruments, Instrumentation 2.0 refers to a software-based approach to instrumentation that lets users define their own measurements and derive the results they need from raw measurement data. Instrumentation 2.0 is virtual instrumentation (VI), where the instrument is a PC running software that defines the measurement capability. It's flexible, and in many ways it's a better fit for the world of electronics test today than previous T&M methods.

Instrumentation 1.0 is what we've been doing all along. It defines T&M equipment that's in a fixed hardware configuration with vendor-defined measurements and a fixed user interface. And I don't mean that in a negative way. The fixed-instrument method has been successful for many decades.

But times are changing. Optional PC connectivity and embedded PCs with software measurement suites have helped Instrumentation 1.0 cope with the changes. It's possible we're at the tipping point where virtual instrumentation is about to become the dominant method of T&M.

VI is a fabulous idea. It is now widely implemented, but it has been restricted primarily to low-frequency and lower-data-rate applications for three reasons: sampling rate limitations, bus architectures, and processing speed (see the figure). Now these limitations are slowly abating.

For example, take sampling rate. Analog-to-digital converter (ADC) speeds have increased significantly from a few megasamples per second at best to well over 1 Gsample/s today. Also, resolutions now extend well beyond the typical 8 bits of the past. Even with reasonably priced commercial off-the-shelf (COTS) ADCs, it's possible to digitize signals well into the low microwave region, with a resolution that permits accurate measurements. That really opens the door to VI across the board.
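To see why resolution matters for measurement accuracy, consider the standard quantization-noise formula for an ideal N-bit ADC, SNR ≈ 6.02N + 1.76 dB. The sketch below is a minimal illustration of that relationship; real converters fall short of these ideal figures because of clock jitter, thermal noise, and nonlinearity.

```python
# Sketch: ideal ADC dynamic range vs. resolution, using the
# quantization-noise formula SNR ~= 6.02*N + 1.76 dB. Real converters
# fall short of this ideal (jitter, thermal noise, nonlinearity).

def ideal_adc_snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit ADC for a full-scale sine wave."""
    return 6.02 * bits + 1.76

for bits in (8, 12, 14, 16):
    print(f"{bits:2d} bits -> {ideal_adc_snr_db(bits):.1f} dB ideal SNR")
```

Moving from the typical 8 bits of the past toward 12, 14, or 16 bits buys roughly 6 dB of ideal dynamic range per bit, which is what permits the more accurate measurements noted above.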

But with fast ADCs, you have to process all that data to make your measurement. Processors have kept pace with faster clocks and new architectures, but they offer even more potential now that affordable dual-core (and even quad-core) processors are commonplace in PCs, laptops, and VI boxes.

All that's needed is a better way to program these chips to take advantage of the processing power to manipulate the signals and produce the measurements. And don't forget the impact of computer bus technology. Thanks to PCI Express and other fast serial buses, data can be moved fast enough to make VI very practical.
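A rough back-of-envelope check shows why fast serial buses make VI practical. The figures below are assumptions drawn from first-generation PCI Express, which delivers roughly 250 Mbytes/s of usable bandwidth per lane per direction; exact throughput depends on encoding and protocol overhead.

```python
# Back-of-envelope: can a PCI Express link keep up with a fast ADC?
# Assumes first-generation PCIe at roughly 250 MB/s usable per lane
# per direction (an approximation; real throughput varies with overhead).

PCIE_GEN1_MBPS_PER_LANE = 250  # ~Mbytes/s per lane, one direction

def link_bandwidth_mbps(lanes: int) -> int:
    """Approximate one-direction bandwidth of a Gen-1 PCIe link."""
    return lanes * PCIE_GEN1_MBPS_PER_LANE

def adc_data_rate_mbps(samples_per_sec: float, bits: int) -> float:
    """Raw data rate an ADC produces, in Mbytes/s."""
    return samples_per_sec * bits / 8 / 1e6

# A 1-Gsample/s, 8-bit digitizer streams about 1000 MB/s of raw data,
# which an x4 Gen-1 link (~1000 MB/s) can just about sustain.
rate = adc_data_rate_mbps(1e9, 8)
print(f"ADC: {rate:.0f} MB/s, x4 link: {link_bandwidth_mbps(4)} MB/s")
```

The point of the arithmetic: a digitizer that would have swamped GPIB or even PCI can stream continuously over a modest PCI Express link, so the raw data actually reaches the software.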

What makes Instrumentation 2.0 so desirable is that the user can define the measurements that need to be made. The availability of the raw measurement data streaming from the ADC in the instrument to the software lets engineers create measurements and outcomes specific to their needs. The VI hardware is usually modular and can be configured to the exact application. With the appropriate software, the user can create just about any type of test or measurement situation and user interface to fit the problem.
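The idea of a user-defined measurement can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the raw sample stream is simulated here, whereas a real VI system would pull it from the digitizer's driver, but the measurement itself is just ordinary software the user writes.

```python
# Sketch: a "user-defined measurement" in software, assuming access to
# the raw sample stream from a digitizer. The raw data is simulated
# here; in a real VI system it would stream from the ADC driver.
import math

def rms(samples):
    """User-defined measurement: RMS value of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Simulated raw data: a 1-V-amplitude sine wave, 1000 samples per cycle.
raw = [math.sin(2 * math.pi * n / 1000) for n in range(1000)]
print(f"RMS = {rms(raw):.4f} V")  # ~0.7071 V for a unit sine
```

Swapping in a different function — peak detection, THD, a custom pass/fail limit — changes the instrument without touching the hardware, which is the essence of the flexibility described above.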

One of the main reasons why flexibility and programmability are important is the "Long Tail" effect. Put forth by Chris Anderson in Wired magazine in 2004, this concept holds that most products in the past were broad in application. Today, more and more products and applications don't target general purposes. Instead, they suit niches.

This is the long tail, that huge body of lower-volume niche products and applications that force electronic product developers and manufacturers to constantly change and update their T&M capability. This will be easier with Instrumentation 2.0 because modular and programmable instrumentation will address a greater variety of needs and niches.

Other factors at work include the software-based instrumentation mandated by the Department of Defense. Known as the synthetic instrumentation (SI) movement, it seeks to reduce the logistics footprint of military electronics T&M gear while reducing costs and solving the perpetual obsolescence problem. As military radars, radios, and other gear are constantly upgraded, the T&M equipment can keep pace with minimal extra cost and time delay.

While SI is still being defined and developed, one definition dubs it a subset of VI, since SI relies on more fixed hardware and software modules while VI overall is more modular and fully programmable. But SI is definitely Instrumentation 2.0.

The older-style fixed instruments of Instrumentation 1.0 will never go away. They will continue to be updated and enhanced. It appears likely that Instrumentation 2.0 in its VI and SI forms will become more widespread, though, as the technologies improve and the variety of new niche products increases. Eric Starkloff, director of product marketing at National Instruments, says that sales and usage of Instrumentation 2.0 may account for 50% of the overall T&M market. We're already living the dream.

National Instruments Inc.
