Synthetic instruments make inroads with RF applications

Schaumburg, IL. A Tuesday September 17 Autotestcon panel moderated by Dr. Michael Granieri of National Instruments was titled “Synthetic/Software Defined Instrumentation—Are We There Yet?” Panel members included Wade Lowdermilk (RADX), Dr. Dave Carey (Wilkes University), Sean Thompson (NI), and Jeff Murrill (Northrop Grumman).

After a brief introduction by Dr. Granieri, the panel members presented their views of the current state of SI—in effect answering the question, “Are we there yet?” The results were a couple of qualified yeses, a no, and a maybe. In general, there was agreement that SI had solved RF problems pretty well, but had not been as successful in other areas.

Carey referred to his experience with the Tobyhanna depot, which owns more than 15,000 separate pieces of electrical/electronic test equipment. Of these, about 5,000 are power supplies, so SI really isn’t applicable for those. However, for RF analog stimulus and measurement applications, SI can replace a large number of individual instruments. Nevertheless, in the broader sense of the question, SI was not yet there according to Carey.

Murrill thought that SI was a good RF solution, but hadn’t done well elsewhere, so he responded with “maybe.”

Lowdermilk discussed the five adopter categories in a product's technology adoption life cycle: innovators, early adopters, early majority, late majority, and laggards. He said that many products reached the early adopter stage, where sales growth appeared to be robust, but failed before reaching the early majority phase. In his view, SI for RF applications had reached the early majority stage, so on that basis he answered yes.

Thompson referred to the large gains in technology, such as FPGA performance, that generally supported the aims of SI. So, although the question seemed to be a moving target as both the technology and the EUTs were becoming more complex, at least for RF applications, SI was there.

After the opening position statements, Granieri posed a series of prepared questions that the panel discussed. In some cases, members of the audience also contributed their views.

Although SI most often is associated with MIL-Aero applications, technically it should be capable of addressing commercial applications as well. Murrill pointed out that MIL-Aero test systems had to handle the most diverse set of requirements, so this is where SI delivers the greatest benefit.

One group of questions led to a discussion of the generality of an SI solution vs. a particular implementation. Citing the classical example of a spectrum analyzer, panelists noted that if an SI system is designed merely to replicate that one instrument, the more general capabilities of the underlying collection of modules aren't realized: the need to replicate a legacy instrument becomes a design constraint rather than just one instance of a more general design. The point of the discussion was that while SI certainly can replace a conventional legacy instrument, it's the capability of SI also to address future requirements that sets it apart. Reconfigurability is key to realizing SI's full value.
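To make the spectrum-analyzer example concrete, the minimal sketch below shows the basic SI idea of realizing an instrument "personality" in software on top of a generic digitizer. The capture() function and the simulated 1-MHz tone are hypothetical placeholders, not any vendor's API; the same hardware could be retargeted to a different measurement simply by swapping the processing code.

```python
# Minimal sketch (hypothetical; not any vendor's API): a "spectrum analyzer"
# personality realized in software from a generic digitizer capture.
import numpy as np

def capture(num_samples: int, sample_rate_hz: float) -> np.ndarray:
    """Stand-in for a generic digitizer driver call; here we simulate
    a 1-MHz tone plus noise so the example is self-contained."""
    t = np.arange(num_samples) / sample_rate_hz
    return np.sin(2 * np.pi * 1e6 * t) + 0.01 * np.random.randn(num_samples)

def power_spectrum_db(samples: np.ndarray, sample_rate_hz: float):
    """The 'synthetic' part: windowing plus an FFT in software turns raw
    samples into a spectrum-analyzer-style measurement."""
    window = np.hanning(len(samples))
    spectrum = np.fft.rfft(samples * window)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    return freqs, power_db

fs = 10e6  # 10-MS/s digitizer rate
freqs, power_db = power_spectrum_db(capture(4096, fs), fs)
print(f"Peak at {freqs[np.argmax(power_db)] / 1e6:.2f} MHz")
```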

This topic ranged from reconfigurability to the software architecture necessary to support it. Murrill gave the example of ARGCS, which implemented tests from major MIL ATS including CASS and TETS, to show that SI had the capability to address legacy test issues. Chris Gorringe (Cassidian Test Engineering Services, UK) added that regardless of the specific instrumentation SI was to replace, the underlying software architecture should be general enough to support reconfigurability.

There followed some discussion of the typical DoD application where SI best fits. Generally, this is obsolescence mitigation and multiple-instrument replacement leading to cost savings. Murrill suggested that SI was more expensive because there were more modules to replace over the life of a test system. Lowdermilk countered that because each module type has a distinct life cycle, only one module type would need to be replaced at any given time, so costs actually would be reduced.

What would it take to move SI solutions into the mainstream more quickly? Actually, according to Lowdermilk and others, this already is happening. Traditionally, proponents of bench instruments could claim that SI, and modular instrumentation in general, was a good solution only for fairly mundane problems and that higher performance required a bench instrument. This simply is not true today given the combination of advanced FPGAs, DSPs, CPUs, and software algorithms.

The question was raised about calibration and how it figured into the overall SI paradigm. Who was responsible for the calibration of the overall SI system when it was possible to replace any module or modules with similar parts from other manufacturers? NI's VST was given as an example of an instrument that partitions calibration constants from other, user-configurable aspects. However, as the software architecture issue and SI standardization in general progress, calibration must be an integral part.
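As an illustration of that partitioning idea, the sketch below keeps factory calibration data separate from user-configurable measurement settings, so swapping a module replaces only the calibration record that travels with the hardware. The class and field names are hypothetical and are not drawn from NI's VST software.

```python
# Minimal sketch (hypothetical; not NI's VST software) of separating factory
# calibration data from user-configurable measurement settings.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: cal data travels with the module and is not user-editable
class CalRecord:
    serial_number: str
    gain_correction_db: float     # from factory or metrology-lab calibration
    frequency_offset_hz: float

@dataclass
class UserConfig:                 # freely reconfigurable per test program
    center_freq_hz: float
    reference_level_dbm: float

def corrected_level(raw_dbm: float, cal: CalRecord) -> float:
    """Apply the module's own calibration, independent of user settings."""
    return raw_dbm + cal.gain_correction_db

cal = CalRecord("SN1234", gain_correction_db=0.35, frequency_offset_hz=120.0)
cfg = UserConfig(center_freq_hz=2.4e9, reference_level_dbm=0.0)
print(corrected_level(-10.2, cal))
```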
