Demos Are In The Eye Of The Beholder

Aug. 17, 2011
A dispute between Agilent and National Instruments over the latter's comparative demo points to the need for educated consumers in the test market.

It’s not at all unusual for manufacturers of test equipment to launch products with reference to competitors’ offerings in terms of performance, specifications, and/or applicability to given test tasks. What is somewhat unusual, however, is for such references to elicit cries of “foul” from the competitor in question. What follows is something of a cautionary tale about test-equipment demos: how tricky the measurements can be to make, and how differently the results can be interpreted depending on who is taking them.

At the recent NIWeek conference in Austin, a major piece of news was National Instruments’ launch of its PXIe-5665, a 14-GHz vector signal analyzer in the PXIe form factor. Rather than tout the instrument on its own merits, NI chose to stage a head-to-head comparison of its new baby against Agilent Technologies’ N9030A PXA signal analyzer, a traditionally packaged bench instrument covering frequencies up to 50 GHz (see coverage here). The comparison was highlighted in a keynote presentation and on the show floor, and can be seen in a video on the NI website. The six-minute video runs through demos such as a basic harmonics measurement, adjacent-channel power ratio (ACPR) dynamic range and speed, and third-order intermodulation distortion.

Upon learning of the NI video, Agilent responded by insisting that the N9030A was “significantly misrepresented” in NI’s comparison. In an email and a subsequent phone interview, Agilent’s representatives enumerated the faults they found with NI’s use of the instrument. For example, Agilent believes that the ACPR dynamic-range comparison was invalid because NI did not use a notch filter; what NI measured, in Agilent’s view, was not the ACPR dynamic range of the PXA but rather the ACPR of the incoming RF signal.
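To see why the stimulus matters, here is a rough back-of-the-envelope sketch (mine, not either vendor’s, with purely illustrative numbers and a simplified power-sum model): adjacent-channel contributions combine roughly on a linear power basis, so the ratio an analyzer reports is floored by the worse of the analyzer’s own noise/distortion and the source signal’s own leakage. That floor set by the source is what a notch filter is meant to take out of the picture.

```python
import math

def db_to_lin(db: float) -> float:
    """Convert a power ratio in dB to a linear power ratio."""
    return 10 ** (db / 10)

def lin_to_db(lin: float) -> float:
    """Convert a linear power ratio to dB."""
    return 10 * math.log10(lin)

def measured_acpr(source_acpr_db: float, analyzer_acpr_db: float) -> float:
    """Simplified model: adjacent-channel power contributions add on a
    linear power basis, so the reported ACPR is limited by whichever is
    worse -- the stimulus's own leakage or the analyzer's contribution."""
    combined = db_to_lin(-source_acpr_db) + db_to_lin(-analyzer_acpr_db)
    return -lin_to_db(combined)

# Illustrative numbers only (not vendor specs): a source whose own ACPR is
# 70 dB masks an analyzer capable of 80 dB -- the reading lands near 69.6 dB,
# telling you more about the source than about the analyzer.
print(measured_acpr(source_acpr_db=70.0, analyzer_acpr_db=80.0))
```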

I could go on to detail the discrepancies Agilent found between what NI says the PXA’s performance is and what Agilent itself determines in nightly regression testing of a farm of PXAs, but that isn’t really the point I want to make. The point, I believe, is borne out by NI’s response to Agilent’s allegations.

One can look at two competitors in the same marketplace in various ways. In broad terms, both Agilent and NI are purveyors of test and measurement equipment for electronic engineers. But it doesn’t take a much closer look to see that their respective emphases are a bit different. Agilent tends more to target the traditional test bench market; NI, with its PXIe-based hardware, is clearly aiming more squarely at the automated-test market and production test applications.

In their reply to Agilent, NI’s representatives stress that in the demo they “did their best to reflect the performance of both instruments when used in a typical automated test application.” The whopping advantage NI claims over Agilent in measurement speed (in the ACPR case) is largely due to the way NI reckoned the overall measurement period. That period includes starting the acquisition of the RF signal, the RF signal acquisition itself, calculation of the measurement, and transfer of the result back to the PC. The PXA is a self-contained unit that doesn’t inherently require a host PC; that requirement comes in only in an automated-test use case, when engineers must record test data to a file or determine whether the DUT meets pass/fail criteria.
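For readers who want to see what “overall measurement period” can mean in practice, here is a minimal sketch of how an automated-test script might time those four phases. The `analyzer` object and its methods are hypothetical placeholders, not any vendor’s actual driver API; a real setup would go through the instrument’s own IVI or SCPI driver.

```python
import time

def timed_acpr_measurement(analyzer):
    """Time each phase counted in an 'overall measurement period'.
    `analyzer` and its methods are hypothetical placeholders standing in
    for whatever driver a real automated-test station would use."""
    t0 = time.perf_counter()
    analyzer.initiate_acquisition()        # start the RF signal acquisition
    analyzer.wait_for_acquisition()        # the actual signal capture
    t_acquire = time.perf_counter()

    result = analyzer.compute_acpr()       # calculation of the measurement
    t_compute = time.perf_counter()

    carrier_db, adjacent_db = result.fetch()  # transfer back to the host PC
    t_transfer = time.perf_counter()

    return {
        "acquire_s": t_acquire - t0,
        "compute_s": t_compute - t_acquire,
        "transfer_s": t_transfer - t_compute,
        "total_s": t_transfer - t0,
        "acpr_db": adjacent_db - carrier_db,
    }
```

Where the dividing lines fall, and whether that final transfer step even applies to a bench instrument used interactively, is exactly the kind of assumption that can make the same hardware look faster or slower in a comparison.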

All of that points to the fact that NI compared the instruments in a use case that favored its form factor, its architecture, and its overall approach to test. That doesn’t necessarily mean that the demo was entirely unfair, but it does mean that when one is the consumer of such a demo, one must take the use case into account in evaluating the results. Even NI’s people admit that “one can achieve better performance in terms of speed or RF performance with either the PXA or the NI PXIe-5665 using different settings.”

The moral of this story, if there is one, is that comparative demos of test equipment (or anything, really) must be viewed with a certain amount of skepticism. That skepticism should pair with awareness of your own test requirements and the circumstances in which you would use the equipment in question. Ask plenty of questions of the vendor personnel about how they’ve set up the measurements. Even in this case, with two well-established and well-respected vendors, “caveat emptor” remains in effect.
