The Hosted Instrument: Finally A No-Compromise Approach

June 21, 2004
An expert viewpoint brought to Electronic Design by Agilent Technologies, Inc.

For a number of years we’ve been able to send basic commands to instruments from a computer and retrieve data. That’s fine for some classes of instruments such as data acquisition units, counters or voltmeters that do little internal processing. The situation is far different for high-end instruments such as logic analyzers, communications testers and high-end oscilloscopes, which come with sophisticated user interfaces and whose internal processor and software perform complex operations. An ersatz user interface on a PC simply sending commands can’t give you access to the complete set of controls for such an instrument. With these high-end units there’s been only one efficient operating mode: sit down in front of the box, set up the measurement, acquire data and analyze the results locally.
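To make that command-and-response mode concrete, here is a minimal sketch of how such a query might look, assuming a SCPI-capable instrument reachable over a raw LAN socket (port 5025 is a common convention; the address and the DC-voltage query are illustrative, not tied to any particular product):

    import socket

    # Hypothetical LAN address of a SCPI instrument; port 5025 is a common
    # convention for raw SCPI-over-TCP, but check the instrument's documentation.
    INSTRUMENT_ADDR = ("192.168.1.50", 5025)

    def query(sock, command):
        """Send one SCPI command and return the instrument's text reply."""
        sock.sendall((command + "\n").encode("ascii"))
        return sock.recv(4096).decode("ascii").strip()

    with socket.create_connection(INSTRUMENT_ADDR, timeout=5) as sock:
        print(query(sock, "*IDN?"))          # ask the instrument to identify itself
        print(query(sock, "MEAS:VOLT:DC?"))  # e.g., one DC-voltage reading from a DMM

That works well enough for a voltmeter, but no string of commands like these can reproduce the windowed, interactive analysis environment of a high-end logic analyzer.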

Today, though, with a concept known as hosted instruments, we’re watching the evolution of PC/instrument interaction go far beyond sending commands. Instruments are taking on the features of PCs, even running desktop operating systems. Thus it makes sense that the same instrument software could instead run on a desktop machine that’s connected to the instrumentation hardware over a high-speed link such as a LAN. That’s the essence of the hosted instrument: the acquisition hardware and the processing software have become completely decoupled.

Consider the latest logic analyzers, which are themselves full-fledged Windows XP machines suited for rugged lab and field use. You can, of course, control them from the front panel in the classic sense. Now, though, you can install on a PC the exact same control and analysis software that runs on the analyzer’s motherboard and get identical measurement capability from anywhere. The user interface is the same as if you were sitting at the instrument, including multiple windows and powerful post-processing capabilities. Further, you can work in a familiar environment, with all your favorite utilities and applications immediately at hand.

Just think about the other benefits. First, full measurement access is available to anyone connected to the LAN, even thousands of miles from the hardware setup. That matters as design teams become geographically dispersed while the prototype they share sits at one central location. Load the software on another machine and you share this power with other team members at no extra cost.

Just as impressive, this architecture no longer limits the processing power you can apply to an instrumentation task. Sure, a high-end logic analyzer might be equipped with 1-GHz Pentium IIIs, but the processing power inside the instrument will never approach what’s available to most engineering teams. Manufacturers can’t simply toss the newest, most powerful PC motherboard into an instrument. They must take the time and expense to qualify components to withstand the rigors of an industrial environment, and they must be able to service and repair motherboards in instruments whose service lives can extend to 10 or 20 years.

With a hosted instrument, though, a design team can use the PCs it already has to multiply the processing power applied to a measurement many times over, without any incremental investment. Desktop CPUs are pushing past 3 GHz, and multiprocessor machines are commonplace. Hosted instruments benefit even further from the PC performance curve thanks to gigabit LANs, inexpensive mass storage, sophisticated graphics, and other peripherals and features.

Wait, you say, isn’t the whole idea behind modular instrument schemes such as PXI to put PC power in the same crate as the instrument? It’s a good idea, except these modular PCs lag desktop PC technology and pricing for the same reasons. For instance, I’m aware of one PXI CPU card with a 2.2-GHz Pentium 4 that by itself sells for nearly $4,000. Meanwhile, I can buy a complete 2.66-GHz P4 desktop system for $500. It’s plain silly not to exploit such power at such a price.

PC hardware continues to advance rapidly, and it’s time for software to do some catching up. Today, there’s logic-analysis software written so that each of several display windows runs as its own thread, which the operating system can schedule on a separate CPU. Many other tasks in sophisticated instrumentation could just as easily exploit multiprocessing. So while we’ve recently seen hosted instruments make some great strides, they’ll ride the PC performance curve to make things even better for design engineers in the coming years.
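To illustrate the pattern (a hypothetical sketch, not any vendor’s implementation), the fragment below hands each display window’s post-processing to its own worker so the jobs spread across CPUs in parallel; a process pool is used here because, in Python, that is what actually puts CPU-bound work on separate cores:

    from concurrent.futures import ProcessPoolExecutor

    def post_process(window_name, samples):
        """Stand-in for per-window analysis, e.g., filtering or decoding a trace."""
        # Trivial placeholder computation; real analysis would be far heavier.
        return window_name, sum(samples) / len(samples)

    if __name__ == "__main__":
        # Hypothetical per-window capture data; in a hosted instrument this would
        # arrive from the acquisition hardware over the LAN link.
        windows = {
            "waveform": list(range(1_000_000)),
            "listing": list(range(500_000)),
            "protocol": list(range(250_000)),
        }

        # One worker per window, so each window's processing can land on its own CPU.
        with ProcessPoolExecutor(max_workers=len(windows)) as pool:
            futures = [pool.submit(post_process, name, data) for name, data in windows.items()]
            for fut in futures:
                name, result = fut.result()
                print(f"{name}: {result:.1f}")

The same structure extends naturally to other instrument tasks: protocol decode, searching a deep trace, or rendering several views of one acquisition can all run concurrently on a multiprocessor host.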

Want to take a free look at some fully functional software for a hosted logic analyzer?

Go to www.agilent.com/find/curveol-june04
