An expert viewpoint brought to Electronic Design by Agilent Technologies, Inc.
For a number of years we’ve been able to send basic commands to instruments from a computer and retrieve data. That’s fine for some classes of instruments such as data acquisition units, counters or voltmeters that do little internal processing. The situation is far different for high-end instruments such as logic analyzers, communications testers and high-end oscilloscopes, which come with sophisticated user interfaces and whose internal processor and software perform complex operations. An ersatz user interface on a PC simply sending commands can’t give you access to the complete set of controls for such an instrument. With these high-end units there’s been only one efficient operating mode: sit down in front of the box, set up the measurement, acquire data and analyze the results locally.
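To make the "basic commands" mode concrete, here is a minimal sketch of querying an instrument over a raw TCP socket. The address, port, and reply format are assumptions, not specifics from the article — port 5025 is a common convention for SCPI over LAN, but consult your instrument's manual.

```python
import socket

def scpi_query(host: str, port: int, command: str, timeout: float = 5.0) -> str:
    """Send one SCPI-style command over a raw TCP socket and return the
    instrument's one-line reply.

    The host/port values are hypothetical; many LAN-enabled instruments
    listen for raw SCPI on port 5025, but check your unit's documentation.
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Typical use (hypothetical address):
#   print(scpi_query("192.168.1.50", 5025, "*IDN?"))
```

This command-and-response pattern works well for simple acquisition hardware; the article's point is that it cannot expose the rich interactive interface of a high-end instrument.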
Today, though, with a concept known as hosted instruments, we’re watching the evolution of PC/instrument interaction go far beyond sending commands. Instruments are taking on the features of PCs, even running desktop operating systems. Thus it makes sense that the same instrument software could instead run on a desktop machine that’s connected to the instrumentation hardware over a high-speed link such as a LAN. That’s the essence of the hosted instrument: the acquisition hardware and the processing software have become completely decoupled.
Consider the latest logic analyzers, which themselves are full-fledged Windows XP machines built to survive rugged lab and field use. You can, of course, control them from the front panel in the classic sense. Now, though, you can install on a PC the exact same control and analysis software that runs on the analyzer’s motherboard and get identical measurement capability from anywhere. The user interface is the same as if you were sitting at the instrument, including multiple windows and powerful post-processing capabilities. Further, you can work in a familiar environment, with all your favorite utilities and applications immediately at hand.
Just think about the other benefits. First, full measurement access is available to anyone connected to the LAN, even if thousands of miles away from the hardware setup. That’s important given the move to geographically dispersed teams working on the same prototype housed at a central location. Load the software on another machine to share this power with other team members, at no extra cost.
Just as impressive, this architecture no longer limits the processing power you can apply to an instrumentation task. Sure, a high-end logic analyzer might be equipped with 1-GHz Pentium IIIs, but the processing power inside the instrument will never approach that available to most engineering teams. Manufacturers can’t simply toss the newest, most powerful PC motherboard into an instrument. They must take the time and expense to qualify every component to withstand the rigors of an industrial environment. They must also be able to service and repair motherboards in instruments whose service lives extend 10 or 20 years.
With a hosted instrument, though, a design team can use the PCs it already has to increase its processing power a thousand times over, all without any incremental investment. Desktop CPUs are pushing past 3 GHz, and multiprocessor machines are commonplace. Hosted instruments benefit from the PC performance curve even further thanks to gigabit LANs, inexpensive mass storage, sophisticated graphics and other peripherals and features.
Wait, you say, isn’t the whole idea behind modular instrument schemes such as PXI to put PC power in the same crate as the instrument? It’s a good idea, except these modular PCs lag desktop PC technology and pricing for the same reasons. For instance, I’m aware of one PXI CPU card with a 2.2-GHz Pentium 4 that by itself sells for nearly $4000. Meanwhile, I can get a complete 2.66-GHz P4 desktop system for $500. It’s plain silly not to exploit such power at such a price.
PC hardware continues to advance rapidly, and it’s time for software to do some catching up. Today, there’s logic-analysis software written so that each of several display windows is mapped as a thread on a separate CPU. Many other such tasks in sophisticated instrumentation could easily exploit multiprocessing. So while we’ve recently seen hosted instruments make some great strides, they’ll ride the PC performance curve to make things even better for design engineers in the upcoming years.
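The window-per-thread idea above can be sketched in a few lines. This is a hedged illustration, not the vendor's actual software: each "window" here is just a slice of captured samples reduced independently, and the function names are hypothetical. (In CPython, threads overlap I/O rather than CPU-bound work; a native implementation would schedule each worker on its own core, which is the behavior the article describes.)

```python
import threading

def render_window(samples):
    """Stand-in for one display window's post-processing pass:
    here, a simple min/max envelope of that window's samples."""
    return (min(samples), max(samples))

def render_all(windows):
    """Render every window concurrently, one worker thread per window,
    mirroring the window-to-thread mapping described in the article."""
    results = [None] * len(windows)

    def worker(i, samples):
        results[i] = render_window(samples)

    threads = [threading.Thread(target=worker, args=(i, w))
               for i, w in enumerate(windows)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Because each window's data is independent, the same mapping extends naturally to multiprocessor desktop hosts.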
Want to take a free look at some fully functional software for a hosted logic analyzer?