As we enter 2002, T&M can be thought of as a confluence of many rivers—higher performance, diminution in the physical size of devices and products, ascendancy of wireless, and the growing presence of the PC in instrumentation. All of these tributaries converge in one enormous deluge, flooding the already swollen river known as T&M.
We had better not forget optical products and their steady growth. As far as we can see for now, they offer the ultimate in bandwidth for moving information, and the pressures are extreme. Mobile traffic is reported to have grown 77% in the past year alone. These issues all demand that T&M pedal a little faster.
The breadth of instrument types is growing, but the bread-and-butter stalwarts are still the familiar ones. Oscilloscopes, logic and protocol analyzers, and automatic test equipment (ATE) are in the vanguard.
We will observe a continuing push to bring real-time oscilloscopes up to the same bandwidths and sampling rates now available in sampling oscilloscopes. As for ATE systems, these veritable 800-pound gorillas still guard the exits from the manufacturing floors, making sure that only the good stuff—chips, boards, and fully assembled products—are packaged and shipped. Here too, developing trends will change their roles.
Wireless, packaging, and particularly the continual shrinking of products, boards, and devices all call upon T&M to keep pace or be left behind. So what can we expect to see emerging in the balance of this year?
Though marketing managers are tight-lipped when queried about how much this or that parameter of an oscilloscope, logic analyzer, and the like will improve this year, unmistakable clues indicate what's coming. A number of products will be announced that follow many of the trends identified below.
Numerous events are converging at the chip level. Following almost in lockstep with the relentless shrinking of geometries tooled in silicon, supply voltages are in the low single digits and will probably descend into the millivolt range. As for currents, the picoampere (10⁻¹² ampere) will be supplanted by the femtoampere (10⁻¹⁵), which may give way to the attoampere (10⁻¹⁸). Such infinitesimal currents, with their unfamiliar names, will soon roll off our tongues as easily as "microampere" does today. Fifty years ago, we never would have imagined confronting such incredibly small values. But Moore's law is remorseless and ubiquitous.
On the silicon itself, there will be increasing demands on chip designers to build in self-tests along the lines of IEEE 1149.4. This is the analog version of an earlier specification, IEEE 1149.1.
Packages are becoming smaller too. Surface-mount technology (SMT), flip chips, and the use of extraordinarily small discretes are all increasing. Together they're changing the way that manufacturing and test engineers view T&M, particularly regarding the continued use of in-circuit testing.
Due to the current recession, cost efficiency, rather than faster, broader, and deeper, is in demand. T&M must play a major role in driving costs out of manufacturing. In the past, the thrust was adding capacity and bandwidth; now, the drive is to lower network costs. Today the issues zero in on reduced cost per test, reduced setup time, and increased throughput, all of which are becoming more crucial than ever.

Logic And Protocol Analyzers Take Off

Both telecommunications and wireless applications are requiring logic and protocol analyzers to confront huge climbs in data rates and faster edge rates in silicon. Protocol analysis, in particular, is encountering immense burdens because of the accelerating pace at which protocols are emerging. Major changes also are under way in telecommunications and wireless standards in the U.S. and overseas. These issues are pressing suppliers to think more in terms of modularization, in both hardware and software.
Logic analyzers are generally better suited for hardware debug, with features such as 2- to 4-ns resolution, adjustable setup-and-hold times, and adjustable voltage thresholds. They provide a simple state listing and timing waveform display and incorporate very powerful triggering. On the other hand, protocol analyzers are better at software debug due to deeper buffers, and the ability to configure triggering and filtering at a high level without any knowledge of how the bus signals work. They usually offer more powerful statistical analysis capabilities and displays at the command, packet, and state levels. However, they don't provide a timing waveform display.
Some of the events that will have particular significance over the coming year are listed here.
The rise of packet systems will also expand the need for instruments equipped with protocol-testing capability. (Virtually everything up to this point has been circuit-switched.) This will lead to more protocol-analysis requirements on the wireless end because of the far greater number of protocols involved in packet data systems.

One Of Many Beginnings...

1938: Dave and Lucille Packard moved into a house at 367 Addison Ave., Palo Alto, Calif. Bill Hewlett rented the cottage behind the house, and the two former Stanford classmates began part-time work in the garage with $538. They developed a resistance-tuned audio oscillator (200A). Walt Disney ordered eight oscillators (200Bs) to assist in the production of the soundtrack for the movie Fantasia.
To stay in step with the continuing scalability of newly introduced products, both logic and protocol analyzers will exhibit more scalability. So a system of either type that tests Sonet, SDH, or Ethernet at a specific rate will also provide the potential to handle gigabit-Ethernet rates—and even higher. Along this line, protocol and logic analyzers will be produced with standard mainframe chassis architectures, designed to receive various kinds of test modules either available now or in the future.
Both protocol and logic analyzers will demonstrate more smarts, which will translate into greater reliance on PC Windows-type platforms. Along with this will come the ability to run various types of software applications to connect equipment in a network or in a production test environment or, for that matter, in virtually any kind of environment where an engineer or technician is obliged to perform measurements.
Both AT&T (www.att.com) and Cingular (www.cingular.com) plan to replace their TDMA wireless basestations with GSM. These companies will seek lightweight, low-cost protocol analyzers.
Logic and protocol analyzers will be covered at the upcoming Embedded Systems Conference, March 12-16, in San Francisco. More information can be found at www.esconline.com.
Already a feature in some instruments, such as the Bus Doctor from Data Transit (www.data-transit.com), histograms will be provided on other protocol analyzers about to be introduced.
Becoming more popular in both types of analyzers will be post-capture filtering, which reduces the amount of data that a viewer must look at and confines the display to only the data of interest, such as a particular address range on the PCI bus, a particular destination, or a source/ID combination on a high-speed serial bus.
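The idea behind post-capture filtering can be illustrated with a minimal sketch. The Python below is purely hypothetical: the capture buffer is modeled as a list of records, and the field names (addr, source, dest) are invented for illustration rather than drawn from any particular analyzer's API.

```python
# Hypothetical sketch of post-capture filtering: the analyzer has already
# filled its buffer; filters only narrow what the viewer sees, without
# re-arming the capture.

def in_addr_range(lo, hi):
    """Build a predicate passing records whose address lies in [lo, hi]."""
    return lambda rec: lo <= rec["addr"] <= hi

def from_source(src_id):
    """Build a predicate passing records from one source/ID."""
    return lambda rec: rec["source"] == src_id

def post_capture_filter(capture, *predicates):
    """Return only the captured records satisfying every predicate."""
    return [rec for rec in capture if all(p(rec) for p in predicates)]

# A toy capture buffer (addresses and IDs are made up).
capture = [
    {"addr": 0x1000, "source": 1, "dest": 7},
    {"addr": 0x2040, "source": 3, "dest": 7},
    {"addr": 0x20F8, "source": 1, "dest": 2},
    {"addr": 0x9000, "source": 2, "dest": 2},
]

# Confine the view to one address range on the bus...
window = post_capture_filter(capture, in_addr_range(0x2000, 0x2FFF))
# ...then narrow further to a single source ID.
narrow = post_capture_filter(capture, in_addr_range(0x2000, 0x2FFF), from_source(3))
```

Composing predicates this way mirrors how a viewer might stack filters interactively: each added condition shrinks the displayed subset, while the underlying capture stays intact.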
Protocol analysis is becoming more software driven and an adjunct to other test instruments. Expanding its scope, analysis packages will be aimed at RF, wired, and fiber-optic equipment. As the convergence of telecommunications with networks accelerates, traditional long-distance types of technologies and LAN Ethernet and/or Internet types of technologies will commingle. There will be much more migration of Internet protocols to the soon-to-burgeon, high-capacity networks, which will increase the burdens on protocol testing even more.
Currently, many businesses lack direct links for high-speed data. But as pipelines in the telecom arena via routers to telecommunications voice/IP and data/IP become more commonplace, metro networks will bring fiber links right to businesses. Data rates will soar, demanding more of protocol testing. Also expect to see a growing number of protocol analyzers and companion protocol analysis programs aimed at the last mile, for formats like those in General Packet Radio Service (GPRS).
We can expect a gradual shift from script engines to state machines, because the latter are far more flexible and can react to messages as they arrive.
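The distinction can be sketched in a few lines of Python. All state and message names below are hypothetical: where a script engine replays a fixed sequence of steps, a reactive state machine holds a current state and responds to whatever message arrives next, flagging anything the protocol does not allow.

```python
# Minimal sketch of a reactive protocol state machine (states and
# messages are invented for illustration). The machine looks up
# (current state, incoming message) and transitions, so out-of-order
# or unexpected traffic is handled explicitly rather than breaking
# a scripted sequence.

TRANSITIONS = {
    ("idle",        "connect_req"): "connecting",
    ("connecting",  "connect_ack"): "established",
    ("established", "data"):        "established",
    ("established", "disconnect"):  "idle",
}

class ProtocolMachine:
    def __init__(self):
        self.state = "idle"
        self.errors = []  # record protocol violations instead of halting

    def on_message(self, msg):
        """React to one incoming message."""
        nxt = TRANSITIONS.get((self.state, msg))
        if nxt is None:
            self.errors.append((self.state, msg))  # illegal in this state
        else:
            self.state = nxt

# A legal exchange returns the machine to "idle" with no errors logged.
m = ProtocolMachine()
for msg in ["connect_req", "connect_ack", "data", "data", "disconnect"]:
    m.on_message(msg)
```

A script engine would have to anticipate every ordering in advance; here, an unexpected message (say, "data" while idle) simply lands in the error log, which is what makes the state-machine approach attractive for protocol analysis.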