
Test Equipment Seeks To Stay Ahead Of The Curve

Jan. 10, 2010
Compliance testing for emerging serial protocols, the ubiquity of wireless connectivity, and the ongoing quest for low power are all driving the test and measurement industry to continue coming up with innovations in 2010.

This year was one of speed and bandwidth gains for the test and measurement industry’s offerings. Next year promises more of the same, driven largely by the need for compliance testing for new and emerging protocols such as USB 3.0 (SuperSpeed USB) and Intel’s LightPeak optical link.

In addition, serial link speeds continue to rise, as do link bandwidths. Add to that the proliferation of digital RF technology, with wireless capabilities turning up in a steady stream of new applications, and you have the makings of a busy year for the T&M industry.

One of the key trends that sets the bar for test equipment is the ongoing, industry-wide shift from parallel to high-speed serial data communications. “If we survey computer and communications standards, we have different standards for peripherals, displays, enterprise storage, etc.,” says Randy White, serial technical marketing manager at Tektronix.

SERIAL STANDARDS’ IMPACT ON TEST

In the serial data channel, data rates are rising in a trend with no foreseeable end. Typically, when data rates rise, they double. With that doubling, the challenges presented by the channel grow disproportionately. “As you double the data rate, the channel distortion or attenuation may increase by a factor of four or even 10,” says White.
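
As a back-of-the-envelope illustration of that scaling, consider a toy loss model in which skin-effect loss grows with the square root of frequency and dielectric loss grows linearly with it. The coefficients below are invented for illustration, not measured from any real channel, which can be far lossier:

```python
import math

def channel_loss_db(f_ghz, k_skin=2.0, k_diel=1.0):
    """Toy channel loss in dB: a skin-effect term growing as sqrt(f)
    plus a dielectric term growing linearly with f. The coefficients
    are assumed, not measured."""
    return k_skin * math.sqrt(f_ghz) + k_diel * f_ghz

# For NRZ signaling, the fundamental (Nyquist) frequency is half the bit rate.
for rate_gbps in (2.5, 5.0, 10.0):
    f_nyq = rate_gbps / 2.0
    loss = channel_loss_db(f_nyq)
    print(f"{rate_gbps:4.1f} Gbit/s -> {loss:4.1f} dB "
          f"({10 ** (loss / 20):.1f}x amplitude attenuation)")
```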

Protocols are becoming more complex overall, says Dave Graef, chief technology officer at LeCroy Corp. “The speed of signals affects protocol analyzers when, for example, trying to connect into a 16-lane PCIe (PCI Express) system,” he says. Because the PCIe connection is already made on the motherboard, probing that connection is not a trivial task.

“There are mid-bus probes, but the amount of signal you get on your probe and the amount that actually gets into the analyzer make it very difficult to reliably connect into the system,” Graef says. As a result, the analog portion of protocol analyzers has grown more complex in recent years, a trend that shows no sign of abating.

USB compliance testing is an important area for test and measurement in 2010. There are already some 6 billion USB devices worldwide. And with USB 3.0 poised to enter the design mainstream, it’s a technology with huge growth potential (Fig. 1).  “In a recent USB plugfest we participated in, it became apparent early on that some developers aren’t as prepared for USB 3.0 as they could be,” says White.

USB 3.0 represents a tenfold increase over USB 2.0 from 480 Mbits/s to 5 Gbits/s over the same channel. If you think in terms of a 3-meter cable plus a 12-inch trace on the system board, this can translate into significant signal loss. It’s been common for PC makers to place USB connectors on the front of their chassis, but doing so lengthens the trace.

With USB 2.0, designers have typically had enough margin that USB links are truly “plug and play.” But with the tenfold increase in data rate that comes with USB 3.0, channel attenuation rises steeply. Thus, USB channel designs will require some form of compensation or equalization to lift the signal out of the noise, along with compliance testing to verify it.
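
To make “equalization” concrete, here is a minimal sketch of a symbol-spaced feed-forward equalizer (FFE) cancelling inter-symbol interference. The channel pulse response and tap weights are invented for illustration; a real receiver would adapt the taps (with an LMS loop, for example) rather than hard-code them:

```python
import numpy as np

# Symbol-spaced channel pulse response with inter-symbol interference.
# The precursor/postcursor values are invented for illustration.
channel = np.array([0.1, 1.0, 0.3])

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200) * 2 - 1            # random +/-1 symbols
received = np.convolve(bits, channel)[:len(bits)]

# 3-tap feed-forward equalizer; taps are hand-picked here, whereas a
# real receiver would adapt them or solve a zero-forcing fit.
ffe = np.array([-0.1, 1.0, -0.3])
equalized = np.convolve(received, ffe)[:len(bits)]

# Worst-case vertical eye opening, before and after (skip edge samples).
print("min |received|  :", np.abs(received[2:]).min())
print("min |equalized| :", np.abs(equalized[4:]).min())
```

With these made-up values, the worst-case eye opening improves from about 0.6 to roughly 0.84 of the main cursor after equalization.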

On the measurement side, designers will have to add measurements for equalization and de-embedding. This is old news for communications-system designers but new for computer makers, whose business models are based on keeping design costs low. They’d prefer to avoid standards-compliance testing, but they won’t be able to with the advent of USB 3.0. Consequently, channel compensation is a looming trend for 2010 and beyond.

An important implication for test equipment going forward is that the expertise needed to validate channel compliance must be embedded into the instruments themselves. “We have computer designers and system integrators with little or no experience in these advanced equalization techniques. They need to have this knowledge built in,” says White. So, it’s now not uncommon to find scopes with advanced equalization and compensation measurement packages built in.

Also, scopes’ intrinsic noise needs to be lower so that it has less impact on measured signals. Going forward, an instrument’s effective number of bits (ENoB) will be an increasingly critical specification. “Oscilloscopes are typically specified in terms of the 3-dB bandwidth point, but that number doesn’t tell you about frequency response over the range,” says White. “Then there’s noise floor. ENoB accounts for the noise, the bandwidth, and the frequency response of the scope.” Look for ENoB to become a common figure of merit for scopes.
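
For reference, ENoB is conventionally derived from a measured signal-to-noise-and-distortion ratio (SINAD) via the standard relation ENOB = (SINAD − 1.76 dB)/6.02. A quick sketch:

```python
def enob(sinad_db):
    """Effective number of bits from measured SINAD (dB), using the
    standard relation ENOB = (SINAD - 1.76) / 6.02."""
    return (sinad_db - 1.76) / 6.02

# A nominal 8-bit digitizer whose noise and flatness errors leave only
# 30 dB of SINAD at high frequency behaves like a ~4.7-bit converter.
for sinad in (50.0, 40.0, 30.0):
    print(f"SINAD {sinad:.0f} dB -> ENOB {enob(sinad):.1f} bits")
```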

Advanced serial protocols can make probing more challenging. Thus, the industry’s standards compliance committees have indicated greater acceptance of probe and fixture de-embedding. Such approaches account for the probe’s electrical parameters and back-calculate what the signal might look like at inaccessible points of a circuit. “We call this ‘virtual probing,’” says Graef.

Placing a test fixture into a circuit and attaching a cable to it has its own impact on that circuit. As a result, fixture de-embedding is also a growing trend in instrument functionality. “If you have a test fixture in the circuit, you have to do some fixture de-embedding just to know what the signal looks like when you’re not probing,” says Graef.
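
A minimal sketch of the de-embedding idea, assuming the fixture’s transfer function H(f) is already known (in practice it would be derived from measured S-parameters): divide the captured spectrum by H(f), with regularization so the inverse doesn’t amplify noise where the fixture response is weak. The single-pole fixture model and sample rate below are invented for illustration:

```python
import numpy as np

n, fs = 1024, 40e9                          # samples, 40-GS/s sample rate
t = np.arange(n) / fs
measured = np.sin(2 * np.pi * 1e9 * t)      # stand-in for a captured signal

freqs = np.fft.rfftfreq(n, d=1/fs)
# Assumed fixture model: single-pole low-pass rolling off at 6 GHz.
h_fixture = 1.0 / (1.0 + 1j * freqs / 6e9)

# De-embed by dividing out H(f), regularized so the inverse does not
# blow up noise at frequencies where the fixture response is weak.
eps = 1e-3
spectrum = np.fft.rfft(measured)
deembedded = np.fft.irfft(spectrum * np.conj(h_fixture)
                          / (np.abs(h_fixture) ** 2 + eps), n=n)
```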

WIRELESS EVERYWHERE

Wireless functionality is turning up everywhere, and this translates into a number of technology and market trends that will impact test and measurement equipment. Things like refrigerators, HVAC systems, and every corner of your car are packed with transceivers of various kinds. The inevitable result is spectrum congestion.

“The answer to this problem is devices with embedded brains and cognition that adapts to the environment,” says Darren McCarthy, wireless technical marketing manager at Tektronix. “We’re starting to see this as part of the equation.”

Already, ultra-wideband technologies incorporate the ability for networks to sense and adapt to potential sources of interference. This is playing out in ZigBee, where networks can sense and adapt to interference from Bluetooth devices and wireless local-area networks (LANs) that share the same industrial, scientific, and medical (ISM) band.

“One of the flaws here is you don’t know how to detect another receiver,” says McCarthy. “This can be difficult with mobile devices if there are nearby radar, satellite, or TV receivers. These items will pose an ongoing challenge to the cognitive radio initiative.”

The so-called “greening” of wireless and its increasing focus on energy efficiency also poses challenges to systems integrators and test providers. “Energy efficiency isn’t just an issue in commercial electronics but also in military and defense,” says McCarthy.

Yet another aspect of “wireless everywhere” is the embedding of wireless technologies. A simple cell phone today typically contains several radios. “Traditional multiband RF is part of this, and so is wireless LAN and Bluetooth,” says McCarthy. “We’re also seeing the incorporation of Ultra-Wideband technologies into mobile devices to replace USB for fast downloading of photos.” With digital portions of these devices so close to the multiple antennas, the challenges in system integration and validation intensify.

For system integrators attempting to test and validate these RF subsystems, the challenge is in how to trigger and isolate signals. “How do you mitigate hardware and software problems? In doing so, complexity in triggering and isolation is important,” says McCarthy. Hardware/software issues can manifest themselves in how system code executes. If it’s less than optimal, it can cause transmitter problems that result in spectrum splatter or excessive power drain.

“This challenge becomes tougher as devices get smaller,” says McCarthy. “You may have a ZigBee transceiver that works well. But when it’s implemented close to a processor, it causes problems.”

What this translates into is new measurement requirements, with a need to analyze more complex signals in less time. Test providers will continue to deliver equipment and methodologies for analysis of complex, time-varying RF signals that go along with adaptive radios and radar systems. Technologies such as frequency hopping over multiple bands of interest demand wideband instrumentation.

Another trend propelled by the ubiquity of wireless systems is field capability (Fig. 2). “Testing of radios out in the field drives demand for handheld instruments,” says Mark Pierpoint, vice president and general manager of Agilent’s Technology and Services Organization. “These instruments continue to be well accepted in the market, and we’ll see more improvements in functionality and capability with these handheld packages.” That, in turn, will support the rollout of new radio standards.

The prevalence of multichannel wireless and multiple-input/multiple-output (MIMO) transceiver architectures will also drive the development of wireless test (Fig. 3). “MIMO is found either within devices or basestations,” says Pierpoint. The trend is toward more collaborative MIMO technology. “The difference is rather than a single handset or basestation processing everything, it’s done in collaboration between multiple handsets for higher performance and bandwidth,” Pierpoint says. The next generation of Long-Term Evolution (LTE), known as LTE Advanced, is likely to take advantage of collaborative MIMO technologies.
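
The appeal of MIMO follows from Shannon capacity: with N transmit and N receive antennas in a rich scattering environment, capacity can grow roughly N-fold at a given SNR. A small sketch, using randomly drawn channel matrices as stand-ins for real fading measurements:

```python
import numpy as np

def mimo_capacity(h, snr_linear):
    """Shannon capacity (bits/s/Hz) with equal power per transmit
    antenna: C = log2 det(I + (SNR / Nt) * H @ H^H)."""
    nr, nt = h.shape
    m = np.eye(nr) + (snr_linear / nt) * (h @ h.conj().T)
    return float(np.real(np.log2(np.linalg.det(m))))

rng = np.random.default_rng(1)
snr = 10 ** (20 / 10)                       # 20-dB SNR
print("SISO:", mimo_capacity(rng.standard_normal((1, 1)), snr))
print("2x2 :", mimo_capacity(rng.standard_normal((2, 2)), snr))
```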

IN EMBEDDED, ENERGY MATTERS

In the broader embedded systems space, a major trend on the design side is energy efficiency. “We’re seeing integration of more switch-mode power supplies in embedded systems, which brings test issues,” says Gina Bonini, embedded technical marketing manager at Tektronix. The use of a switch-mode power supply inevitably introduces switching noise into the end product.

“We are also seeing supplies themselves designed with an eye toward where power is lost in the supply,” says Bonini. “Designers are looking at switching transistors and the associated magnetics and doing switching loss measurements. They’re also doing power measurements on the primary side of the supply as well as measurements of ripple on the output.”
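
A switching-loss measurement boils down to multiplying the captured voltage and current waveforms and integrating the instantaneous power across the transition. A minimal sketch, with synthetic linear ramps standing in for real scope captures (the device values are invented):

```python
import numpy as np

dt = 1e-9                                             # 1 ns per sample
t = np.arange(0, 100e-9, dt)
# Drain-source voltage ramps 0 -> 48 V while current falls 10 A -> 0
# across a 30-ns turn-off transition (synthetic waveforms).
v_ds = np.clip((t - 20e-9) / 30e-9, 0, 1) * 48.0
i_d = np.clip(1 - (t - 20e-9) / 30e-9, 0, 1) * 10.0

# Instantaneous power, then total energy lost in the transition.
p = v_ds * i_d
e_switch = np.trapz(p, dx=dt)
print(f"energy per transition: {e_switch * 1e6:.1f} uJ")
print(f"loss at 100-kHz switching: {e_switch * 100e3:.2f} W")
```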

Going forward, embedded designers will be doing more in-depth testing on the design of power supplies themselves as well as more testing of their functionality when integrated into the end system. “When you add the load, you want to know the impact of power quality both on the line and at the load itself,” says Bonini.

Another key trend is dynamic power management. Embedded system designers using off-the-shelf processors will spend more time looking at power usage. “They’ll want to learn how to better program those devices with programmable frequency and voltage scaling,” says Bonini.
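
The payoff of frequency and voltage scaling follows from the dynamic CMOS power relation P ≈ C·V²·f, where lowering the supply rail pays off quadratically. A quick illustration with assumed numbers:

```python
def dynamic_power(c_eff, v_dd, f_clk):
    """Dynamic CMOS power: P = C_eff * Vdd^2 * f. Values are assumed."""
    return c_eff * v_dd ** 2 * f_clk

p_full = dynamic_power(1e-9, 1.2, 500e6)    # 1-nF switched capacitance
p_low = dynamic_power(1e-9, 0.9, 250e6)     # halved clock, lowered rail
print(f"full: {p_full:.2f} W, scaled: {p_low:.2f} W "
      f"({p_low / p_full:.0%} of full)")
```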

According to LeCroy’s Dave Graef, today’s power-management ICs will drive new test methods and equipment. “Analysis of how these power-management systems work, how they interact with the system at large, and the order in which things power up all requires tools. Things like switching efficiencies and loop stability are important when you have a relatively small switcher on a board doing local regulation,” he says.

More embedded system designs will be integrating higher-speed buses. Expect to see more integration of USB 2.0 into embedded devices, both for system-to-system links and for chip-to-chip communications within a given device.

This means embedded designers who have been accustomed to working with slower buses such as I2C will find themselves having to manage USB integration. “That brings test issues,” Bonini says. Similarly, the integration of wireless protocols like ZigBee and Bluetooth is posing test challenges.

In general, embedded systems are rapidly becoming more complex. This applies not only to high-end handsets but also to low-cost children’s toys. “This means there’s a lot more engineers dealing with system-level test issues,” says Bonini. T&M vendors, then, will be packaging more tests into their instruments. All of the major oscilloscope vendors have taken this path, and we can expect more of it in 2010.

“Our mixed-signal scopes let you look at multiple parallel and serial buses to examine system interaction problems,” says Bonini. Today’s scopes allow you to look at serial buses, analyze their data flow, and troubleshoot problems. “You’ll see more of that from us,” Bonini says. To that end, Tektronix has built USB 2.0 trigger and decode capabilities into its lower-cost scopes.
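
As a toy illustration of what a serial decode does, the sketch below slices an oversampled NRZ capture at each bit center. A real scope decode also recovers the embedded clock and parses packet framing, which this deliberately omits:

```python
import numpy as np

def decode_nrz(samples, samples_per_bit, threshold=0.0):
    """Toy serial decode: slice an oversampled NRZ capture at each
    bit center and compare against a threshold."""
    centers = np.arange(samples_per_bit // 2, len(samples), samples_per_bit)
    return (samples[centers] > threshold).astype(int)

# 8 bits, 10 samples per bit, with a little additive noise.
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
wave = (np.repeat(bits * 2.0 - 1.0, 10)
        + 0.1 * np.random.default_rng(2).standard_normal(80))
print(decode_nrz(wave, 10))   # -> [1 0 1 1 0 0 1 0]
```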

THE PATH TO NEW INSTRUMENTS

Keeping up with these design trends is also challenging for T&M companies. It means designing ever-more capable front-end ICs for next-generation instruments. For example, Agilent Technologies recently announced the readiness of its next-generation front-end chips.

Fabricated on an indium-phosphide (InP) process, these chips will enable the company to deliver oscilloscopes in the first half of 2010 that offer true analog bandwidths of 16 GHz and greater. Such scopes would be squarely aimed at engineers working with high-speed serial data links such as USB, SATA, or PCIe. They would use such oscilloscopes to measure jitter and other parameters to ensure compliance to industry standards for interoperability.

In the next few years, as data rates extend beyond 8.5 Gbits/s, engineers will need oscilloscopes with true analog bandwidths greater than 16 GHz. In addition, the upcoming IEEE 802.3ba 40/100G Ethernet standard will drive the need for high-quality, real-time signal-analysis capabilities to 16 GHz and beyond.
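
Those bandwidth figures follow from the common rule of thumb that an NRZ capture should preserve at least the third, and preferably the fifth, harmonic of the signal’s fundamental, which is half the bit rate:

```python
def required_bw_ghz(bit_rate_gbps, harmonic=5):
    """Rule-of-thumb scope bandwidth for NRZ serial data: preserve up
    to the Nth harmonic of the fundamental (half the bit rate)."""
    return bit_rate_gbps / 2.0 * harmonic

for rate in (5.0, 8.5, 10.0):
    print(f"{rate:4.1f} Gbit/s -> 3rd: {required_bw_ghz(rate, 3):5.2f} GHz, "
          f"5th: {required_bw_ghz(rate, 5):5.2f} GHz")
```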

Often, standards determine the direction in which test and measurement will go. One recently announced initiative, the AdvancedTCA Extensions for Instrumentation and Test (AXIe), is an open standard based on AdvancedTCA (ATCA). Proposed by the trio of Aeroflex, Agilent Technologies, and Test Evolution Corp., AXIe aims to extend that standard into the realm of general-purpose and semiconductor test.

According to Larry DesJardin, general manager of Agilent’s Modular Product Operation, what the AXIe standard achieves even in its infancy makes it worth pursuing as an industry standard for the T&M community. “AXIe provides the highest performance per rack inch available,” says DesJardin. The standard boasts enormous scalability, easy integration with PXI-, LXI-, and IVI-based equipment, and greater modularity.

AXIe is based on the ATCA PICMG 3.0 specification, which offers a large board size in a form factor that supports both horizontal and vertical chassis configurations. This accounts for AXIe’s scalability, as systems can be scaled from one to 14 slots per chassis.

The structure of the layered AXIe architecture uses ATCA (PICMG 3.0 and 3.4) as the foundation with LAN and PCIe as the fundamental communication fabric. There are three connectivity zones in the ATCA backplane that underlies the physical architecture (Fig. 4). Zones 1 and 2 include all of the capabilities required for general-purpose or high-speed communications and are related to the AXIe 1.0 standard.

The AXIe 1.1 standard, defined specifically for semiconductor test, uses Zone 3 for connectivity. It is a superset of AXIe 1.0, adding triggering and time-synchronization structures through the use of DSTAR connections. Zone 3 also remains available for application-specific extensions in other areas and will be used in future variants of AXIe (AXIe 1.n).

It’s important to note that all general-purpose instruments that connect through Zones 1 and 2 will also be accepted in chassis that implement Zone 3. “There will always be certain vertical markets that need functionality that is inapplicable to general-purpose uses,” says DesJardin. “The compatibility of AXIe 1.0 modules with these variants allows us to address vertical markets in a structured way and to build a larger ecosystem.” In effect, it eliminates the need for custom backplanes and “future-proofs” the AXIe architecture.
