In the wireless market, like in many other markets, test can be seen as a necessary evil. It’s necessary for obvious reasons, but it’s evil because of the overhead that it implies. One of the biggest trends in the wireless test arena is the increasing urgency to reduce the cost of test, a trend that is made imperative by the state of the economy and the ferociously competitive nature of the consumer electronics market.
You can think of almost any manufacturing endeavor in terms of what is generically known as the “80-15-5 rule,” which holds that 80% of costs are in materials, 15% in labor, and 5% in test, which refers to amortized test assets, depreciation of capital equipment, and the time it takes to test a device.
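The split described by the rule can be sketched in a few lines. The 80/15/5 percentages come from the article; the $4 unit cost is purely an illustrative figure (echoing the Bluetooth chipset price mentioned later), not data from the source.

```python
# Illustrative sketch of the "80-15-5 rule": for a given unit cost,
# 80% goes to materials, 15% to labor, and 5% to test.

def cost_breakdown(unit_cost):
    """Split a manufacturing unit cost into materials, labor, and test."""
    return {
        "materials": 0.80 * unit_cost,
        "labor": 0.15 * unit_cost,
        "test": 0.05 * unit_cost,
    }

# Hypothetical example: a $4.00 chipset carries about $0.20 of test cost.
breakdown = cost_breakdown(4.00)
print(breakdown["test"])
```

Even at only 5% of unit cost, test is the slice vendors attack hardest, because (unlike materials and labor) much of it is amortized capital equipment and time per device.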
In all areas, the electronics industry continually looks to reduce its costs even as it embeds more technology and functionality within wireless devices. There are macro trends, such as functions moving onto systems-on-a-chip (SoCs), which help to reduce the materials cost. It’s often possible to reduce labor costs and to cut down manufacturing time.
But when it comes to reducing costs, test is a trickier subject. “Test costs will never go away,” says Darren McCarthy, RF and microwave technical marketing manager at Tektronix. In the manufacturing environment, the ideal is to drive test costs down to zero. In the design validation sphere, the goal is to design end products in a way that minimizes test requirements.
According to McCarthy, examining the price-demand curves for various wireless technologies shows that adoption doesn’t happen until prices fall below a certain threshold. “For Bluetooth, you didn’t move from millions to billions of users until it was under $4 per chipset. ZigBee is at that threshold now. We’re trying to get it down to $0.50 to break the consumer threshold and get it into wall sockets,” he says.
A price-demand curve exists for each wireless technology on the market, and each one is on a different point on its curve. “All technologies, licensed and unlicensed, are trying to move down their curves,” says McCarthy. “By this point, cell phones should cost less than $10 each. But when you factor in the complexities of smart-phone technologies, they still retain their high price point.”
Some six or seven years ago, it might have taken a cell-phone manufacturer about a minute to test a handset, and almost all of that time was taken up in establishing the link between the handset and the test rack. These days, over-the-air test has taken a lot of time out of the equation, but the “long pole in the tent” today is testing for power.
Wireless devices or systems that operate on licensed spectrum must be calibrated for the power limits of the band they operate on. While simple functional pin testing is done at some point in the process, power calibration will always be there, and today it represents one of the more time- and cost-consuming aspects of wireless test. Expect technologies to appear that will begin chipping away at this barrier to lowering test costs.
Design for Validation
Ideally, design teams would want to validate their complex system designs so thoroughly that there’s not as much need for test at the end of the process. That happens when test vendors provide SoC design teams with embedded tools that go beyond those conventionally available for testing individual components or even subsystems. In 2012, we’ll see more instruments that are equipped to test whole designs with runtime embedded tools.
Another trend that is affecting test is the move from RF down to the baseband domain, says Jan Whitacre, Long-Term Evolution (LTE) market program manager in the Electronic Measurement Group of Agilent Technologies. Why move to baseband for a focus of development?
“It’s because of speed,” says Whitacre. “You can do a lot more a lot faster in the digital world than in the RF domain.”
Further, baseband chipsets are much less costly than RF chipsets, so implementing more of the device’s features at baseband lowers the cost of the device. Additionally, analog-to-digital and digital-to-analog conversion blocks are being moved out of the digital domain and into the RF domain, closer to the power amplifier. With handsets and tablets having digital displays, wireless local-area network (LAN) functionality, Bluetooth, and multiple radio technologies, it’s preferable not to have multiple transmitter chains but rather to put it all within one chipset.
Thus, a key trend is to make the transmission block diagram one chain within the device. To that end, LTE topologies include multi-standard radios. Defined for the base station, these radios let it use a single block diagram to transmit various radio signals.
The implication here for test, says Whitacre, is that test-equipment vendors like Agilent must become better at integration. “We have to improve the ability to look at both baseband and RF, and it also means that you want to do a lot more design and simulation upfront before your chipset is built,” says Whitacre.
To that end, look for more improvements with regard to co-simulation and test of modeled pieces of hardware, software, and actual hardware. It will become easier than ever to co-simulate hardware prototypes of an amplifier, for example, with software representations of the rest of the system (Fig. 1).
Manufacturing test must become more efficient, making better use of test automation and modular systems. “One of the key things is that they’re not going to use the protocol,” explains Whitacre. “They can’t take the time to do full call processing.”
Even though much of the design work on mobile devices will be performed at baseband going forward, design teams must continue to work carefully in the RF sections. Watch for more developments from the test equipment vendors in the area of modular products for RF production test.
The integration phase for mobile devices now must include test for the MIPI interfaces within the chipsets, as well as for RF-to-digital conversion using DigRF v4.
“We currently spend a lot of energy on the digital-to-RF conversions,” says Whitacre. “The speed at which you can do your data throughput, the response of your devices, will be driven by the chipset’s ability to handle all that data.” Agilent and its competitors will continue to invest in equipment and test methodologies for emerging standards such as DigRF (Fig. 2).
DigRF does tend to make test more difficult, because points that designers once probed no longer exist in the design. It makes for much more “cross-domain” testing, as in digital inputs with RF outputs.
Watching Parallel Tracks
In handset design, there are two parallel tracks to watch in 2012. One is wideband CDMA (W-CDMA), while the other is LTE. “LTE gets all the press,” says Mike Barrick, business development manager for Anritsu’s wireless portfolio, “but W-CDMA will be developed where new networks can’t easily be afforded. You could say that’s true in the U.S.”
The currently deployed release of the 3GPP standard is Release 7, which provides for a 21-Mbit/s downlink rate. There is some deployment of Release 8 with downlink rates of up to 42 Mbits/s. Both of those releases sport 11-Mbit/s uplink rates.
Beyond these are the 3GPP Releases 9 and 10, which no one has deployed to date. Release 9 supports downlink rates to 84 Mbits/s, which is approximately the data rate that LTE would support in the common 10-MHz bandwidth. Release 10 supports downlinks up to 168 Mbits/s (Fig. 3).
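The release-by-release progression cited above follows a simple doubling pattern, which can be captured directly. All of the rates below are the figures quoted in the article; the variable names are just for illustration.

```python
# Peak downlink rates per 3GPP release, as cited in the article (Mbits/s).
PEAK_DOWNLINK_MBPS = {
    "Release 7": 21,    # currently deployed
    "Release 8": 42,    # dual-carrier, some deployment
    "Release 9": 84,    # roughly LTE's rate in a common 10-MHz bandwidth
    "Release 10": 168,  # not yet deployed
}

# Each release doubles the previous release's peak downlink rate.
rates = list(PEAK_DOWNLINK_MBPS.values())
assert all(later == 2 * earlier for earlier, later in zip(rates, rates[1:]))
```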
With both LTE and W-CDMA, two technologies are emerging that will bring faster data rates: carrier aggregation and multiple-input multiple-output (MIMO). In 3GPP Release 8, dual-carrier HSPA already is being used, albeit only in the downlink. In the future, both LTE and W-CDMA will use carrier-aggregation technology to achieve higher data rates. For LTE, the deployment of LTE-Advanced networks is expected in 2013, with a good deal of R&D taking place during 2012.
For test purposes, setting up calls with dual carriers requires two transceivers in the test setup (and to extrapolate, four for 4x4 and eight for 8x8). For carrier aggregation, you’ll need one downlink transceiver per carrier. This poses some challenges for test-equipment makers, says Anritsu’s Barrick. “From our point of view, these are costly. Multiple RF power amplifiers also heat up the equipment, which means larger enclosures and more cooling.”
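The transceiver counts Barrick describes scale multiplicatively: one per carrier, and one per antenna path for MIMO. The sketch below is a back-of-envelope model assuming that simple carriers-times-antennas scaling; the function name and model are assumptions for illustration, not from a test-equipment spec.

```python
# Hypothetical sketch: downlink transceivers needed in a test set, assuming
# one transceiver per aggregated carrier per MIMO antenna path.

def downlink_transceivers(carriers=1, mimo_order=1):
    """Transceivers needed for a given downlink test configuration."""
    return carriers * mimo_order

print(downlink_transceivers(carriers=2))               # dual-carrier: 2
print(downlink_transceivers(mimo_order=4))             # 4x4 MIMO: 4
print(downlink_transceivers(carriers=2, mimo_order=2)) # both combined: 4
```

The multiplicative growth is exactly why cost, heat, and enclosure size become problems: combining carrier aggregation with higher-order MIMO compounds the number of parallel RF sections.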
The bottom line is that reproducing the physical layer with higher-order MIMO and carrier aggregation is not trivial. If multiple parallel RF sections in the test equipment aren’t an attractive option, it’s possible that test-equipment vendors will take a wideband approach.
“You could have a 3-GHz wide transceiver, perhaps,” says Barrick, “but no test equipment currently takes that approach. As with multiple RF decks, there are cost and heat issues involved.”