Electronic Design

Due To Technical Difficulties, We Cannot Bring You The World Today

With reliability more important than ever, designers need top-notch test tools to build bullet-proof communications systems.

Technical glitches. They're a part of life. Some are just an annoyance, while others are very costly—can you say Y2K? Worse yet, they promise to be much costlier in the future. No doubt you've noticed that the world economy is depending more and more on communications. As a result, design engineers are under increased pressure to build ultra-reliable devices. In other words, a communications device must work out of the box and continue to work flawlessly until it is replaced. To produce such products, designers need great test tools.

Testing prototypes is no easy task. Clock speeds are skyrocketing, bandwidths are ballooning, IC geometries are shrinking, and connections are disappearing. To date, designers appear to have the tools necessary to test and debug their prototypes. But how about three to five years down the line? Will designers still be using general test equipment, such as DSOs, logic analyzers, and meters, to test their communications prototypes? In my opinion, yes. But I also expect specialized test equipment to become more prevalent on the design bench.

General test equipment saw some interesting developments last year. For the first time, a DSO—Agilent's Infinium—was endowed with voice-recognition capability. Should all DSOs offer this feature in the future, at least as an option? I think so, because probing a board is difficult. Designers need test instruments that make probing easier, not harder, and voice recognition is one way to do this.

DSOs Get Specific
DSOs have long been seen as general-purpose instruments. But in the last few years, they've targeted specific industries, such as telecommunications, data communications, and computing. Both design and test engineers often have to make very specific measurements because of interoperability and compliance requirements. We've reported on this trend in recent years, and it will certainly continue in the future. We may actually see instruments that are specifically labeled "communications scopes," "computing scopes," or other types of scopes, rather than "general DSOs."

We're also seeing the integration of web servers into test instruments. Such a feature lets designers control instruments like logic analyzers from remote locations. Just as we all log on to the Internet today to get stock prices and sports scores, in the future, we'll be able to check waveforms on our test gear from anywhere at any time.
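As a rough illustration of what a web-enabled instrument makes possible, the sketch below pulls an acquisition from an instrument's built-in web server over plain HTTP and parses it into samples. The URL and CSV format are hypothetical; a real analyzer's documentation defines its own addresses and data formats.

```python
import csv
import io
from urllib.request import urlopen

# Hypothetical address: real web-enabled instruments publish their own
# URLs and data formats -- check the analyzer's documentation.
INSTRUMENT_URL = "http://instrument.local/acquisition/waveform.csv"

def fetch_waveform_csv(url=INSTRUMENT_URL, timeout=5):
    """Pull the latest acquisition from a web-enabled instrument as CSV text."""
    with urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("ascii")

def parse_waveform(csv_text):
    """Turn 'time,volts' CSV rows into a list of (float, float) samples."""
    reader = csv.reader(io.StringIO(csv_text))
    return [(float(t), float(v)) for t, v in reader]
```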

Generally, though, last year's big breakthroughs in general test equipment came in acquisition speed and memory—faster and deeper, as they say. These kinds of improvements are necessary in order to keep pace with the "bleeding edge" of technology, and they will continue well on into the future.

As Mike Lauterbach, director of product management at LeCroy, explains it, testing and troubleshooting a board requires a long acquisition memory. And then to compute the answers—the application-specific answers—a good processing engine with lots of RAM is necessary.

"In the best of all possible worlds," says LeCroy's Lauterbach, "you can buy yourself a really nice general-purpose oscilloscope and have this customized test package that does what you want when you have to make specific measurements."

Lauterbach also thinks that clock inputs, which LeCroy now has on its scopes, will be a key feature for communications work in the future. "The ability to supply an external clock into the communications testing device is a key element," he notes. "It's something that the general-purpose market does not ask for, but the communications market absolutely stresses it."

Last year, we also saw DSOs decrease in physical size in one case (the Tektronix TDS 3054), and increase in display size in another (the LeCroy Waverunner). If these trends continue, we soon may see 1-GHz or higher palm DSOs, along with scopes featuring display sizes as large as some modern notebooks. It's all about test equipment leveraging advances in PC technology.

This will continue to occur, not only in hardware but in software as well. We've already seen examples of test equipment employing the Windows operating system, and more are on the way. A Windows OS brings a familiar look and feel to test equipment, while enabling test-equipment manufacturers to easily add features like connectivity and printer support.

Giving Test A Voice
How about an innovation we didn't see in benchtop equipment this year? Why are our test instruments so silent? Our PCs certainly aren't. According to focus groups run by Agilent Technologies, one of the problems designers have with high-end equipment like logic analyzers is the re-learning curve. They learn how to use the analyzer, do some work with it, and then put it aside for a couple of months. The next time they try to use it, they realize they've forgotten almost everything they previously knew and have to return to the manuals or go to the help system.

Why not endow the equipment with the ability to tell the user how to perform certain operations? For example, the designer might ask how to set a specific trigger, and the logic analyzer would answer. It would be even better if the logic analyzer could "remember" what was done the last time and replay the sequence—with a voice annotation, of course.

Before moving on to specialized test equipment, let's look at PC-based instruments. PC-based DSOs and logic analyzers have been increasing in power due to improvements in both the board-level products and in the host—the PC. The board-level products have steadily improved in speed and resolution over the past couple of years. Digitizers, for instance, have jumped from about 15 MHz two years ago to 100 MHz this year. They also increased in resolution from 8 to 21 bits. If this continues, we soon may see 500-MHz digitizers at 32-bit resolution. That's quite a powerful digitizer.

With PCs now running at 700 MHz and higher, and faster ones being developed all the time, there seems to be no limit to the power afforded to PC-based instruments. If you add gains in hardware, along with improved software due to faster, more efficient algorithms and improvements in operating-system and applications code, it's evident that PC-based instruments have a very bright future.

Testing communications boards and systems demands more than general types of test equipment. Specialized test gear can emulate data streams, test for standards compliance, and perform other functions. In the future, this gear will be used more and more in three key communications areas: mobile systems, digital TV (DTV), and high-speed networking.

Mobile phones have been around for more than a decade now. But this communications area has grown explosively in the past five years, and it will continue to grow steadily in the future. Despite this, communications companies around the world haven't yet settled on the single best system for mobile communications. The U.S. has a couple of different systems, Europe has its own, and Japan has still another. The big push right now and for the foreseeable future, then, is to meld all the different second-generation (2G) technologies into a single third-generation (3G) system (see "Mobile Phones: The Next Generation," below).

Whether or not one system eventually prevails, the contending 3G systems share a common transmission characteristic—they create spread-spectrum signals that feature an intermittent, bursty nature. This presents an interesting challenge for test instruments, especially spectrum analyzers, which are used for a host of wireless measurements.

One way to attack this problem is with a class of instruments called real-time spectrum analyzers (RTSAs), which differ from traditional swept-spectrum analyzers. Rather than acquiring one frequency step at a time, real-time analyzers capture a block of frequencies all at once during a user-specified time frame—20 ms, for example. These frames repeat continuously, with a full acquisition every frame. Through this method, the real-time spectrum analyzer instantly detects events like W-CDMA bursts.
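The frame-by-frame acquisition described above can be sketched in a few lines of Python using NumPy. The sample rate, tone frequency, and frame time here are illustrative, not taken from any particular instrument:

```python
import numpy as np

def rtsa_frames(samples, fs, frame_time):
    """Split a sample stream into back-to-back frames and FFT each one,
    mimicking how a real-time spectrum analyzer acquires a full spectrum
    every frame instead of sweeping one frequency step at a time."""
    frame_len = int(fs * frame_time)
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    window = np.hanning(frame_len)
    spectra = np.abs(np.fft.rfft(frames * window, axis=1))
    freqs = np.fft.rfftfreq(frame_len, 1 / fs)
    return freqs, spectra   # spectra[k] is the spectrum of frame k

# A bursty test signal: a tone that is "on" only during the second frame.
fs = 1_000_000                      # 1 MS/s (illustrative)
t = np.arange(int(0.06 * fs)) / fs  # 60 ms of samples -> three 20-ms frames
burst = np.sin(2 * np.pi * 100_000 * t) * ((t >= 0.02) & (t < 0.04))
freqs, spectra = rtsa_frames(burst, fs, frame_time=0.02)
peak_per_frame = spectra.max(axis=1)
# The burst shows up only in frame 1 -- a swept analyzer could sweep
# right past it and never see it.
```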

Real-time acquisition isn't the only feature a modern spectrum analyzer needs, however. Second-generation CDMA (code-division multiple access) has a chip, or symbol, rate of 1.2288 Mchips/s, with a 1.25-MHz channel width. Third-generation W-CDMA (wideband CDMA), on the other hand, will have rates as high as 16.384 Mchips/s and a 20-MHz channel width.

Spectrum Analyzer Revelations
The real-time spectrum analyzer's spectrogram and waterfall displays reveal the signal's behavior at a glance and alert wireless designers to events that require closer examination (Fig. 1). Also, frequency-mask triggers can find the causes and effects of aberrant events. This type of function triggers on a user-selected event and captures a pre- and post-trigger record. Applications include detecting CDMA partial-rate randomized bursts and oscillator phase jumps.
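A frequency-mask trigger of the sort described above can be modeled simply: compare each frame's spectrum against a user-drawn limit line and, on the first violation, keep frames from both sides of the trigger. This is a toy model of the concept, not any vendor's implementation:

```python
import numpy as np

def frequency_mask_trigger(spectra, mask, pre=2, post=2):
    """Scan successive spectra (one per frame); fire on the first frame that
    exceeds the user-drawn mask at any frequency bin, and return that frame
    index plus a pre-/post-trigger record of the surrounding frames."""
    for k, frame in enumerate(spectra):
        if np.any(frame > mask):
            start = max(0, k - pre)
            return k, spectra[start:k + post + 1]
    return None, None   # no mask violation in the record

# Ten quiet frames, except frame 6 carries an aberrant burst in bin 3.
spectra = np.full((10, 8), 0.1)
spectra[6, 3] = 5.0
mask = np.full(8, 1.0)                     # flat limit line
hit, record = frequency_mask_trigger(spectra, mask)
```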

Real-time spectrum analyzers, such as the Tektronix 3086, are well positioned to tackle 3G testing for the next few years at least. The road to 3G isn't likely to change with regard to overall specifications at this time, and 4G systems aren't projected to be deployed until the 2010 time frame.

One of the traditional swept-frequency spectrum analyzer's fortes is making adjacent channel power ratio (ACPR) measurements for 2G and 3G systems. Improvements in these instruments for the communications sector over the next few years will likely occur in the area of ACPR measurement speed, which has been a bottleneck so far. This problem is already being addressed by such instruments as the recently announced Rohde & Schwarz FSP3 and FSP7 spectrum analyzers, which employ dedicated hardware processing for ACPR tests. These new devices shrink ACPR test time from today's norm of up to several seconds to 100 ms or less.
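The ACPR measurement itself reduces to comparing the power in the main channel with the power one channel spacing away. Here's a simplified sketch, assuming an idealized power spectral density on a uniform frequency grid (the channel shape and leakage level are invented for illustration):

```python
import numpy as np

def band_power(freqs, psd, f_center, bw):
    """Sum power spectral density over a bw-wide channel (uniform grid)."""
    df = freqs[1] - freqs[0]
    band = (freqs >= f_center - bw / 2) & (freqs < f_center + bw / 2)
    return np.sum(psd[band]) * df

def acpr_db(freqs, psd, f_carrier, bw, offset):
    """Adjacent channel power ratio in dB: power leaking into the next
    channel relative to the main channel (more negative = cleaner)."""
    main = band_power(freqs, psd, f_carrier, bw)
    adjacent = band_power(freqs, psd, f_carrier + offset, bw)
    return 10 * np.log10(adjacent / main)

# Synthetic 1.25-MHz-wide CDMA-like carrier with -20-dB sidelobe leakage.
freqs = np.linspace(0.0, 10e6, 10001)
psd = np.where(np.abs(freqs - 3e6) < 0.625e6, 1.0, 0.01)
ratio = acpr_db(freqs, psd, f_carrier=3e6, bw=1.25e6, offset=1.25e6)
```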

In the next few years, both real-time and swept-frequency spectrum analyzers will likely perform new measurements to accommodate new standards. Designers may find themselves asking for a spectrum analyzer with a specific "personality" that's optimized for making certain measurements.

Beyond that, the measurement scene in this area takes a different twist. The industry is moving toward software radio and the ability to program a DSP that, for instance, can cover both cdma2000 and UMTS. This progress may place new demands on test instrumentation.

Ross Nelson, Tektronix's worldwide business development manager, Communications Business Unit, describes it this way. As you're building a software radio, the most important job is to correctly program the DSP. How do you program and then debug your DSP in real time as you watch how it performs on the air interface?

If you have a spectrum analyzer that can acquire in real time, you can time-correlate that spectrum acquisition to a logic analyzer that's decoding the DSP output. Then you can actually trace in real time the DSP code as it affects what you're seeing on the air interface. That would allow you to debug your DSP using the air interface as the actual signal that you're analyzing. So, this is a case where a real-time spectrum analyzer plays a major role.

Let's move on to more specialized communications equipment for 2G and 3G systems, such as mobile-phone testers, base-station testers, and protocol analyzers. In the next few years, the major challenge for manufacturers of this type of equipment will be providing designers with flexible tools. As standards twist and turn on the road to 3G, designers need to be able to incorporate these changes quickly, easily, and cost-effectively into existing 3G test equipment.

If 3G is right around the corner, can 4G be far behind? Fourth-generation wireless is probably too far out on the timeline to be on designers' minds right now. The reason why isn't so much because of what 4G might be, as it is because of what 3G could be. I don't think anybody today really knows everything that 3G could deliver. The killer application for 3G could be dozens of different things. Until 3G has really been deployed and the bandwidth is available for people to use, it's tough to truly anticipate all of the applications that might take off. For the next five years, at least, there's going to be plenty of work just trying to tap out 3G's potential.

Digital TV's Onslaught
The nationwide deployment of DTV, including high-definition TV (HDTV), started in November of 1998. It's going to continue well into the future until something better comes along. Remember, in this country, there is supposed to be a complete transition from analog to digital TV by 2006. Most of the equipment that had to be built to test these systems was designed over the past few years.

The specialized test equipment in this area consists of tools to generate the signals needed to test HDTV boards and systems. Several companies make signal generators for this purpose; some specialized, others modular. Designers need such equipment as a reference source to test high-definition processing gear, check for distortions, and fully stimulate the circuitry.

If there is a change in direction in the DTV world, test-equipment producers would have to respond. Manufacturers of modular systems would have to roll out a new module for their main platform. Is a change in the standard possible this year or next? Maybe. Late last year, the Sinclair Broadcast Group (SBG) amassed compelling evidence indicating that the current 8-VSB DTV standard does not provide reliable over-the-air service through simple antennas in complex, dynamic multipath conditions, such as those found in urban areas.

Dealing With DTV Blank Outs
Talk about annoying glitches. If an analog TV system has bad reception, you simply get a fuzzy picture. A DTV screen, though, can blank out completely if there's too much missing data. SBG believes the use of coded orthogonal frequency-division multiplexing (COFDM) technology, which reigns in Europe, would solve these reception problems. The FCC hadn't ruled on this matter yet at the time of this writing, but it doesn't seem likely to me that any significant change will occur soon.

Another hot DTV area is protocol testing. Currently available test instruments can test and qualify MPEG-2, digital video broadcast (DVB), and ATSC (Advanced Television Systems Committee) equipment and services. Over the next few years, I expect to see only incremental enhancements in this kind of equipment, since standards such as MPEG-2 should be in force for the foreseeable future.

Looking ahead to the video future, it appears that there will be an ever-increasing number of video delivery methods and mechanisms in the marketplace. One of these will definitely be the Internet. In this case, there needs to be a way to get the video into some packetized form to put on the net. All of that certainly takes a lot of software, but hardware is needed to actually come up with the signal. Test-equipment manufacturers will need to provide the tools to test these new hardware and software offerings.

Global networking and the Internet are sure to enjoy tremendous growth and change in the next few years. Currently, voice and data traffic are about equal. But from now on, data traffic is projected to grow exponentially. Measurement will be a key to driving this growth.

To provide high capacity, network operators have been deploying fiber-based core networks, many of which employ dense-wavelength-division multiplexing (DWDM). Speeds on the core presently range from 2.5 to 10 Gbits/s. The elements of a core DWDM link consist of transmitters, wavelength-shifting elements, a multiplexer, a demultiplexer, receivers, and an optical path with an optical amplifier (see "Structure And Testing Of A Typical DWDM Link," below). To verify system performance and assure quality of service (QoS), measurements and tests must be performed on each of these blocks (Fig. 2).

Designers need to be able to examine very low-level optical signals (less than −10 dBm). They also require improved jitter performance in their equipment as data rates increase to 10 Gbits/s and beyond. These necessities should spur test-equipment manufacturers to point their high-end gear, such as scopes and logic analyzers, to solving these problems. It will be interesting to see how general test equipment, like DSOs, possibly transforms into more specialized equipment to serve high-growth industries such as this one.
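A quick reminder of what those optical power levels mean: dBm is decibels referenced to 1 mW, so the sub-minus-10-dBm signals mentioned above carry less than 100 microwatts.

```python
def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts: P_mW = 10**(dBm/10)."""
    return 10 ** (dbm / 10)

# The -10-dBm optical level mentioned above works out to:
print(dbm_to_mw(-10))   # 0.1 mW, i.e. 100 microwatts
```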

Also, a clock signal isn't always available to trigger on in these systems. Some kind of clock-recovery mechanism, then, is necessary. Last September, Agilent Technologies fired the first shot by announcing its integrated clock-recovery and optical-reference receiver, the HP 83490 series. When used with Agilent's high-bandwidth oscilloscopes, the new series provides a stable recovered clock for all standard telecommunication and data-communication rates up to 2.5 Gbits/s. This kind of instrument will certainly develop in the next few years to handle ever-increasing core-network speeds.

Along with this increase in physical interface bandwidth, the data paths, or parallel interfaces, inside the switching equipment will continue to grow. This will create a demand for higher-performance test equipment, notes Bob McClung, general manager of Agilent Technologies' Digital Design Product Generation Unit. "Higher data-path rates will require higher bus rates with either single-ended or differential voltages. Being able to sample these new voltage levels with a logic analyzer will be critical for the future," McClung says. "Also, innovative new measurements will be needed for more efficient system-level debug of the networking equipment."

Accommodating The Standards
As in the other areas of communications, standards play a big role in high-speed networking. Instruments need to be flexible enough to support multiple standards. Many instruments already can respond quickly to change, since software has played a big part in new test equipment for some time now. Usually as standards are finalized, a test-equipment company only has to provide a new software module to its users to upgrade its equipment. This has worked well in the past, and it should continue to be the way manufacturers tackle this problem for the foreseeable future.

Just as communications affects every area of our business and personal lives, so too is it affecting the way designers interact with test equipment. Last November, Agilent Technologies announced its Center of Technology for Connectivity. As part of the announcement, Agilent detailed its overall test and measurement connectivity strategy.

Its three main goals are to move proprietary test and measurement standards toward computer-based standards, provide intrinsic "one-click" access to data so customers can do their jobs quickly and easily, and supply instruments that integrate into an intranet/Internet environment. While Agilent and other test and measurement companies have addressed these kinds of issues before, the announcement signals the importance connectivity will command during the next several years.

We've seen many test and measurement products embrace connectivity simply by including an Ethernet connection. This small improvement gives designers a great deal of utility for transferring large files, printing screens, and so forth. The trend of including Ethernet connectivity as a standard feature on test equipment should accelerate dramatically over the coming years.

Other computer-based connectivity standards that will continue to grow in the near future are USB and IEEE-1394 (FireWire). By this time next year, many if not all off-the-shelf PCs will have built-in USB and IEEE-1394 ports, which will increasingly spur test-equipment manufacturers to include one or both of these interfaces in new test equipment.

One-click access to data refers to the ability of designers to bring information from instruments quickly and seamlessly into programs such as Microsoft Excel and Word, as well as to specialized analysis programs. Designers not only need to acquire data, but they also must analyze and present it. And with such awesome computing power on the desktop, it makes sense to provide designers with easy ways to get their data into PC analysis and presentation programs.

Agilent took a big step early last year when it added web-server capability to its HP 16600 and 16700 line of logic analyzers. In addition to the ability to view and operate these analyzers remotely, designers can instruct the instrument to perform operations, such as sending an e-mail when the analyzer triggers on an elusive glitch.

Keeping Pace With Technology
As technologies evolve in the years to come, the challenge for test and measurement companies is to stay one step ahead of the market. To do this, top-tier test-equipment manufacturers must continue to take things into their own hands. In other words, these companies must be able to design their own high-speed amplifiers and high-speed analog-to-digital converters (ADCs), since they cannot readily purchase, say, 10-Gsample/s ADCs. If recent announcements are any indication, test-equipment manufacturers are up to the challenge. And, hopefully, we'll see a future that's as glitch-free as can be.
