It's not easy to satisfy a voracious appetite for bandwidth. But that's still the goal for designers of optical networks. Using cutting-edge technologies, such as dense and ultra-dense wavelength-division multiplexing, they're adding more and more capacity to optical fiber. At the same time, designers are increasing the data rates through that fiber. If the old saying is true, and "you can't build what you can't test," this should be a recipe for disaster. But apparently it's not.
The past few weeks and months have seen a flurry of test gear aimed at helping designers create the latest and greatest in optical-communications systems. The hottest areas of activity are communications analyzers, bit-error-rate testers, optical spectrum analyzers, and other types of equipment for testing state-of-the-art optical components, boards, and systems operating at 10 Gbits/s and beyond.
Some of the younger test-equipment companies are building a number of devices, including load modules, bit-error-rate testers, and protocol analyzers, to test fiber networks. Ixia Communications, for example, recently introduced its newest load module at the ComNet show.
"We use optics for testing packet over SONET and for testing Gigabit Ethernet over optical-fiber connectors," says Paul Mallinder of Ixia. "Our equipment uses the different physical-layer media, whether they be single mode or multimode. On top of that, we run different framing, either Gigabit Ethernet or packet over SONET."
The backbone of the Internet consists of routers that typically have OC-12, OC-48, and OC-192 interfaces connected to them. Those routers are responsible for switching IP packets. Ixia's packet over SONET (POS) testing really exercises that backbone infrastructure. "We're testing to see what the devices under test do to things like latency. So when we transmit packets and receive them, we're measuring the latency that the DUT induces on that flow of traffic. Also, the jitter."
Three metrics are important in the test, according to Mallinder: delay; jitter, which is the variation of delay on a packet-by-packet basis; and packet loss. As the device under test heads toward line speed, OC-12 or OC-48, what happens in terms of packet loss?
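The arithmetic behind these three metrics is straightforward. Here's a minimal sketch, with hypothetical packet records and field names (the article doesn't describe Ixia's internal data format), that computes them from matched transmit and receive timestamps:

```python
# Sketch: compute latency, jitter, and loss from per-packet timestamps.
# The packet records and field names here are hypothetical illustrations.

def traffic_metrics(sent, received):
    """sent/received: dicts mapping packet ID -> timestamp in seconds."""
    delays = [received[pid] - sent[pid] for pid in sent if pid in received]
    latency = sum(delays) / len(delays)            # mean one-way delay
    # Jitter: variation of delay on a packet-by-packet basis.
    jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / max(len(delays) - 1, 1)
    loss = 1.0 - len(delays) / len(sent)           # fraction of packets lost
    return latency, jitter, loss

sent = {1: 0.000, 2: 0.001, 3: 0.002, 4: 0.003}
received = {1: 0.0102, 2: 0.0114, 3: 0.0121}       # packet 4 was dropped
latency, jitter, loss = traffic_metrics(sent, received)
```

One sweep over the trace yields all three numbers, which is why a load module can report them in real time as it pushes traffic toward line rate.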
The company's number one customer is Cisco Systems, so it's testing the Cisco 12000 products. "We're not testing the fiber itself," says Mallinder. "We're not doing bit-error-rate tests on the fiber. What we're actually doing is testing above the physical layer—the link layer, network layer, and transport layer that's sitting above those fibers."
Ixia's products also perform error injection, which introduces errors into the protocol running over the fiber. Different bit errors can be induced in the frame to see what happens to the device under test. "We do both destructive testing and proactive performance testing," states Mallinder. "For testing purposes, we generate traffic."
Typically, designers will benchmark the switches in a laboratory within a controlled test environment, then deploy them into the Internet. Once the equipment is deployed, they use the test gear in a very controlled way to make sure they don't adversely affect normal Internet operation.
The product announced by the company at ComNet in January is called the IXIA 100 QoS Performance Tester. With this test tool, designers can understand what happens to quality of service over the Internet. It doesn't blast the fiber lines at full frame rates. Instead, different streams each use a very small part of that bandwidth, and the tool checks how well the Internet differentiates its services among those streams.
Designers are creating algorithms to prioritize traffic through a switch. After those are designed, they need to be tested. "What our equipment shows is the degree to which they work," says Mallinder. "Do they work as they should? It gives designers the ability to rectify their mistakes before moving to the manufacturing stage of the product life cycle."
The latest product handles OC-48, whereas earlier products targeted OC-12 and OC-3. Aside from handling the higher speed, it can test router functionality. One of the routing protocols in the backbone of the Internet is BGP4 (Border Gateway Protocol 4). "If you just have OC-48, it's great," states Mallinder. "We can fully blast the line and see what happens. But the whole key is that you need to be able to set up different routes within the Internet and then check, as you send traffic through those routes, when you send and when you receive. This verifies that the routes have been established correctly."
Fiber-optic designs do boast other hot areas, too—one being Fibre Channel systems. Companies like Finisar Corp., Ancot Corp., and FuturePlus Systems make Fibre Channel protocol analyzers and other kinds of equipment to test these systems.
Recently, Finisar introduced an interesting product called GT-SANmetrics. It's a software package that takes Finisar's protocol-analyzer data traces and presents them in a way that a user would want to look at them. For example, how much traffic is on this link? How much traffic is going to specific devices? How are the devices responding to the traffic? What does the latency in a SCSI environment do to a read or write to a drive? As such, SANmetrics provides the link between the actual performance of a network experienced by the user and the detailed Fibre Channel traffic collected by a protocol analyzer.
John Adam, director of instrument marketing at Finisar, says that SANmetrics has another application from the user perspective. Often, the user will see a symptom of the problem. The user then asks the manufacturer or system provider, "What's going on here?" To determine the problem, people bring in protocol analyzers and capture data. Then they have two choices: They can stare at the details of the traffic that a protocol analyzer gives, or they can run SANmetrics and obtain a profile of what's going on from a higher-level point of view.
Asked about potential problems for the designer as Fibre Channel moves from 1 to 2 Gbits/s, Adam states that the world seems to believe it will be a simple transition. He thinks there is a problem, though: the physical layer (the fiber-optic cable, the connectors, the whole infrastructure associated with this move) is going to become critical. "Eye patterns at 1 Gbit/s have to be open about 500 ps to get valid data. When you double the speed, you're down at 250 or 300 ps."
So in Adam's view, the whole idea of testing cables and interconnects and making sure the physical layer is good is going to become very critical. "It's just something that designers who are looking at implementing systems need to be aware of," he says. "It's critical in copper, but it's also true in multimode fiber optics."
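The shrinking budget Adam describes falls straight out of the unit interval, the time allotted to one bit. A minimal sketch (the half-interval eye requirement is taken from his ~500-ps figure at 1 Gbit/s, not from a standard):

```python
# Sketch: how the eye-opening budget shrinks as the line rate doubles.

def unit_interval_ps(bit_rate_gbps):
    """Time per bit in picoseconds: 1000 ps divided by the rate in Gbits/s."""
    return 1e3 / bit_rate_gbps

ui_1g = unit_interval_ps(1.0)   # 1000 ps per bit at 1 Gbit/s
ui_2g = unit_interval_ps(2.0)   # 500 ps per bit at 2 Gbit/s

# If roughly half the unit interval must remain open for valid data
# (Adam's ~500-ps figure at 1 Gbit/s), the same fraction at 2 Gbit/s
# leaves only ~250 ps -- so every picosecond of dispersion and jitter
# consumes twice the share of the budget.
eye_budget_1g = 0.5 * ui_1g     # ~500 ps
eye_budget_2g = 0.5 * ui_2g     # ~250 ps
```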
Multimode Fiber Optics
Multimode fiber launches light down the cable as a broad spectrum of wave fronts. "You may launch a good square wave at one end of a cable," states Adam. "When you look at it at the other end, because of the multiple path lengths of the multimodes, that eye may have collapsed to near zero."
So a cable may be good according to an optical power meter, which says that the fiber is continuous. It may be good with an OTDR, in that there are no kinks in the fiber and nothing to send power back to the receive end. But according to Adam, you can't pump data through it because of this eye collapse that comes from multimoding in the fiber-optic cable.
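The scale of the collapse can be estimated with the textbook step-index modal-delay approximation (the formula and the index values below are standard illustrations, not figures from the article):

```python
# Sketch: why a "good" multimode cable can still collapse the eye.
# Textbook step-index modal-delay approximation (not from the article):
#   delta_t ~= (n1 * Delta / c) * L
# n1: core refractive index, Delta: fractional index difference,
# L: fiber length in meters.

C = 3.0e8  # speed of light, m/s

def modal_spread_ps(n1, delta, length_m):
    """Worst-case spread between fastest and slowest modes, in ps."""
    return n1 * delta * length_m / C * 1e12

# Typical illustrative values: n1 = 1.48, Delta = 1%, 100-m run.
spread = modal_spread_ps(n1=1.48, delta=0.01, length_m=100.0)
# ~4900 ps of pulse spreading over 100 m -- an order of magnitude wider
# than the 500-ps unit interval at 2 Gbit/s, so the received eye closes
# even though power-meter and OTDR checks pass.
```

Graded-index fiber does far better than this worst case, but the point stands: continuity tests say nothing about dispersion.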
According to Adam, "That's something that our customers, our early adopters moving to 2 Gbits/s, are finding to be a real headache. I think it's going to be a real headache for the industry."
Of course, the older test-equipment guys are in the fiber-optic network area, too. A couple of products recently introduced by Tektronix are on two new platforms designed to service the requirements of faster and faster communications speeds. One is the OTS9000 SONET/SDH analyzer and the other is the CSA8000 communications signal analyzer. Both instruments employ a Windows-based interface.
The "newness" behind the OTS9000 is really the platform upon which it sits. It's capable of testing speeds of 10 Gbits/s and higher. The card-modular instrument exploits the CompactPCI architecture. Therefore, when they need to, users can just plug in new cards.
The Tektronix CSA8000 sampling oscilloscope replaces a platform that's about 10 years old (Fig. 1). "We had just gotten to the point with the old product where we had bumped into technological issues," says Jim Roth, worldwide business development program manager at Tektronix. "It served us well for a very long time, but the market is moving forward very quickly and we needed to bring a platform out that could address the future requirements of designers for the next five to eight years."
The company received a lot of feedback from its customers about what they wanted to see in an instrument. Performance was clearly the foremost requirement. After that was integration. For optical tests, many elements of the test system are integrated into the platform between the mainframe and the modules. The CSA8000 boasts a user-configurable modular architecture and a variety of optical plug-in modules that support conformance testing to multiple standards.
"With the CSA8000, you have a lot of filters and reference receivers, as well as clock recovery and power monitors, which historically have been things that you hooked on the outside of the box," according to Roth. "This gave you really great flexibility, but it made the instrument very difficult to configure."
According to Roth, that's very important in a manufacturing environment. Before, when changing the data rates that you were testing to, you'd have to change hardware. Now everything is integrated into the mainframe and modules, so changing data rates and your test environment is very easy. It's done through a menu instead of by swapping out hardware.
Better Jitter Performance
The company did achieve some breakthrough performance elements as well. In particular, RMS long-term jitter performance improved by two orders of magnitude. Some designers need to trigger on an event and view a signal, for instance, 100 µs out. The new instrument displays that signal with two orders of magnitude better stability. "Somebody who wants to look at a full SONET frame can actually trigger at the beginning of a frame and then delay out and search throughout that frame for whatever particular elements they're looking for in there," Roth says.
Many automated measurements have been added to the platform by Tektronix. These include both standards and other measurements that designers like to see. Standards testing is a requirement. Basically, the performance has to be there in order to test to the standards. As speeds increase, though, the user must consider the error from the device versus the error from the test equipment. Tektronix has reduced the error contributed by the test equipment.
Aside from standards testing, designers like to look at a lot of other measurements, like Q factor, says Roth. They use different measurements to do quick and dirty analysis. "We've integrated a lot of that into the new platforms," he states.
By working with the standards bodies, Tektronix hopes to improve the standards. As the test equipment becomes capable of performing new measurements, the company pushes to get those added to the standard and also works to improve the specifications within a standard.
According to Todd Baker, the company's worldwide business development program manager, "If the test equipment is able to improve by orders of magnitude on a particular measurement, we're pushing to improve the standard—SONET in the U.S. and SDH in Europe—at the 10-Gbit/s level and at the optical level."
Businesses deploying Gigabit Ethernet and Fibre Channel, as well as homes using cable and xDSL modems, are driving the demand for 10-Gbit/s speed, Baker says. "These are order-of-magnitude improvements in bandwidth at the business and at the home. That's driving demand all the way up into the core of the network. So the demands are changing with equipment designed for the home, the central office, the metro and access areas, and then all the way up into the core."
Aside from increasing data rates, equipment manufacturers are boosting the channel count. One of the keys in testing is to be able to test high channel count. "Naturally, you're not going to go out and buy 32 or 64 products and stick them into a test system," says Baker. "It just becomes very unmanageable."
Dense-wavelength-division multiplexing (DWDM) is increasing the number of transmitters, the number of receivers, and the complexity of multiplexing and demultiplexing. The industry is moving in steps from four and 16 channels to as many as 132 channels.
Equipment manufacturers seek modular test solutions in which they can put multiple receivers on one side and a single transmitter on the other. Tektronix's OTS9000 was the result of this industry shift toward high channel count. "That's something that we saw happening several years ago, so we've been reacting over time," says Baker.
Agilent Technologies, a company with a new name but a long history of providing test solutions, has just introduced a high-speed oscilloscope that's Windows-based. This is a complete platform change from the company's previous high-speed digitizing oscilloscope, but plug-in modules from that last platform all remain compatible with the new one. The mainframe, called the Infiniium DCA model 86100A digital-communications analyzer, includes the high-speed scope. The personality of the platform's mainframe is dictated by the plug-in modules. Some modules have purely electrical inputs, like a scope with either 20 or 50 GHz of bandwidth. Other modules are all optical, either multimode or single mode. Various rates exist for Gigabit Ethernet, Fibre Channel, SONET, and ATM.
"The performance parameters have not changed much," says Mike Resso, product manager for time-domain reflectometry. "It's mostly the user interface." He believes the Windows-based interface makes things easier for the designer trying to really take advantage of the power of the equipment.
Another important piece of test equipment in the optical arena is the optical spectrum analyzer (OSA). Two have been announced recently: one by Agilent and the other by Advantest.
Agilent's 86140 family of OSAs was originally introduced at the Optical Fiber Conference (OFC) last year (Fig. 2). At this year's OFC held earlier this month, the company introduced the B version of those instruments. The Advantest Q8384 OSA, also displayed at OFC, was announced last year.
Jerry Chappell, product marketing manager for OSAs at Agilent, says there are three markets driving optical-spectrum-analyzer demand right now. The first is system test, the second is testing the individual modules of the system, and the third is testing the active and passive optical components that make up the module. An emerging fourth area uses an OSA to look at an individual 2.5-Gbit/s or 10-Gbit/s signal.
In the past, designers really didn't have to measure amplitude versus wavelength. But when the industry moved to WDM, all of a sudden multiple channels, lasers, or colors were all on the same fiber. The analog version of this is the multiple cable TV channels on a single cable.
To characterize the system, designers will use an OSA to look at the individual channel spectra. They want to measure where each channel is, its wavelength, and the signal's power. In addition, they will measure the noise that's in between those channels—signal to noise. Very similar measurements also are performed in both RF and microwave.
As channels are placed closer and closer together on the fiber, the measurement becomes a challenge. The receiver, in this case the OSA, must have outstanding dynamic range. The instrument needs a wide enough bandwidth to fit all of the modulation within a particular filter shape. But it also requires enough rejection so that the designer isn't measuring part of the adjacent signal when trying to measure the noise. Dynamic range is vital, as it provides the ability to accurately measure the signal and yet accurately characterize the noise between the two channels.
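The signal-to-noise figure itself is a simple ratio in dB: channel peak power over the noise floor measured between channels. A sketch on synthetic spectrum data (the wavelengths and power levels below are made-up illustrations):

```python
import math

# Sketch: the signal-to-noise measurement described above, on synthetic
# spectrum data. Wavelengths in nm, powers in mW; all values hypothetical.
wavelengths = [1550.0, 1550.2, 1550.4, 1550.6, 1550.8]
power_mw    = [1.00,   0.001,  0.95,   0.001,  1.05]  # peaks, noise between

def osnr_db(signal_mw, noise_mw):
    """Optical signal-to-noise ratio in dB."""
    return 10.0 * math.log10(signal_mw / noise_mw)

# Channel at 1550.4 nm against the noise floor measured between channels:
ratio = osnr_db(power_mw[2], power_mw[1])
```

Note that both numbers come from the same sweep, which is exactly why the OSA's own dynamic range and filter rejection bound how low a noise floor it can report.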
According to Chappell, another important aspect of OSAs is channel-analysis software. That software automatically characterizes channels for the user. When designers had three or four channels, it was easy enough to put markers on them and perform manual measurements to make sure they were where they were supposed to be. However, with channel counts at 80 channels or higher, users don't want to move a marker trying to manually measure all 80 signals.
"This is done automatically by firmware in the box," says Chappell. "It's an application provided with the spectrum analyzer that automatically characterizes all the channels and tells you where they are. The step beyond that, which we're working on, is to tell you whether or not they meet or exceed their specifications." An OSA is used by the designer at all stages of the design, but only as a receiver. "We refer to it as a single-port measurement," notes Chappell.
As mentioned, OSAs can measure active and passive components. This is more of a stimulus-response measurement. For passive components, someone puts a stimulus into the device and measures its response. "The stimulus used to be a white-light source," says Chappell. "People would just take a halogen bulb and it would provide 900- to 1700-nm wavelengths at fairly low power. Now, edge-emitting LEDs are available in all of the areas of interest. We've got products that let you put four of them together. That allows you to go from 1450 to 1650 nm."
OSAs are starting to look at an individual 2.5- or 10-Gbit/s signal. Right now, though, you can't resolve the individual sidebands. There isn't enough resolution; in other words, the OSA's filter isn't fine enough. Some companies are trying to address this issue.
"Some designers would like to see an OSA with enough resolution that you could go in and look at the actual sidebands of the modulated signal," states Chappell. "We're a couple of years away, I think. The lasers right now are not stable enough to do the downconversion."
Another tough measurement is trying to determine the lasers' line width. A typical CW laser has a line width of under 100 kHz. Measure it with an OSA, and at best that measurement is the resolution bandwidth of the instrument itself. For the best one out there, that's on the order of 10 picometers (1.2 GHz). "Users sometimes think they're measuring line width," says Chappell. "They're just not aware that what they're really measuring is the limitation of their own measurement receiver."
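The mismatch is easy to quantify by converting the OSA's resolution bandwidth from wavelength to frequency with the standard relation delta_f = c * delta_lambda / lambda^2 (the 1550-nm center wavelength below is an assumption; the article gives only the 10-pm figure):

```python
# Sketch: converting an OSA's resolution bandwidth from wavelength to
# frequency, which shows why a 10-pm filter cannot resolve a 100-kHz
# laser line width: delta_f = c * delta_lambda / lambda^2.

C = 2.998e8  # speed of light, m/s

def rbw_ghz(delta_lambda_pm, center_nm=1550.0):
    """Resolution bandwidth in GHz for a filter width given in pm."""
    lam = center_nm * 1e-9
    return C * (delta_lambda_pm * 1e-12) / lam**2 / 1e9

rbw = rbw_ghz(10.0)
# ~1.25 GHz at 1550 nm -- more than four orders of magnitude wider than
# a 100-kHz line, so the "measured" width is really the instrument's
# own filter shape, just as Chappell says.
```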
He points out another development that's unfolding: A broadband source and an OSA can be used to measure passive components. But users will run into a limitation at some point. They'll have fixed power density with a broadband signal, but they need to measure narrow devices. The more a designer narrows the bandwidth to measure very fine components, the less light makes it to the photodetector.
As resolution is made finer and finer on the spectrum analyzer, the power available to measure becomes lower. But the noise floor stays the same, so the measurement range starts to shrink. As filters become narrower and narrower, then, it's becoming more difficult to make certain types of measurements. "There's a shift occurring from a wavelength-selective receiver with a broadband source to a wavelength-selective source with a broadband receiver," according to Chappell.
In optics, one-by-n devices remain very popular. Demultiplexers can be 40 channels, 80 channels, and in the future, maybe 100 or 200 channels. In the past, those were measured with a broadband source, a spectrum analyzer, and a switched chassis. What designers are doing now, Chappell says, is taking a tunable laser source, putting it into the demultiplexer, and then hanging inexpensive power meters on the output of every single channel.
With a single sweep of a tunable laser source, every power meter records data at trigger points. At the end of one sweep, all n channels are characterized, whatever n may be. As device capacity is raised from 20 to 40 to 80 channels, more hardware is needed, but the total test time remains the same. "The throughput is amazing, and that's something that is just now making it into manufacturing," says Chappell. "It's a different solution from a spectrum analyzer."
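The scaling argument can be put in two lines of arithmetic. A sketch (the sweep time is an arbitrary illustrative number, and the switched-chassis model is simplified to one sweep per port):

```python
# Sketch: why the swept-source approach scales with channel count.
# Numbers are illustrative; the switched model is simplified.

def swept_test_time_s(sweep_s, n_channels):
    """All n power meters record in parallel during one laser sweep."""
    return sweep_s                   # independent of n_channels

def switched_test_time_s(sweep_s, n_channels):
    """Old approach: broadband source, OSA, and a switch visiting each
    output port in turn -- one measurement pass per channel."""
    return sweep_s * n_channels

# Doubling the demultiplexer from 40 to 80 channels doubles the
# switched-chassis time but leaves the swept time unchanged:
t_swept_80 = swept_test_time_s(5.0, 80)       # still one 5-s sweep
t_switched_80 = switched_test_time_s(5.0, 80) # 80 passes
```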
A lightwave measurement system like the one just described is manufactured by an Agilent division in Germany. It's a tunable laser source and a chassis for power-meter modules. The user can slide in n number of power-meter modules and trigger them all to make measurements at the same point. All of the data is dumped back to a PC, and the results are shown on the screen.
Chappell expects optical dispersion measurements to become a huge area of interest. This includes both chromatic dispersion and polarization-mode dispersion, commonly referred to as CD and PMD. As data rates increase from 2.5 and 10 Gbits/s to 40 Gbits/s, CD can potentially limit the transmission distance. So designers are characterizing not only the fiber, but every component that goes into the system for dispersion.
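The back-of-the-envelope for why CD bites at 40 Gbits/s uses the standard delay-spread formula delta_tau = D * L * delta_lambda (a textbook relation, not from the article; the fiber and span values below are typical illustrative numbers):

```python
# Sketch: textbook chromatic-dispersion delay estimate (not from the
# article): delta_tau = D * L * delta_lambda, with D in ps/(nm*km),
# L in km, and delta_lambda (signal spectral width) in nm.

def cd_spread_ps(d_ps_nm_km, length_km, delta_lambda_nm):
    return d_ps_nm_km * length_km * delta_lambda_nm

# Typical illustrative case: standard single-mode fiber near 1550 nm
# (~17 ps/nm/km), an 80-km span, and a 0.1-nm signal width:
spread = cd_spread_ps(17.0, 80.0, 0.1)
# ~136 ps of spreading. At 40 Gbit/s the unit interval is only 25 ps,
# so this one span closes the eye; at 2.5 Gbit/s (400-ps interval)
# the same spreading is tolerable -- hence the push to characterize
# every component for dispersion as rates climb.
```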
Right now, this kind of testing is accomplished with a tunable laser source, an RF network analyzer, and a novel test set. That matches a product recently announced by Agilent: the 86037C chromatic-dispersion test system.
Getting back to oscilloscopes, though high-speed sampling oscilloscopes are needed to test the fastest communications signals, high-end general-purpose DSOs still can test slower optical signals. LeCroy Corp. introduced a system about a year and a half ago for testing optical signals. It looks at OC-3/STM1 and OC-12/STM4 optical telecom systems.
Called the MT03 Mask Test Kit, it lets the user do all of the things that are normally done with an oscilloscope. In addition, the incoming signal can be taken in and compared to a standardized test mask. The OE325, which is part of the kit, performs the very important job of converting the optical signal to an electrical signal—not an easy device to design. "That's a problem I think every test-equipment manufacturer has, converting the optical signal to an electrical signal," says Mike Lauterbach, director of product management at LeCroy.
The kit also contains a reference receiver that performs the filtering with a four-pole Bessel-Thomson filter, as specified by the standard. After conversion and filtering, you can look at these signals as if they were any other electrical signal.
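The mask comparison at the heart of such a kit reduces to checking whether any waveform sample lands inside a forbidden region of the eye. A minimal sketch (the mask geometry below is a made-up illustration, not the actual OC-3/OC-12 mask):

```python
# Sketch: the essence of a mask test -- flag any waveform sample that
# falls inside a forbidden region. The mask geometry here is a made-up
# illustration, not the real SONET/SDH mask.

def mask_violations(samples, forbidden_regions):
    """samples: (time, amplitude) pairs with time normalized to one unit
    interval. forbidden_regions: (t_min, t_max, a_min, a_max) boxes."""
    hits = []
    for t, a in samples:
        for t0, t1, a0, a1 in forbidden_regions:
            if t0 <= t <= t1 and a0 <= a <= a1:
                hits.append((t, a))
                break
    return hits

# A central "keep-out" box in the middle of the eye:
mask = [(0.35, 0.65, 0.3, 0.7)]
samples = [(0.1, 0.05), (0.5, 0.95), (0.5, 0.5), (0.9, 0.02)]
bad = mask_violations(samples, mask)   # the (0.5, 0.5) sample violates
```

A real instrument applies this test after the optical-to-electrical conversion and Bessel-Thomson filtering described above, so the samples being checked reflect the standardized receiver, not the raw detector output.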
Paul Fowler, product manager for communications at LeCroy, notes that as you move up the line from OC-3 and OC-12 to OC-48 and beyond, the speed gets high enough that you start moving away from the realm of a general-purpose oscilloscope. If you want to see signals in the time domain, you end up in the sampling-oscilloscope arena.
A lot of the research is being done at the higher-speed data rates in the R&D labs. But it takes some time for that technology to migrate its way down to an actual product that's offered to customers, Fowler points out.
"What we're seeing right now is that although there is a lot of work being done at OC-48 and speeds faster than that, there's a tremendous amount of interest at this point in OC-12 and its counterpart in Europe, STM4, which is going into production at this point," says Fowler. "We have a lot of demand for our products in production environments, particularly in STM4."