The Evolution of Optical Transceiver Test

Long ago, verifying the performance of a digital communications system was straightforward. The entire network was installed and owned by a single company, and if the system worked, extensive testing of the subcomponents was unnecessary. But in today's optical networks, built from components produced by many different vendors, compliance testing must ensure that system-level specifications are met when all the components are connected, no matter where they came from.

Basically, an optical communications system consists of a transmitter, a fiber channel, and a receiver. When a transmitter is paired with a receiver through a fiber and the desired bit-error-ratio (BER) is not achieved, is the transmitter at fault? Or, is it the receiver? Perhaps both are faulty.

A poor-quality receiver can be compensated for by a high-performing transmitter and vice versa. When a standard communications system such as Ethernet or Fibre Channel is designed, device specifications are determined so that the cost/performance burden is properly balanced at both ends of the fiber. Specifications are set so that any receiver will interoperate with the worst-case allowable transmitter, and any transmitter will provide a signal of sufficient quality to interoperate with the worst-case allowable receiver.

Transmitter/Receiver Testing

To set the transmitter specifications, it is important to know how good or bad the receiver and channel will be; the same holds in reverse for the receiver specifications. A starting point for this iterative process might use performance levels from an earlier-generation system.

When a new standard is being developed, it likely will use components that have yet to be fully developed since performance levels usually make a significant jump in the next-generation system. This can present some difficulty since the component developers may be reluctant to make large investments in new components until a standard and the subsequent component specifications become stable. This can lead to a slow and tedious process as the standard and the components evolve together.

Nevertheless, a transmitter must be able to operate with the lowest performing receivers and channels deemed acceptable. If a receiver needs a minimum level of power to achieve the system BER target, this will be used to determine the minimum allowed output power of the transmitter. If the receiver can only tolerate a certain level of jitter, this will be used to define the maximum acceptable jitter from the transmitter. In a digital communications system, the receiver makes logic decisions on each incoming bit so the shape of the transmitter waveform also must be specified.

Key requirements for the transmitter are based on the receiver needing a wide separation between logic 1s and logic 0s. The receiver also needs consistency in the time location of the transitions between logic 1s and 0s. Together, these help ensure that decisions are made where there is little chance for a mistake.

An eye diagram is the common method to view the transmitter waveform. In the eye diagram, the various combinations of data patterns are superimposed on a common time axis usually less than two bit periods in width. Good amplitude separation and low jitter are seen as an open-eye diagram.
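As a rough illustration of how an eye diagram is formed, the sketch below folds a long sampled waveform onto a common time axis two unit intervals wide. The sample rate and window width are illustrative assumptions, and a real instrument aligns the folding to a recovered or supplied clock rather than simply counting samples.

```python
import numpy as np

def fold_into_eye(waveform, samples_per_ui, window_uis=2):
    """Map every sample of a long waveform onto a common time axis that is
    window_uis unit intervals wide, as an oscilloscope does when it builds
    an eye diagram. Returns (time in UI, amplitude) pairs."""
    n = np.arange(len(waveform))
    t_ui = (n % (samples_per_ui * window_uis)) / samples_per_ui
    return t_ui, np.asarray(waveform, dtype=float)
```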

Rather than make several measurements to determine the quality of the eye, it is possible to do this in one test. The openness of an eye diagram can be verified by performing an eye mask test.

A mask consists of several polygons that are placed in and around the eye diagram, indicating areas where the waveform should not exist. A good waveform will never intersect the mask and will pass the mask test; a bad waveform will cross or violate the mask and fail (Figure 1).

Figure 1. Mask in Eye Diagram Verifies Eye Opening

The oscilloscope will determine if any waveform samples fall on the mask. Most industry standards define a transmitted signal as noncompliant if any waveform sample violates the mask.
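In software terms, the check reduces to counting samples that land inside the mask polygons. The sketch below uses matplotlib's point-in-polygon test on a single central polygon with made-up vertices; an actual standard defines the mask shapes and coordinates precisely.

```python
import numpy as np
from matplotlib.path import Path

# Illustrative central mask polygon in (time in UI, normalized amplitude);
# the vertices are placeholders, not a standard's mask coordinates.
CENTRAL_MASK = Path([(0.25, 0.5), (0.40, 0.8), (0.60, 0.8),
                     (0.75, 0.5), (0.60, 0.2), (0.40, 0.2)])

def count_mask_hits(t_ui, amplitude):
    """Return the number of waveform samples that violate the mask."""
    pts = np.column_stack([t_ui, amplitude])
    return int(np.count_nonzero(CENTRAL_MASK.contains_points(pts)))
```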

This brings up an interesting problem with eye mask testing. Compared to a small number of waveforms, a larger population of waveform samples should provide a more accurate assessment of the transmitter performance. However, every eye diagram will have random characteristics in both amplitude (noise) and time (jitter).

As more data is collected and compared to the mask, the likelihood of mask violations increases. In theory, if enough samples are acquired, eventually almost any transmitter will fail a mask test. The probability for mask violations may be very low, but mask testing generally does not account for probabilities. It is an all-or-nothing test, pass or fail.

This raises a practical question: How many samples must be acquired to perform the test?

At a minimum, enough samples must be taken to align the mask to the waveform. Typically, a small population will be sufficient.

How much more data should be acquired? If there is significant space between waveform samples and the mask, it is likely that collecting more data will only slightly change the result. The problem occurs when a device barely passes the mask test. Collecting more samples could lead to a failure.

Some recent industry standards, such as IEEE 802.3ah and IEEE 802.3aq, recognize the problem and allow mask failures to occur. However, only a very small ratio of hits to samples is allowed: one failed sample for every 20,000 samples measured. This permits a large population to be collected, possibly yielding a more accurate measurement without an increased likelihood of failure. Other new standards also are considering allowance of a small percentage of failed samples.

This allowance leads to a modified version of the classic eye mask test. The eye mask test is performed as usual, but a second step takes place once the required number of samples has been acquired. An automatic eye-mask margin test is performed to determine how far the mask dimensions can be expanded before the ratio of mask hits to total waveform samples exceeds the ratio allowed by the standard.

In Figure 2, the mask dimensions are expanded until 20 hits are encountered. There were 400,000 samples between the crossing points of the eye to which the mask is aligned. The mask can be expanded significantly from the standard dimensions, indicative of a very good transmitter.

Figure 2. Automatically Magnified Mask to Verify a Compliant Hit Ratio
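A minimal sketch of the margin search described above might grow the central mask polygon about its centroid until the hit ratio exceeds the allowed 1 in 20,000. The polygon, step size, and search cap below are illustrative assumptions.

```python
import numpy as np
from matplotlib.path import Path

# Placeholder central mask polygon in (time in UI, normalized amplitude)
BASE_MASK = np.array([(0.25, 0.5), (0.40, 0.8), (0.60, 0.8),
                      (0.75, 0.5), (0.60, 0.2), (0.40, 0.2)])

def mask_margin(t_ui, amplitude, allowed_ratio=1 / 20_000, step=0.01):
    """Largest fractional expansion of the mask that still keeps the
    hit ratio at or below allowed_ratio."""
    pts = np.column_stack([t_ui, amplitude])
    centroid = BASE_MASK.mean(axis=0)
    margin = 0.0
    while margin < 1.0:                       # arbitrary search cap
        trial = centroid + (1.0 + margin + step) * (BASE_MASK - centroid)
        hits = np.count_nonzero(Path(trial).contains_points(pts))
        if hits / len(pts) > allowed_ratio:
            break
        margin += step
    return margin
```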

IEEE 802.3aq has been designed to transmit signals as fast as 10 Gb/s over channels where dispersion dominates system performance. The modal dispersion for installed runs of multimode fiber with distances in the 200-m to 300-m range can completely close the eye as seen at the system receiver.

Advanced communications techniques will be required to overcome the large signal dispersion problem. Receivers will use equalization schemes to compensate for the impairments caused by the channel. However, this complicates the definition of what an acceptable transmitter might be. Eye-mask testing may prove to be meaningless if the eye at the output of the channel is closed no matter what the signal quality is going into the channel.

A new test approach being used in 802.3aq verifies whether it is possible to equalize a transmitter signal after it has passed through the worst-case expected fiber spans. The transmitter waveform is captured and run through a virtual channel model that simulates actual fibers.

This signal then is passed through a virtual finite-length equalizer. The quality of the equalized signal is compared to that of the same signal passed through an ideal equalizer. The virtual receiver is sophisticated, using both linear and decision-feedback equalizers with several signal taps.

Even though the virtual channel and virtual receiver models are complex, the actual test process can be performed within the oscilloscope. Rather than an eye diagram, a long waveform pattern is captured.

A good example is a 511-bit pattern with at least seven samples for each bit. The waveform record is passed through an on-board MATLAB® script that models the fiber and receiver. The output of the script is a power penalty value in decibels that can be compared to the specifications defined in the IEEE standard to verify compliance. The test commonly is known as the transmitter waveform dispersion penalty (TWDP) test.
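The sketch below shows the general shape of such a calculation, greatly simplified and not the IEEE 802.3aq reference code: an oversampled pattern is passed through an assumed fiber impulse response, equalized with a least-squares feed-forward equalizer, and the resulting SNR loss relative to the unimpaired case is reported as a penalty in decibels. The pattern, impulse response, noise level, and tap count are all illustrative assumptions.

```python
import numpy as np

def penalty_db(bits, samples_per_bit=7, n_taps=14, noise_rms=0.05, seed=0):
    """Simplified dispersion-penalty-style calculation (illustrative only)."""
    rng = np.random.default_rng(seed)
    bits = np.asarray(bits)
    tx = np.repeat(2.0 * bits - 1.0, samples_per_bit)          # ideal NRZ waveform
    fiber = np.array([0.15, 0.4, 0.8, 1.0, 0.8, 0.4, 0.15])    # assumed fiber impulse response
    fiber /= fiber.sum()

    def equalized_snr(rx):
        rx = rx + rng.normal(0.0, noise_rms, rx.size)          # assumed reference noise
        centers = np.arange(len(bits)) * samples_per_bit + samples_per_bit // 2
        rows, targets = [], []
        for c in centers:
            lo, hi = c - n_taps // 2, c + n_taps // 2
            if lo < 0 or hi > rx.size:
                continue
            rows.append(rx[lo:hi])
            targets.append(tx[c])
        X, d = np.asarray(rows), np.asarray(targets)
        w, *_ = np.linalg.lstsq(X, d, rcond=None)              # least-squares FFE taps
        err = X @ w - d
        return np.mean(d ** 2) / np.mean(err ** 2)

    snr_reference = equalized_snr(tx)                                   # no fiber
    snr_dispersed = equalized_snr(np.convolve(tx, fiber, mode="same"))  # through fiber
    return 10.0 * np.log10(snr_reference / snr_dispersed)

# Random 511-bit stand-in for the PRBS9 test pattern
pattern = np.random.default_rng(1).integers(0, 2, 511)
print(f"penalty = {penalty_db(pattern):.2f} dB")
```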

The receiver test boils down to achieving an acceptably low BER with the worst-case signal from the fiber at the end of the network. Historically, the telecommunications industry dealt with issues such as fiber dispersion by engineering each link. Network designers used sophisticated models of the individual elements to determine an optimum mix of components to place at various points within the network.

Transmitters and receivers were specified and individually tested under ideal conditions. Rather than accounting directly for various types of signal degradations that occur in the network, testing under ideal conditions meant that system-level performance was assured only when individual devices passed tests by a significant margin. Since system configuration changes were not allowed, this was acceptable.

Stress Testing

As systems became more complex, faster, and even cheaper, it became necessary to test receivers using nonideal signals with different types of impairments. This is commonly referred to as a stressed signal.

The first types of stressed signals that gained wide acceptance in high-speed fiber-optic communications were for long-haul applications where the use of erbium-doped fiber amplifiers (EDFAs) was common. The amplified spontaneous emission (ASE) noise could cause excess errors on receivers that otherwise performed well when tested under typical ideal conditions.

Vendors of long-haul network equipment required test signals intentionally degraded by ASE noise. These tests still are in wide use today.

As the use of inexpensive high-speed data links increased, cost requirements demanded interchangeable components. The philosophy for high-speed optical communications testing was changed significantly by the 1- and 10-Gb/s Ethernet standards.

IEEE 802.3z (1 Gb/s) published in 1998 and IEEE 802.3ae (10 Gb/s) published in 2002 introduced receiver testing with degraded signals. Both standards dictate a well-defined combination of impairments that shall be added to an ideal signal presented to the receiver under test.

The degradations emulate the worst-case noise and intersymbol interference (ISI) that might occur in a system. This approach to real-world testing can be considered as the evolutionary portion of these standards.

The receiver must properly recover signals that are degraded by expected worst-case performance of the transmitter and fiber. For example, 802.3ae specifies a test signal with both amplitude and timing impairments. The degraded signal eye shows closure relative to a clean signal due to intentional timing jitter and amplitude interference. This process was given the name stressed eye testing (Figure 3).

Figure 3. A Clean Eye Converted to a Stressed Eye. Both eyes are bandwidth limited by a 7.5-GHz Bessel-Thomson filter.

Receiver BER measurements are straightforward. Producing a distorted signal also is easy. However, producing a precision distorted signal can be difficult. If the signal distortion is too small, a bad receiver may appear good. If the distortion is too severe, a good receiver may appear bad.

Modern BER test systems (BERTs) have pattern generators designed to produce calibrated levels of distortion or stress. A stressed signal generator typically will have the capability to produce a data signal that has precise amounts of timing jitter, ISI, and attenuation (Figure 4).

Figure 4. Block Diagram of a BERT Pattern Generator
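To make that signal path concrete, the sketch below applies the kinds of impairments Figure 4 implies (sinusoidal timing jitter, bandwidth-limiting ISI, added noise, and attenuation) to an ideal NRZ pattern. All parameter values are illustrative, not 802.3ae calibration numbers.

```python
import numpy as np

def stressed_signal(bits, samples_per_ui=32, sj_amp_ui=0.1, sj_cycles_per_ui=1 / 400,
                    isi_taps=(0.25, 0.5, 0.25), noise_rms=0.02, attenuation=0.8, seed=0):
    """Apply illustrative stress (jitter, ISI, noise, attenuation) to an NRZ pattern."""
    rng = np.random.default_rng(seed)
    bits = np.asarray(bits)
    t = np.arange(len(bits) * samples_per_ui) / samples_per_ui   # time in UIs
    # Sinusoidal jitter: perturb the time axis before sampling the bit pattern
    jitter = sj_amp_ui * np.sin(2 * np.pi * sj_cycles_per_ui * t)
    bit_idx = np.clip(np.floor(t + jitter).astype(int), 0, len(bits) - 1)
    wave = 2.0 * bits[bit_idx] - 1.0
    # ISI: smear the edges with a short FIR filter (stand-in for the
    # intentional bandwidth limiting and interference)
    wave = np.convolve(wave, np.asarray(isi_taps), mode="same")
    # Attenuation plus amplitude interference/noise
    return attenuation * wave + rng.normal(0.0, noise_rms, wave.size)
```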

Most optical receivers use some form of clock recovery to extract a timing reference for the decision circuit. Clock recovery systems generally have a limit on how fast jitter can be and still be tracked and tolerated by the receiver.

To verify proper operation of the clock recovery system, 802.3ae requires that receivers tolerate both low-frequency and high-frequency jitter. A jitter tolerance template, consisting of specific jitter frequency/amplitude pairs, is applied to every receiver. While the standard requires receivers to be compliant for all jitter frequency/amplitude pairs on the jitter tolerance curve, the typical manufacturing compliance test focuses on the highest frequency case since receivers tend to easily tolerate low-frequency jitter.
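In practice, the compliance check amounts to comparing the jitter amplitude a receiver tolerated at each template frequency against the template's required value. The sketch below uses made-up frequency/amplitude pairs rather than the actual 802.3ae mask numbers.

```python
# Illustrative jitter tolerance template: (jitter frequency in Hz, required
# sinusoidal jitter amplitude in UI). Placeholder pairs, not the 802.3ae values.
TEMPLATE = [
    (40e3, 5.0),
    (400e3, 0.5),
    (4e6, 0.1),
]

def receiver_tolerates(measured_tolerance):
    """measured_tolerance maps jitter frequency -> largest sinusoidal jitter
    amplitude (UI) at which the receiver still met the target BER.
    The receiver passes if it meets or exceeds the template at every point."""
    return all(measured_tolerance.get(freq, 0.0) >= required
               for freq, required in TEMPLATE)

# Example: a receiver tested at each template frequency
print(receiver_tolerates({40e3: 6.1, 400e3: 0.7, 4e6: 0.15}))   # True
```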

802.3ae was well received within the optical communications industry, and other high-speed serial optical standards like 10G Fibre Channel adopted this concept of testing with a stressed eye. A receiver test process that better reflects real system conditions has helped to ensure the interoperability of network components.

Similar to transmitter test, the next significant evolution for receiver test emerged in 2006 with IEEE 802.3aq, an amendment to IEEE 802.3. Again, signal distortion through multimode fiber can result in a signal at the receiver that is very difficult to interpret.

The distortion on the fiber is mainly an ISI type of jitter. A 10-Gb/s signal traversing a multimode fiber of 100-m length may experience so much distortion that the received signal becomes unrecognizable. The impulse response of such a fiber can be much wider than 1 UI, and the resulting ISI causes high error rates using a conventional receiver. Electronic compensation circuitry in the receiver makes it possible to reverse the effects of dispersion in the channel and recover the data encoded in the transmitted signal.

How can a receiver be tested to ensure it is capable of operating with such a distorted signal? The distorted signal must be recreated in the test environment. To create this new type of stressed signal, a clean test signal first is run through a low-pass Gaussian filter to slow down the rising and falling edges. The purpose of this filter is to emulate the limited bandwidth of a typical transmitter. Wideband noise then is added to emulate the effects of a noisy transmitter.

The ISI generator is a sophisticated circuit that splits the signal into four paths with different delays and then recombines them with different amplitudes, simulating the different modal groups traveling along the multimode fiber. Three sets of amplitudes have to be evaluated. A compliant receiver operates down to the required stress-sensitivity power limit and still shows a BER below 10⁻¹².
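A minimal sketch of that split-delay-recombine structure follows, with assumed path delays and three assumed weight sets standing in for the stressor definitions in the standard.

```python
import numpy as np

# Assumed path delays (in UI) and three assumed amplitude sets; placeholders,
# not the 802.3aq stressor definitions.
PATH_DELAYS_UI = [0.0, 0.25, 0.55, 0.95]
WEIGHT_SETS = [
    [0.5, 0.2, 0.2, 0.1],
    [0.3, 0.4, 0.2, 0.1],
    [0.2, 0.2, 0.3, 0.3],
]

def isi_stress(wave, samples_per_ui, weights):
    """Split-delay-recombine emulation of multimode modal dispersion."""
    out = np.zeros(len(wave))
    for delay_ui, weight in zip(PATH_DELAYS_UI, weights):
        shift = int(round(delay_ui * samples_per_ui))
        out += weight * np.roll(wave, shift)
    return out

# Each of the three weight sets produces a different stressed waveform:
# stressed = [isi_stress(wave, 32, w) for w in WEIGHT_SETS]
```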

In summary, testing transceivers has evolved to accommodate the changes in system architecture and performance. As data rates increase, the basic eye-mask test still is viable, but important changes have shown up in recent standards, are gaining in popularity, and likely will be used in future standards. Receiver testing using complex stress signals also is becoming more common.

About the Authors

Greg Le Cheminant is a measurement applications specialist for Digital Signal Analysis Products in the Digital Test Division at Agilent Technologies and represents the company on several industry standards committees. His 23 years of experience at Agilent/Hewlett-Packard include five in manufacturing engineering and 17 in various product marketing positions. Mr. Le Cheminant is a contributing author to four textbooks on high-speed digital communications, has written numerous technical articles on test-related topics, and holds two patents. He earned a BSEET and an MSEE from Brigham Young University. Agilent Technologies, Digital Signal Analysis Division, 1400 Fountain Grove Parkway, Santa Rosa, CA 95403, 707-577-6524, e-mail: [email protected]

Rainer Plitschka is an application specialist for bit error ratio testers at the Agilent Technologies Digital and Photonic Test Operation within the Digital Test Division. He has been with Agilent since 1979, working 15 years in R&D as a hardware engineer and in project management. In 1995, he joined marketing, holding various functions in sales development and product marketing. Mr. Plitschka received a diploma in microwave technology from the University of Karlsruhe, Germany. Agilent Technologies R&D and Marketing GmbH & Co. KG, Herrenberger Strasse 130, 71034 Boeblingen, Germany, e-mail: [email protected]

November 2008
