For major telecommunications carriers and service providers, it is time to move to 100G coherent optical transmission for long-haul and ultra-long-haul dense wavelength division multiplexing (DWDM). The rapid growth of high-bandwidth applications like on-demand video and cloud-based services makes carrying capacity an absolute top priority. For now, the solution is to squeeze more spectral efficiency out of existing fiber. To accomplish this, carriers are looking beyond conventional on-off keying (OOK) to coherent dual-polarization quadrature phase-shift keying (DP-QPSK), quadrature amplitude modulation (QAM), and many variations of orthogonal frequency-division multiplexing (OFDM).
With coherent lightwave techniques moving from R&D to manufacturing and production deployment, many development laboratories still content themselves with in-house coherent receivers and analysis software, often coupled to adaptive equalizers, to maximize eye openings under all conditions. While this approach is important for receiver development, it frequently misses critical sources of signal distortion in the transmission system, and the slow response of these setups can prevent rapid identification of the root causes of failure.
It is therefore crucial to have characterized, calibrated, and repeatable analysis methods and equipment. However, many of the test standards needed to ensure interoperability, such as the eye masks commonly used for serial data communications systems, remain undefined for the current leader, the DP-QPSK format, which is sanctioned by the Optical Internetworking Forum (OIF). Nevertheless, testing to a specific Q factor can at least allow cross-checking of component performance. It is vital to devise a test strategy that can be adapted to accommodate different coherent modulation schemes as the various standards and technologies evolve over the next several years.
When designing and deploying coherent long-haul fiber transmission systems, it is important to ensure that the coherent optical transceiver achieves predictable bit-error-rate performance and repeatable Q factors. In this article, we look at test tools, in particular coherent lightwave signal analyzer technologies, which make it possible to discover and mitigate impairments in a physical design, whether in R&D, manufacturing, or deployment.
Understanding what went wrong
In any transmission system, the ability to determine “what went wrong” when a transceiver fails in production or in the field is critical to success. Unfortunately, the conventional method of direct detection is insufficient for the measurement of phase-modulated signals. For instance, a photodiode used in the traditional way, as a receiver set up to detect OOK or amplitude modulation, will respond with all ones when presented with a phase-modulated signal in which the optical carrier is modulated in phase but not in amplitude. As a result, conventional eye-diagram analyzers, which rely on direct detection, cannot recover the phase information needed to characterize such signals.
What is required instead is a coherent lightwave signal analyzer that derives phase information by mixing the incoming field or signal under test with a local laser that operates at a fixed wavelength. The coherent analyzer allows the user to look at the optical signal in the complex plane, meaning both the real and imaginary parts of the signal. This combination of amplitude and phase or vector describes the magnitude (amplitude) and angle (phase) relative to the reference signal produced by the local laser.
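To make this complex-plane view concrete, here is a minimal sketch (an illustration only, not any instrument's actual firmware) that assumes the four digitized outputs have already been reduced to in-phase (I) and quadrature (Q) sample streams and combines them into the complex field:

```python
import numpy as np

# Minimal sketch: combine digitized in-phase (I) and quadrature (Q)
# samples into the complex optical field. The magnitude is the amplitude
# and the angle is the phase relative to the local-oscillator reference.
def complex_field(i_samples, q_samples):
    field = np.asarray(i_samples, dtype=float) + 1j * np.asarray(q_samples, dtype=float)
    return np.abs(field), np.angle(field, deg=True)
```

For example, an I/Q pair of (0, 1) corresponds to unit amplitude at a 90° angle from the reference.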
As the transition from R&D to production takes place, the test system needs the ability to compensate for impairments. Direct-detected methods do this in a limited fashion in hardware while a coherent approach can do this in software by emulating the firmware used for phase and clock recovery, polarization resolution, and equalization. This also makes it possible to model the transmission channel and disentangle impairments to provide insights into what is causing bit-error rates. There is also a need at this point to develop test margins and strategies for quick problem identification and resolution.
The coherent test instrumentation can also be used in production environments to “turn knobs” to track down sources of errors. These include transmitter modulator-bias adjustments, loop tracking, source power and linewidth, and tuning of the lasers themselves, as well as modulator driver power and signal quality. Other potential sources of error and failure include receiver function thresholds and margins as well as the receiver hybrid calibration matrix on the optical paths. Once in the field, testing involves bias loop and feedback, source power measurements, tuning, and understanding thermal effects.
Coherent signal analyzers
With coherent detection, complex (quadrature) modulation with polarization diversity has the advantage of exploiting the entire electric field of the optical carrier. In addition to gains in spectral efficiency, access to these field quantities at the coherent receiver further enables mathematical filtering to compensate fully for impairments such as chromatic and polarization-mode dispersion. Signal quality can then be measured using metrics such as Q-factor or error vector magnitude (EVM) to characterize and debug transmitters, transceivers, transponders, lasers, modulators, and semiconductor devices.
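As one example of such mathematical filtering, chromatic dispersion can be undone in the frequency domain with an all-pass filter whose quadratic phase is the inverse of the fiber's. The sketch below uses the standard textbook form; the function name and parameter choices (dispersion in ps/(nm·km), length in km) are our own, not a particular product's API:

```python
import numpy as np

def cd_compensate(samples, fs, D_ps_nm_km, length_km, wavelength_nm=1550.0):
    """Frequency-domain chromatic-dispersion compensation (textbook sketch).

    samples: complex baseband field samples; fs: sample rate in Hz.
    """
    c = 299792458.0                # speed of light, m/s
    D = D_ps_nm_km * 1e-6          # ps/(nm*km) -> s/m^2
    L = length_km * 1e3            # km -> m
    lam = wavelength_nm * 1e-9     # nm -> m
    f = np.fft.fftfreq(len(samples), d=1.0 / fs)
    # Inverse of the fiber's accumulated quadratic phase
    h = np.exp(-1j * np.pi * D * lam**2 * L * f**2 / c)
    return np.fft.ifft(np.fft.fft(samples) * h)
```

Because the filter is all-pass, it removes the dispersion-induced phase without amplifying noise, which is why this compensation is practical in software at the receiver.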
A coherent or constellation analyzer includes a polarization diverse optical front-end together with a tightly integrated ultra-wideband and low-noise, real-time oscilloscope to digitize the output of four balanced photo receivers and process the result to recover the phase and clock. It then presents a stable constellation for both the x and y polarizations of the coherently modulated signals in the fiber.
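One illustrative piece of that phase-recovery processing is the classic fourth-power (Viterbi-Viterbi) estimator for QPSK, sketched below under simplifying assumptions (constellation points on the I/Q axes, constant phase offset, no frequency offset); real analyzer firmware is considerably more elaborate:

```python
import numpy as np

# Fourth-power (Viterbi-Viterbi) carrier-phase estimate for QPSK:
# raising the symbols to the 4th power strips the data modulation,
# leaving 4x the carrier-phase offset. Note the estimate is only
# unambiguous modulo 90 degrees (a pi/2 ambiguity remains).
def vv_phase_estimate(symbols):
    return np.angle(np.sum(np.asarray(symbols) ** 4)) / 4.0

def derotate(symbols, phase):
    return np.asarray(symbols) * np.exp(-1j * phase)
```

Running candidate algorithms like this in software against captured waveforms is exactly what lets the analyzer present a stable constellation from a free-running laser.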
The coherent analyzer enables a complete calibration from the fiber input to the electrical output to make sure that the hardware is golden. In other words, using an oscilloscope with the widest bandwidth, highest sampling rate, and greatest sensitivity, the coherent optical analyzer can completely calibrate and faithfully represent the optical field in the fiber. This includes calibration of the analog front-end path gains and phase angles, as well as frequency response and skew (path delays).
Let us examine the architecture of an optical analyzer (Fig. 1). Serving as the optical front end to the digitizer, the coherent reference receiver takes as input two single-mode fibers, one carrying the signal and the other the phase reference, or local oscillator (LO). In the receiver, the phase reference is evenly divided into both X and Y polarizations, and mixed with the signal in two branches, both I and Q. The four channels are transduced by balanced photodetectors into electrical outputs, which in turn feed into a real-time oscilloscope with enough bandwidth to capture the difference-frequency waveforms.
At the heart of this system is the digitizer. Given the high data rates available with fiber, it is important to have a digitizer with the highest accuracy and sensitivity along with the widest bandwidth available. Oscilloscope manufacturers are constantly evolving digitizer technology to address market needs. The latest models now provide more than 20 GHz of bandwidth and 50-GS/s sampling rates across four channels. For more performance, one may combine multiple oscilloscopes for more than 30 GHz of bandwidth and 100-GS/s sampling rates across four channels. The number of channels is important because full-field characterization of signals in fiber requires four channels: in-phase and quadrature for both X and Y polarizations.
The burst-mode channel data is then processed using software running on the oscilloscope (or an external computer) to extract the tributaries associated with the modulation scheme, report measurements on the results, and display the extracted signal in a variety of formats. These can include constellation diagrams of each polarization and eye diagrams of each tributary with associated Q-plots (Fig. 2). Such software offers numerous other ways to present the data, or users could create their own presentations using MATLAB.
Practical application of test strategies
When considering options during the transition from R&D to qualification and production, it is important to understand the distinction between a coherent instrument and a coherent receiver. We design a coherent instrument to have the widest bandwidth and calibrated performance possible. By contrast, we design a coherent receiver to have only the bandwidth necessary to give a certain bit-error rate at a given optical signal-to-noise ratio. Thus, the quality of the eye will be less than what can be depicted on a true calibrated instrument, and there will be no way to accurately determine test margins or to see the root cause of failures. The receiver’s role is to adapt and narrow bandwidth to eliminate as much noise as possible, making it a poor substitute for test instrumentation beyond simple pass/fail.
In a production environment, the coherent signal analyzer can be set up to perform a range of tests automatically, which saves time. Thanks to graphical user interfaces, a wide range of users with varying skill sets can learn and use the system. It also makes it possible to test different equalization and phase-recovery algorithms in a live, interactive environment to optimize performance. Further, the ability to understand the effect of bandwidth limitation at different points in the signal chain can help to reveal the points where these limitations are creating excess errors.
Impact of digitizer performance
One of the most important requirements for a coherent optical signal analyzer is accurately presenting on screen the actual signal traversing the fiber. We evaluate this by examining the sensitivity, linearity, and bandwidth of the digitizer. For a given coherent optical front end, the digitizing system is a critical variable in determining how accurately the measurement system functions.
Consider a 28-Gbaud single-polarization electrical signal captured using different digitizers (Fig. 3). In the diagram on the far left, the effects of bandwidth limitations in a scope with bandwidth of less than 20 GHz are apparent in the rounding of the eye and the lack of completely flat upper and lower rails. Granted, the eye is completely open and therefore this is an error-free measurement, but it is not a fully accurate representation of the incoming signal. As the performance improves, the quality of the eye improves as well. We essentially eliminate the limitations of bandwidth and frequency response with an acquisition taken at 33 GHz with a 100-GS/s sample rate, illustrating the importance of having an accurate and sensitive digitizer.
The advantage of a high-bandwidth digitizer when paired with a coherent analyzer is that it can reveal the source of limitations in an optical transmission system. Let us examine another view of bandwidth, in this case signal spectrum (Fig. 4). At 20 GHz, the limitations of the modulator and the modulator driver are not evident because the bandwidth of the oscilloscope itself has become part of the contributing limitations. However, at 33 GHz, the point where the signal drops off is now clearly visible because the bandwidth of the digitizer and optical receiver are no longer a factor.
Measuring Tx constellation imperfections
The coherent optical signal analyzer provides a wide range of measurements to get at the root cause of problems and to understand the various sources of impairments. One of the more useful measurements is to look at transmitter constellation imperfections, including error vector magnitude (EVM), Q-factor, and phase angle.
EVM is fundamentally an analog measurement of what will become a digital signal. Looking at the detected symbols and measuring their distance from the ideal symbol locations yields the error vector magnitude (Fig. 5). EVM may be reported as an average or as a function of time. An advantage of EVM is that you do not need to know the pattern, as is the case with Q-factor.
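A minimal RMS EVM computation for QPSK can be sketched as follows. Definitions of EVM normalization vary between standards; this sketch uses one common form (RMS error normalized to the constellation's RMS magnitude), and the nearest-point decision is what makes it pattern-independent:

```python
import numpy as np

# Ideal QPSK constellation on the diagonals (unit magnitude).
QPSK = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))

def evm_percent(received, ideal_points=QPSK):
    """RMS EVM in percent, one common definition (illustrative sketch)."""
    received = np.asarray(received)
    # Nearest ideal symbol for each sample -- no pattern knowledge needed
    nearest = ideal_points[np.argmin(
        np.abs(received[:, None] - ideal_points[None, :]), axis=1)]
    err_rms = np.sqrt(np.mean(np.abs(received - nearest) ** 2))
    ref_rms = np.sqrt(np.mean(np.abs(ideal_points) ** 2))
    return 100.0 * err_rms / ref_rms
```

For instance, symbols uniformly offset by 0.1 from a unit-magnitude constellation yield an EVM of 10%.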
On the other hand, Q-factors provide a much more realistic and nuanced view of the symbols (Fig. 6). With Q-factors, the system moves the decision threshold and counts the bit errors that result at each threshold setting. Q-plots, which are effectively inverted bathtub curves, indicate in-phase and quadrature components of a coherently modulated eye. The Q-plot provides not only the projected bit-error rates but also a singular measure of the eye quality as a Q-factor. This makes it possible to quickly determine if there are any impairments in bandwidth by optimizing Q.
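The relation between Q and projected BER can be shown with a simplified textbook estimate (not the analyzer's exact threshold-sweep method): Q is taken from the means and standard deviations of the two decision levels, and BER = 0.5·erfc(Q/√2) converts it into a projected error rate:

```python
import numpy as np
from math import erfc, sqrt

# Simplified Q-factor estimate from one tributary's decision samples.
# The real instrument sweeps the decision threshold and counts errors;
# this sketch uses the classic Gaussian-noise approximation instead.
def q_factor(ones, zeros):
    mu1, mu0 = np.mean(ones), np.mean(zeros)
    s1, s0 = np.std(ones), np.std(zeros)
    return (mu1 - mu0) / (s1 + s0)

def projected_ber(q):
    return 0.5 * erfc(q / sqrt(2))
```

The familiar rule of thumb that Q = 6 corresponds to a BER near 1e-9 falls directly out of this formula.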
Another measurement that points to root causes is constellation phase angle, which can be precisely read out using a calibrated instrument. Consider the case of a 76° phase angle between the in-phase and quadrature components of a modulated signal compared to the ideal 90° angle (Fig. 7). Such a signal would be error-free in a back-to-back scenario. However, as it becomes noise loaded, the error rate would likely increase more quickly than if the system were more precisely tuned.
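A toy model makes this quadrature-error case tangible. The sketch below (our own illustrative construction) generates QPSK points with the Q axis skewed toward the I axis: the points remain on the correct sides of the decision thresholds, so a back-to-back link is error-free, but the worst-case distance to the threshold shrinks, eroding noise margin:

```python
import numpy as np

# Toy model of IQ quadrature error: build QPSK points with the Q axis at
# iq_angle_deg instead of the ideal 90 degrees from the I axis. The
# constellation stays separable but loses margin to the decision thresholds.
def skewed_qpsk(i_bits, q_bits, iq_angle_deg=76.0):
    theta = np.radians(iq_angle_deg)
    i = 2.0 * np.asarray(i_bits) - 1.0   # map {0,1} -> {-1,+1}
    q = 2.0 * np.asarray(q_bits) - 1.0
    return i + q * (np.cos(theta) + 1j * np.sin(theta))
```

Comparing the 76° and 90° cases shows the worst-case real-axis margin dropping from 1.0 to about 0.76, which is exactly why the skewed signal fails sooner under noise loading.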
A coherent lightwave signal analyzer allows engineers to understand and optimize optical networks that use advanced modulation such as DP-QPSK. It is able to analyze the signal to measure constellation parameters, quadrature and modulator bias values, symbol masking, EVM, signal and phase spectra, BER, and Q vs. decision threshold. This range of accurate and repeatable analysis saves time and makes the system usable by field technicians or manufacturing engineers with a range of knowledge and background.
As 100G technologies transition from R&D to qualification and production, test automation enabled by coherent signal analyzers becomes increasingly important as well. Other capabilities of the signal analyzer include testing equalization and phase-recovery algorithms and the ability to understand the various effects of bandwidth limitation, whether at the transmitter, digitizer, or receiver.