Jitter And Its Measurements

Aug. 6, 2001

Jitter is the unwanted variation of a binary signal's leading and trailing edges. It occurs as the signal is processed or transmitted from one point to another. Most jitter is caused by noise picked up from a phase-locked loop (PLL) in the signal path. Jitter is also a time displacement, either periodic or random, of a signal's switching edges. You can think of it as the lengthening or shortening of one signal element, usually one bit time, in an NRZ binary signal. Jitter can also be considered a form of FM, and it produces an FM-like spectrum.

Whatever you call it, jitter is a nuisance for engineers. It's difficult to eliminate and measure. During the design of OC-48 and OC-192 Sonet systems with DWDM, or the new 1-Gbit/s and 10-Gbit/s Ethernet networks, jitter became a major focus due to the decrease in timing margins. Excessive jitter always increases the bit-error rate (BER) in the system. As a result, most serial data-communications systems have jitter standards that must be met to ensure robust performance and the quality of service (QoS) expected in today's networks.

The three critical jitter specifications are:

Jitter tolerance—the amount of jitter that a system can accommodate at its input. It's directly affected by the capture and tracking characteristics of the PLL in the clock and data recovery (CDR) circuit.

Jitter transfer—the amount of input jitter passed through to the output. It's a function of the PLL bandwidth (a numeric sketch of this low-pass behavior follows the list).

Jitter generation—the amount of jitter produced by the PLL itself. Low-bandwidth loops produce more jitter.
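
To make the bandwidth dependence of jitter transfer concrete, the short Python sketch below evaluates the closed-loop response of a generic second-order PLL, H(s) = (2ζωn·s + ωn²)/(s² + 2ζωn·s + ωn²). The loop natural frequency and damping values are illustrative assumptions, not figures from any standard; the point is only that input jitter below the loop bandwidth passes through at roughly unity gain while higher-frequency jitter is attenuated.

```python
# Jitter-transfer magnitude of a generic second-order PLL (illustrative values).
# Assumptions (not from the article): natural frequency fn = 4 MHz, damping 0.707.
import math

def jitter_transfer_mag(f_hz, fn_hz=4e6, zeta=0.707):
    """|H(j*2*pi*f)| for H(s) = (2*zeta*wn*s + wn^2) / (s^2 + 2*zeta*wn*s + wn^2)."""
    wn = 2 * math.pi * fn_hz
    w = 2 * math.pi * f_hz
    num = complex(wn**2, 2 * zeta * wn * w)
    den = complex(wn**2 - w**2, 2 * zeta * wn * w)
    return abs(num / den)

for f in (1e5, 1e6, 1e7, 1e8):
    gain_db = 20 * math.log10(jitter_transfer_mag(f))
    print(f"{f:>9.0f} Hz: {gain_db:6.2f} dB")
```

With the assumed 4-MHz loop, the sketch prints roughly 0 dB at 100 kHz and about -25 dB at 100 MHz: low-frequency input jitter is tracked and passed, high-frequency jitter is rejected.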

Most jitter results from internal noise produced by the PLL circuits, noise coupled through sensitive nodes such as the analog loop-filter connections, power-supply ripple, and coupling from surrounding circuits or equipment, especially clocks.

Jitter is measured in picoseconds peak-to-peak (ps p-p), in picoseconds rms, or as a fraction of the unit interval (UI). The p-p measurement states the maximum-to-minimum time deviation, usually in picoseconds. A jitter measurement can also be the p-p value averaged over a 30- or 60-s interval, or over, say, 10,000 cycles. Rms jitter is one standard deviation (σ) of the jitter distribution, which is typically Gaussian (see the figure). Jitter also is expressed as a fraction of the UI, or one bit time. For example, one UI at 10 Gbits/s is 100 ps, so a specification of 40 mUI means 4 ps.
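
As a quick sanity check on these unit conversions, the sketch below converts UI to time for an assumed 10-Gbit/s rate and estimates rms jitter as one standard deviation of a set of edge-timing errors; the sample values are hypothetical and only illustrate the calculation.

```python
# Converting between UI and time, and estimating rms jitter from samples.
# The 10-Gbit/s rate and the edge-error samples are illustrative assumptions.
import statistics

def ui_to_seconds(ui, bit_rate_bps):
    """One UI is one bit period, so time = ui / bit_rate."""
    return ui / bit_rate_bps

bit_rate = 10e9                                 # 10 Gbits/s -> 1 UI = 100 ps
print(ui_to_seconds(1.0, bit_rate))             # 1 UI   -> 1e-10 s (100 ps)
print(ui_to_seconds(0.040, bit_rate))           # 40 mUI -> 4e-12 s (4 ps)

# Rms jitter as one standard deviation of the edge-timing error (Gaussian case).
edge_errors_ps = [-1.8, 0.4, 2.1, -0.6, 1.2, -2.3, 0.9, -0.2]  # hypothetical data
print(statistics.pstdev(edge_errors_ps), "ps rms")
```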

Bellcore (now Telcordia) GR-1377-CORE, Issue 5, lists the jitter specifications that Sonet/SDH systems should meet. The typical specification for OC-192 is a maximum jitter of 100 mUI p-p; the rms value is approximately 1/10 to 1/8 of the p-p value, or around 10 to 12.5 mUI.
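
Applying the same arithmetic to the OC-192 numbers, and assuming the standard OC-192 line rate of 9.95328 Gbits/s (the article rounds this to 10 Gbits/s), the specification works out to roughly 10 ps p-p and 1 to 1.3 ps rms:

```python
# Converting the OC-192 jitter spec from UI to picoseconds.
# Assumes the standard OC-192 line rate of 9.95328 Gbits/s.
ui_ps = 1e12 / 9.95328e9          # one unit interval, ~100.5 ps
pp_spec_ps = 0.100 * ui_ps        # 100 mUI p-p  -> ~10 ps p-p
rms_lo_ps = pp_spec_ps / 10       # 1/10 of p-p  -> ~1.0 ps rms
rms_hi_ps = pp_spec_ps / 8        # 1/8 of p-p   -> ~1.3 ps rms
print(f"UI = {ui_ps:.1f} ps, p-p spec = {pp_spec_ps:.1f} ps, "
      f"rms = {rms_lo_ps:.2f} to {rms_hi_ps:.2f} ps")
```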
