What's the relationship between jitter, phase noise, and signal-to-noise ratio (SNR) in data converters?
Clock jitter is a problem in data conversion because it introduces uncertainty (noise) into the conversion process. Jitter in the time domain is equivalent to phase noise in the frequency domain. Phase noise spreads some of the clock's power away from its fundamental frequency.
This is significant because sampling is equivalent to multiplication (mixing) in the time domain, which corresponds to convolution in the frequency domain. Thus, the spectrum of the sample clock is convolved with the spectrum of the input signal. Because jitter appears as wideband noise on the clock, it shows up as wideband noise in the sampled spectrum as well. The sampled spectrum is periodic, repeating at multiples of the sample rate, so this wideband noise degrades the noise-floor performance of the analog-to-digital converter (ADC).
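A short simulation makes this degradation concrete. The sketch below (illustrative values, not from the text) samples a sine wave twice, once at ideal instants and once with Gaussian timing jitter, and compares the measured SNR to the jitter-limited prediction:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250e6       # sample rate, Hz (illustrative)
f0 = 100e6       # input tone, Hz
sigma_t = 1e-12  # rms aperture jitter: 1 ps (assumed value)
n = np.arange(200_000)

t_ideal = n / fs
t_jitter = t_ideal + rng.normal(0.0, sigma_t, n.size)

ideal = np.sin(2 * np.pi * f0 * t_ideal)
jittered = np.sin(2 * np.pi * f0 * t_jitter)

# Treat the difference as the jitter-induced noise and measure SNR.
err = jittered - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))

# Jitter-limited prediction: SNR = -20*log10(2*pi*f0*sigma_t)
theory_db = -20 * np.log10(2 * np.pi * f0 * sigma_t)
print(snr_db, theory_db)  # both near 64 dB
```

The simulated noise floor matches the analytical limit to within a fraction of a dB, even though no quantization or thermal noise was modeled.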
What equations can be used to analyze the effects of jitter/phase noise on converters?
To calculate the effect of phase noise on SNR, consider that a clock time delay is equivalent to a phase delay at a given frequency. In terms of noise power, the phase noise in rms radians, σφ, satisfies σφ² = ωclk² · σt², where σt is the phase jitter in rms seconds and ωclk is the clock frequency in radians/s. Thus, for any value of jitter error, a higher-frequency signal will have a greater phase error.
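A quick numerical check of this relationship (the 1 ps jitter and 100 MHz clock are assumed example values):

```python
import math

sigma_t = 1e-12            # rms jitter: 1 ps (illustrative)
f_clk = 100e6              # clock frequency: 100 MHz (illustrative)
omega_clk = 2 * math.pi * f_clk

# sigma_phi^2 = omega_clk^2 * sigma_t^2, so in rms radians:
sigma_phi = omega_clk * sigma_t
print(sigma_phi)  # ~6.28e-4 rad rms
```

Scaling f_clk to 1 GHz with the same 1 ps of jitter gives ten times the phase error, which is the point of the last sentence above: the same timing uncertainty costs more phase at higher frequencies.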
Phase noise defines the clock SNR by:

SNRclk (dB) = 10 log(1/σφ²)
Assume a simple case in which the bandwidth of the clock jitter falls into a single Nyquist zone, and exclude quantization noise and thermal noise. In single-carrier systems, then, the SNR of a signal at frequency f0, sampled with a jittery clock, is:

SNRsig (dB) = 10 log[1/(4π²σt²f0²)]
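This expression is easy to evaluate directly, and it can also be inverted to find the jitter budget that a target SNR allows. The helper functions and numbers below are illustrative, not from the text:

```python
import math

def snr_from_jitter(f0_hz: float, sigma_t_s: float) -> float:
    """Jitter-limited SNR in dB: 10*log10(1 / (4*pi^2 * sigma_t^2 * f0^2))."""
    return -20 * math.log10(2 * math.pi * f0_hz * sigma_t_s)

def jitter_budget(f0_hz: float, snr_db: float) -> float:
    """Maximum rms jitter (seconds) that still meets snr_db at frequency f0_hz."""
    return 1.0 / (2 * math.pi * f0_hz * 10 ** (snr_db / 20))

# A 100 MHz input sampled with 1 ps rms jitter is limited to about 64 dB;
# holding 74 dB at 100 MHz requires roughly 0.32 ps rms jitter.
print(snr_from_jitter(100e6, 1e-12))
print(jitter_budget(100e6, 74.0))
```

Note the steep trade-off: every doubling of the input frequency (or of the jitter) costs 6 dB of jitter-limited SNR.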