As with most things electronic, today’s oscilloscopes are getting faster, more accurate and less expensive. Long an “indicating” device on the design or repair bench, the mainstream digital storage oscilloscope (DSO) now offers high accuracy and remote control by a host computer, making it suitable for a range of measurement applications.
With the growing adoption of quality standards like ISO 9000, many companies are paying more attention to oscilloscope calibration and how it can affect product quality. Whether you buy an oscilloscope calibrator to do this work in-house or send these instruments to their respective manufacturers or a local calibration service, an understanding of the practical side of oscilloscope calibration today may save you time and money.
Typical Oscilloscope Calibrators
Dedicated oscilloscope calibrators are special signal generators that source leveled sine wave, fast rise-time pulse and time-mark signals. Until recently, the most commonly used instruments were individual, manually operated plug-in generators manufactured by Tektronix. The growing interest in oscilloscope calibration has spawned a new generation of integrated oscilloscope calibrators offered as an option in general-purpose calibrators. The Fluke 5500A is an example of this type of instrument.
Most oscilloscope calibrators available today can accommodate the largest cross section of the oscilloscope calibration workload—instruments up to 250-MHz bandwidth. With the switch to digital oscilloscopes and the availability of low-cost 500-MHz instruments, additional equipment and techniques are required to extend the capabilities of the oscilloscope calibration equipment already in use.
Warranted vs Typical Specifications
As with analog oscilloscopes, high-speed digital oscilloscopes have warranted specifications for DC vertical gain, bandwidth and time-base uncertainty. Conspicuously absent are specifications for rise time and aberrations. Unlike their analog predecessors, these parameters for digital oscilloscopes are stated as “typical” or “calculated.”
This variability is significant: for a 1-GHz oscilloscope, it works out to a rise time between 350 ps and 450 ps. To complicate matters, only the National Institute of Standards and Technology in the United States quantifies leading-edge rise-time aberrations, and even then only for rise times down to 1 ns, although work on faster pulses is ongoing.
While rise-time performance in high-speed digital oscilloscopes is generally not warranted, verification is a good idea, especially after a repair. By understanding what you are trying to accomplish, you can save tens of thousands of dollars in equipment expense.
A 1-GHz oscilloscope will provide a typical rise time of 0.35/bandwidth to 0.45/bandwidth, or 350 ps to 450 ps. Keep in mind that the pulse you see is a combination of the rise time of the pulse generator and the rise time of the oscilloscope. Those parameters need to be combined using a root-sum-square technique: the observed rise time equals √(t(generator)² + t(oscilloscope)²).
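As a quick sketch of the root-sum-square relationship, the calculation below shows what a 400-ps oscilloscope driven by a 150-ps pulse head would display, and how to back out the oscilloscope’s own rise time from a measurement (the specific values are illustrative):

```python
import math

def observed_rise_time(t_generator, t_scope):
    """Displayed rise time: root sum square of generator and scope rise times."""
    return math.hypot(t_generator, t_scope)

def scope_rise_time(t_observed, t_generator):
    """Back out the oscilloscope's own rise time from a displayed measurement."""
    return math.sqrt(t_observed**2 - t_generator**2)

# A nominally 400-ps (1-GHz class) oscilloscope driven by a 150-ps pulse head:
t_displayed = observed_rise_time(150e-12, 400e-12)
print(round(t_displayed * 1e12))  # -> 427 (ps)
```

Note that the generator’s contribution grows quickly as its rise time approaches the oscilloscope’s, which is why a source several times faster than the instrument under test is preferred.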
The fast-edge pulse is also used to check aberrations that may require adjustments should a bandwidth verification fail. One manufacturer of a popular 500-MHz DSO calls out a 175-ps, 240-mV peak-to-peak pulse requiring a high-performance pulse generator.
The verification requires that aberrations be within ±3% during the first 5 ns of the pulse. Is such a powerful piece of equipment really necessary? Figures 1 and 2 show the pulse response of this oscilloscope using an oscilloscope calibrator with a 150-ps pulse head and a 125-ps tunnel-diode pulser. Both sources provide similar response over the first 5 ns.
Equipment prices range from a few hundred dollars for the tunnel-diode pulser to more than $20,000 for a calibrator with the 150-ps pulse head. A word of caution, however. The characteristic shape of the pulse is dependent on the source, so you need to know what a “good” pulse looks like using a known-good oscilloscope. This allows you to separate the distortion of the calibrator from that of the oscilloscope.
Low-Frequency Vertical Gain
Unlike analog oscilloscopes, vertical gain in digital oscilloscopes is usually verified using a low-noise DC source. With oscilloscope accuracies on the order of 0.2%, such a source should have an uncertainty of about one-fourth of that, or 0.05%. The source also needs to be quiet down to 1 mV for the most sensitive ranges.
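The one-fourth figure above is the familiar 4:1 test uncertainty ratio; a trivial sketch of the arithmetic, assuming that 4:1 rule:

```python
# Required DC source uncertainty for a 4:1 test uncertainty ratio (TUR),
# using the 0.2% vertical-gain accuracy cited above.
scope_accuracy = 0.2          # vertical-gain accuracy, in percent
tur = 4                       # assumed test uncertainty ratio
source_uncertainty = scope_accuracy / tur
print(source_uncertainty)     # -> 0.05 (percent)
```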
Often a power supply monitored with a digital multimeter can be connected with a “T” to the oscilloscope input. An oscilloscope calibrator, although more expensive, simplifies matters by providing better impedance matching and by using attenuators to reduce noise.
As an added benefit, oscilloscope calibrators can source the 1-kHz square waves required for analog oscilloscopes. Although digital-oscilloscope calibration procedures may require only a DC voltage, a check using square waves can uncover other problems that might otherwise go undetected.
Bandwidth

Most engineers consider bandwidth the most important performance characteristic of an oscilloscope. It is defined as the frequency where the signal level drops by 3 dB (~29%). With 50-Ω input-impedance oscilloscopes, this is measured using a 50-Ω source. If the oscilloscope has a 1-MΩ input, a 50-Ω termination is used at the oscilloscope input. In either case, the calibration source outputs a sine wave with precisely controlled amplitudes over a wide frequency range.
The leveled sine-wave mode of today’s oscilloscope calibrators is, of course, a tailor-made solution for this test. These calibrators are designed to have sufficient flatness throughout their frequency range as well as the low voltage standing-wave ratio necessary to deliver the leveled sine wave to the oscilloscope. Since most of today’s oscilloscopes have 500-MHz bandwidth or less, mainstream oscilloscope calibrators with 250-MHz to 600-MHz bandwidth can be a practical solution as well.

But what about those few oscilloscopes with greater than 500-MHz bandwidth? Often, a synthesized RF signal generator can be a practical alternative. However, the typical RF signal-generator output-level accuracy of <1 dB (±11% in voltage) is insufficient by itself; the generator can be characterized at the required calibration points (frequency and level) using an external power meter. Be sure to use the best-quality Teflon cable you can find to minimize errors contributed by impedance mismatching.
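The dB figures used in these bandwidth and level specifications convert to voltage ratios with the standard 20·log₁₀ relationship; a short sketch:

```python
def db_to_voltage_ratio(db):
    """Convert a decibel figure to a voltage ratio (20*log10 convention)."""
    return 10 ** (db / 20)

# The -3 dB bandwidth point: the signal falls to ~70.8% (about a 29% drop).
print(round(db_to_voltage_ratio(-3), 3))               # -> 0.708

# A +/-1 dB generator level spec spans roughly -11% to +12% in voltage,
# which is why an uncharacterized RF generator is not accurate enough.
print(round((db_to_voltage_ratio(1) - 1) * 100, 1))    # -> 12.2
print(round((db_to_voltage_ratio(-1) - 1) * 100, 1))   # -> -10.9
```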
Effective Bits

Many manufacturers of digital oscilloscopes use a term called “effective bits” to describe the dynamic performance—nonlinearity and distortion—of the analog-to-digital converter (ADC). Although this is seldom a warranted specification, if it is specified at all, an effective-bits test can tell you a lot about the dynamic performance of the vertical section of a digital oscilloscope.
A sine-wave curve-fitting method is used to test effective bits. A sine wave is digitized and then compared to a perfect sine wave. Using an algorithm supplied by the manufacturer or other industry source, the effective-bits resolution is calculated. While a typical high-speed digital oscilloscope may use an 8-bit ADC, the actual resolution may be 4.5 bits to 6 bits depending on the dynamic response of the ADC.
To perform this test, you need a signal generator that can output a sine wave between 200 mV and 300 mV with better than -48 dB of distortion, assuming an 8-bit ADC. Many oscilloscope calibrators have sufficient performance to do this test. If an RF signal generator is used, external narrow bandpass filters may be required to remove harmonics and noise. Tests are normally done at one vertical gain setting, typically 0.05 V per division.
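A minimal sketch of the sine-fit method described above, using a three-parameter least-squares fit at a known frequency (the algorithm supplied by an oscilloscope manufacturer may differ, and all signal parameters below are illustrative):

```python
import numpy as np

def effective_bits(samples, freq, fs, nbits, full_scale):
    """Estimate effective bits: fit a sine of known frequency, then compare
    residual noise to an ideal quantizer's noise floor of q/sqrt(12)."""
    t = np.arange(len(samples)) / fs
    # Three-parameter fit: in-phase, quadrature and DC-offset terms.
    basis = np.column_stack([np.cos(2 * np.pi * freq * t),
                             np.sin(2 * np.pi * freq * t),
                             np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(basis, samples, rcond=None)
    rms_error = np.sqrt(np.mean((samples - basis @ coef) ** 2))
    q = full_scale / 2 ** nbits                  # one LSB
    return nbits - np.log2(rms_error / (q / np.sqrt(12)))

# Illustrative 8-bit digitizer: 250-mV p-p sine sampled at 1 GS/s.
rng = np.random.default_rng(0)
fs, freq, n = 1e9, 9.7e6, 4096
ideal = 0.125 * np.sin(2 * np.pi * freq * np.arange(n) / fs)
q = 0.25 / 256
clean = np.round(ideal / q) * q                          # ideal quantizer
noisy = np.round((ideal + rng.normal(0, 3 * q, n)) / q) * q
print(effective_bits(clean, freq, fs, 8, 0.25))   # near the full 8 bits
print(effective_bits(noisy, freq, fs, 8, 0.25))   # degraded, around 4 to 5
```

The noisy case illustrates how additive noise in the vertical path pulls an 8-bit converter down into the 4.5-bit to 6-bit range the article cites.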
Time-Base Accuracy

Most digital oscilloscopes feature more accurate time bases than their analog counterparts, typically between 10 ppm and 50 ppm, requiring a calibrator with a frequency accuracy on the order of 2.5 ppm. The method for verifying DSOs differs from that for analog oscilloscopes. But the comb-pattern time-mark generator commonly used with analog oscilloscopes can be used for DSOs as long as the rising edge of each mark is fast enough and signal jitter is low.
For digital oscilloscopes, apply a signal and use the delay time-base function to increase the resolution. For example, in Figure 3, a 10-ms period signal is applied. The oscilloscope is set for 10 ms per division and triggers on the signal’s leading edge. The horizontal sweep generator (time base) is delayed 10 ms.
In Figure 4, the time base is set to 100 ns per division and the signal is viewed with respect to the first trigger. In effect, the resolution is now 1 ppm, more than sufficient for most digital oscilloscopes. The fast leading edge of the time mark helps you to determine the crossover point when using the faster time base.
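The delayed-sweep arithmetic behind the example above can be sketched as follows, assuming the edge position can be read to about one-tenth of a division:

```python
# Resolution of the delayed-sweep time-base check (values from the example).
delay = 10e-3           # sweep delayed 10 ms after the trigger
time_per_div = 100e-9   # delayed sweep set to 100 ns per division
readable = 0.1          # assumed: edge position readable to 1/10 division

resolution = readable * time_per_div / delay   # fractional timing resolution
print(resolution * 1e6)                        # -> 1.0 (ppm)
```

At 1 ppm, the measurement comfortably resolves a 10-ppm to 50-ppm time-base specification.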
By understanding oscilloscope specifications—those that are warranted and those that are only typical—you can make more informed decisions about how you calibrate your higher-speed instruments and what calibration equipment is required. Often, with the proper precautions, you can use general-purpose equipment you may already have.
About the Author
Bob Myers is the product marketing manager for calibration at Fluke. He has held several marketing positions within Fluke over the past 10 years. Mr. Myers earned a B.S. from the University of Oregon School of Journalism. Fluke, P.O. Box 9090, Everett, WA 98206-9090, (206) 347-6100.
Copyright 1997 Nelson Publishing Inc.