Two of the most important features of a digital storage oscilloscope (DSO) are the length of its acquisition memory and the amount of RAM memory that can be applied to calculating answers from the raw data. In many cases, the amount of acquisition memory determines the fidelity with which the scope can record a signal.

But recording the signal is only the first step. The key to finding signal aberrations, characterizing circuit performance and making the wide variety of measurements that have made DSOs popular is the processing horsepower of the oscilloscope.

**Capturing a Signal**

The maximum time window that can be captured by a DSO using a sampling period Δt is:

time window = Δt × acquisition memory length

where the acquisition memory length is the number of samples that can be captured in the data acquisition memory.

Since the acquisition memory length is a fixed amount, the only way to capture longer time windows is to make the period between samples longer (**Figure 1**). For example, a scope with 100 ksamples of memory and a sampling period of 2 ns (500 MS/s) can capture a total time window of 0.2 ms at that sampling rate.

If you want to capture a 4-ms signal using 100 ksamples, you would have to stretch the sampling period to 40 ns per sample (25 MS/s), degrading the accuracy of timing measurements by a factor of 20 and losing many signal details. Any frequency above 12.5 MHz (one-half the sample rate) will be aliased.
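The arithmetic above can be sketched in a few lines. This is a minimal illustration; the function names are invented here, and the numbers come from the text's 100-ksample example.

```python
# Capture-window trade-off for a fixed acquisition memory length.

def time_window(sample_period, memory_length):
    """Maximum capture window = sampling period x acquisition memory length."""
    return sample_period * memory_length

def nyquist(sample_period):
    """Highest alias-free frequency: one-half the sampling rate."""
    return 0.5 / sample_period

# 100 ksamples at 2 ns/sample (500 MS/s) captures a 0.2-ms window:
window = time_window(2e-9, 100_000)

# Stretching the same 100 ksamples over 4 ms forces 40 ns/sample (25 MS/s),
# dropping the alias-free bandwidth to 12.5 MHz:
dt = 4e-3 / 100_000
print(window, dt, nyquist(dt))
```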

Many DSO users believe the ADC in a scope determines its sampling rate. They don’t realize the acquisition memory length also plays a vital role.

The state of the art for acquisition memory is 2 million sample points per channel. This means a full 4 ms can be captured at 2 ns per sample using 2 million points. Consequently, a scope putting 2 million points on a signal will give you 20 times better timing accuracy, a much better view of your signal and more usable bandwidth than one which uses 100 kpoints on the same signal.

**Signal Irregularities**

One of the prime purposes of an oscilloscope is to troubleshoot problems. The toughest problems are ones which occur infrequently. Scope vendors have been working hard to help engineers with this task.

One new scope has a chip set that can quickly acquire many triggers and display them in a color-persistence mode. Less frequent events come out in a different color than common events. But there are some limitations, beginning with the small number of analysis tools available. Also, only 500 points can be acquired per trigger by this chip set. This means the signal must have a short, simple shape or the sampling rate must be reduced to record long events, increasing the danger that signal details and glitches will be missed between samples.

Long memory can be used in a different way to attack this problem. Suppose the symptom is an occasional misbehavior of a clock. The nature of the problem is unknown so there is no prior knowledge that would allow you to set up a special trigger based on amplitude, rise time, width, etc.

You can simply use auto trigger, acquire 2 million samples of continuous clock data (per trigger) and then histogram the pulse amplitudes, widths, rise times, areas or other parameters of interest. A single trigger with 2 million data points will have as much information as 4,000 triggers of 500 points each. In just a few triggers, you get enough data to see the nature of the irregularities.

With this method, there is measurable information about the number of occurrences of each type of wrongly shaped clock pulse. **Figure 2** shows what the results might look like if the clock synchronizer occasionally chopped a clock pulse. There are rare pulses which are very short followed by a second pulse with a glitch.

The histogram of 993 sweeps quickly acquires 7,046 pulses. The lowest width is 7.4 ns, the average is 50 ns and the highest is 56.2 ns. Note that the vertical scale is logarithmic. There are 12 bad clocks with 7.4-ns width, 12 with 56.2-ns width and 7,012 with the normal 50-ns width. You can measure the ratio of good to bad pulses, make an adjustment and see if it has a measurable effect, or use the data in the histogram to set up a special trigger based on width to troubleshoot the problem.
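The histogram method can be sketched as follows. This is a hypothetical illustration: the waveform is synthetic (on a real scope, the long acquisition record supplies the samples), and the fault injected here is a single runt pulse rather than the chopped-clock case in Figure 2.

```python
import numpy as np

DT = 2e-9  # 2 ns per sample (500 MS/s), as in the text

def pulse_widths(samples, threshold=0.5):
    """Width in seconds of every complete high pulse in the record."""
    high = (samples > threshold).astype(int)
    rises = np.flatnonzero(np.diff(high) == 1) + 1   # low -> high edges
    falls = np.flatnonzero(np.diff(high) == -1) + 1  # high -> low edges
    n = min(len(rises), len(falls))                  # drop an unfinished pulse
    return (falls[:n] - rises[:n]) * DT

# 100 nominal clock pulses: 50 ns low, then 50 ns high (25 samples each)
wave = np.tile(np.r_[np.zeros(25), np.ones(25)], 100)
wave[30:50] = 0.0  # inject one rare fault: chop the first pulse to 10 ns

widths = pulse_widths(wave)
values, counts = np.unique(np.round(widths / 1e-9), return_counts=True)
print(dict(zip(values.tolist(), counts.tolist())))  # {10.0: 1, 50.0: 98}
```

A single pass over the record measures every pulse, so the rare 10-ns runt shows up immediately in the width histogram alongside the normal 50-ns population.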

**Frequency Domain Measurements**

One of the most common options in DSOs is fast Fourier transform (FFT) capability. Since the FFT in a DSO comes from a set of discrete points, with sampling period Δt, the information in the frequency domain is also a discrete set of points, whose spacing is Δf.

The resolution in the frequency domain is determined by the frequency span being measured and the number of points within that span. Nyquist’s theorem determines the range of frequencies that can be measured. They range from DC to one-half the sampling rate at which the data was captured.

An FFT of an array of N time domain data points produces N/2 frequency domain points within the range of frequencies between DC and the Nyquist frequency. So the frequency resolution of the FFT is

Δf = (1/2 × sampling rate) / (1/2 × number of points input to the FFT algorithm)

The two one-halves cancel, giving a resolution equal to the sampling rate divided by the number of points input to the FFT. Obviously, it is important to capture the data at a high sampling rate.
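The resolution formula can be checked numerically: numpy's FFT bin spacing matches sampling rate divided by the number of points. The values below follow the text's 500 MS/s example.

```python
import numpy as np

fs = 500e6                      # sampling rate, 500 MS/s
n = 1_000_000                   # points input to the FFT
df = fs / n                     # 500 Hz resolution from the formula

# numpy reports the same bin spacing for an n-point real FFT:
bins = np.fft.rfftfreq(n, d=1/fs)
print(df, bins[1] - bins[0])    # both ~500 Hz
```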

Long data acquisition memory plays an important role here since it allows the scope to have a fast sampling rate for a longer period of time. More data points also can be put into the FFT algorithm if we capture more points in a long memory scope.

But just capturing the points isn’t enough. The DSO needs the processing power to actually compute the FFT on a long data array. For example, one recently introduced DSO captures up to 500,000 points of data on a signal, but the FFT processing in the scope is limited to the first 10,000 points captured. This loses a factor of 50 in the resolution of the FFT compared to a scope which can process all 500,000 points. This is a tremendous loss in frequency information. Why would a vendor do this? The answer lies in the next important facet of memory in a DSO.

An FFT calculation is complex and may require 10 times as much RAM in the processing memory as the number of points input to the FFT algorithm. To perform an FFT on a 500,000-point waveform may require 5 MB of RAM. You also need a fast, powerful processor and a numerical coprocessor to handle long data arrays. Both the RAM and the processor/coprocessor add considerable cost to a scope.

**Figure 3** shows the difference made by this trade-off between price and performance. In Figure 3a, an FFT is performed on the first 10,000 points of a waveform. In Figure 3b, 1,000,000 samples are captured on the same waveform and an FFT is performed on the entire record. Both sets of data are captured at a 500 MS/s sampling rate so the highest frequency component measured is 250 MHz. The difference in frequency resolution is a factor of 100, 50 kHz vs 500 Hz.

The frequency peaks on the bottom of Figure 3a are very broad. In fact, there is only a single point on each of them. In Figure 3b, the peaks are seen more accurately as being very narrow. The first peak is really two peaks at closely spaced frequencies. Those two peaks could not be resolved by the 10 kpoint FFT.
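The trade-off in Figure 3 can be reproduced in a hypothetical sketch: two tones 2 kHz apart (the frequencies here are invented, not taken from the figure) merge into one peak in a 10-kpoint FFT with 50-kHz bins, but separate cleanly in a 1-Mpoint FFT with 500-Hz bins.

```python
import numpy as np

fs = 500e6                                  # 500 MS/s, as in the text
t = np.arange(1_000_000) / fs
sig = np.sin(2*np.pi*10.000e6*t) + np.sin(2*np.pi*10.002e6*t)

def resolved_peaks(x, height=0.1):
    """Count local maxima of the normalized FFT magnitude above `height`."""
    mag = np.abs(np.fft.rfft(x)) / len(x)
    inner = mag[1:-1]
    peaks = (inner > height) & (inner > mag[:-2]) & (inner > mag[2:])
    return int(peaks.sum())

print(resolved_peaks(sig[:10_000]))  # 1: tones merge at 50 kHz resolution
print(resolved_peaks(sig))           # 2: tones resolved at 500 Hz resolution
```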

In buying $5,000 computers, we have become experts about the amount of RAM, how much cache memory there is for the processor and the amount of local RAM on the video board. Those memories are very important to the power of the computer.

The same is true with DSOs, where we need to know about data acquisition memory, processing RAM, display memory and storage memory for both waveforms and front-panel setups. The power of the scope is in its capability to capture, view, measure, analyze and archive signals, all of which are tied to its memory.

**About the Author**

Dr. Michael Lauterbach is Director-Product Management for LeCroy Corp. He holds a Ph.D. in Physics from Yale University and undergraduate degrees in Physics and Mathematics from Carleton College. LeCroy Corp., 700 Chestnut Ridge Rd., Chestnut Ridge, NY 10977-6499, (800) 553-2769.

Copyright 1995 Nelson Publishing Inc.

July 1995