
When is a Volt a Volt?

In the 1930s, an isolated coal mining town in Pennsylvania had an AM radio station. Every day at 6 a.m., 2 p.m., and 10 p.m., the guard on duty at the mine would look at his high-quality pocket watch to verify the time, then pull the chain to blow the whistle, notifying everyone in town that it was time for a shift change.

Every day from 6 a.m. to 10 p.m., the radio station aired music, read the latest news from the daily paper, and presented commercials. There were no network connections, so everything originated in the little studio. Every half hour, the announcer identified the station call letters, looked at his expensive pocket watch, and gave the time.

A visitor arrived from the big city one day and noted that, if his watch was correct, the company whistle was 10 minutes late. Not wanting to judge too quickly, he found a radio, tuned to the local station, and learned that the announced time agreed with that of the mine.

Still not satisfied, he went to the guard and asked how he set his watch. “I listen to the radio station every afternoon to get the time.”

At the station, he made the same inquiry. “Oh, I set my watch every morning when the coal-mine whistle blows.” Oops! The guard’s and radio station announcer’s watches were in good condition. However, both were 10 minutes late because neither used a traceable standard.

This story, often repeated but never verified, carries a message for test engineers. A timing system or test instrument can be the best on the market, but unless it is calibrated periodically to a standard traceable to an impeccable source, it can give the wrong answers.

Calibration Before Delivery

DMM manufacturers are well aware that, like the guard’s and station announcer’s expensive pocket watches, their high-performance instruments are useless unless calibrated to a traceable standard. Ideally, we would like all instruments to be calibrated frequently by the U.S. government’s National Institute of Standards and Technology (NIST) in Gaithersburg, MD. This would assure us that when we get a reading of 12.9973 VDC, it really is 12.9973 VDC.

However, since it is impractical to tie up the resources of the institute to calibrate the hundreds of thousands of DMMs in the nation, we must settle for calibration by transfer standards. The validity of these high-resolution, high-quality, stable instruments can be traced to the NIST.

All DMM manufacturers recognize the importance of standards traceability. For example, Agilent Technologies has several standards in its DMM calibration laboratory directly traceable to the NIST. “Our Josephson-junction array,” according to Mark Bailey, voltmeter product manager, “operates near 0 K and emulates the reference volt at the NIST. It has been in continuous operation longer than any other commercial array in the world. We also maintain our resistance standard in a temperature-controlled oil bath.”

There even are ways to minimize the effects of drift during the calibration sequence. For example, because Signametrics manufactures plug-in DMMs, the company calibrates them against transfer standards in a PCI or PXI chassis loop. “Several instruments are mounted in a chassis,” the president, Tee Sheffer, said, “and the host computer commands the calibrator and all instruments simultaneously. It reads the results and issues a report to the operator for adjustments as necessary. In the case of capacitance and inductance measurements, we use golden transfer components.”
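The chassis-loop idea can be sketched in a few lines: one calibrator drives every DMM in the chassis at once, and the host compares each reading against the applied value before reporting which slots need adjustment. This is an illustrative sketch only; the function names, slot numbers, and error limits below are invented for the example and do not reflect Signametrics’ actual software interface.

```python
# Hypothetical sketch of a chassis-loop calibration pass: the host applies
# each stimulus once, reads all DMMs simultaneously, and reports deviations.

def chassis_calibration_pass(set_calibrator, read_dmms, points, limit):
    """Apply each stimulus, read every DMM, and flag out-of-limit slots."""
    report = []
    for nominal in points:
        set_calibrator(nominal)              # one source drives the whole loop
        for slot, reading in read_dmms().items():
            error = reading - nominal
            report.append((slot, nominal, error, abs(error) <= limit))
    return report

# Simulated loop: three DMMs, one with a gain error large enough to fail.
gains = {1: 1.000000, 2: 1.000002, 3: 1.000150}
stimulus = []

def set_calibrator(volts):
    stimulus.append(volts)

def read_dmms():
    return {slot: gains[slot] * stimulus[-1] for slot in gains}

report = chassis_calibration_pass(set_calibrator, read_dmms, [1.0, 10.0], limit=1e-3)
for slot, nominal, error, ok in report:
    print(f"slot {slot}: {nominal} V applied, error {error:+.6f} V, "
          f"{'PASS' if ok else 'ADJUST'}")
```

Reading all instruments against the same stimulus at the same moment is what limits drift: the calibrator has no time to wander between the reference reading and each unit under test.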

Calibration After Delivery

How can you get periodic assurance that your DMMs still are as accurate as when they were built and calibrated? Some users, especially those in small organizations, return their DMMs to the original manufacturer or a third party for calibration on a regular schedule. Others perform their own calibrations in a standards laboratory and use carefully protected instruments as golden transfer standards.

Golden standards are kept in the calibration laboratory and maintained in a controlled environment. Only specially qualified personnel can use or even touch them since their integrity must be guaranteed to the next higher level of calibration such as NIST.

Since the cost of transfer standards and their transportation to the NIST is high, the number of such instruments and frequency of transfer is kept as low as practical, and a high-quality DMM is used to transfer traceability to other DMMs in the facility using intercomparison techniques. “Our 8½-digit 1281 often is used for such transfer,” according to Peter Dack, business development manager at Fluke. “It has two inputs, so the calibrated standard can be compared with the unknown without risking possible errors in reconnection. We call this the Ratio Mode.”
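The arithmetic behind a two-input ratio transfer is worth spelling out: because the standard and the unknown pass through the same signal path, the transfer DMM’s own gain error appears in both readings and cancels in the ratio. The sketch below illustrates that cancellation with invented numbers; it is not Fluke’s implementation.

```python
# Illustrative ratio-mode transfer: assign a value to the unknown from the
# ratio of two readings taken through the same path, times the certified
# value of the standard. The DMM's gain error cancels in the ratio.

def transfer_by_ratio(reading_unknown, reading_standard, standard_value):
    """Value assigned to the unknown = (ratio of readings) x standard value."""
    return (reading_unknown / reading_standard) * standard_value

gain = 1.000050            # assume the transfer DMM reads 50 ppm high
standard_value = 10.000000  # certified value of the standard, volts
true_unknown = 9.999700     # the value we are trying to assign

assigned = transfer_by_ratio(gain * true_unknown,
                             gain * standard_value,
                             standard_value)
print(f"assigned value: {assigned:.6f} V")   # gain error has cancelled
```

Leaving both inputs connected, as the Ratio Mode does, also removes the reconnection risk the paragraph mentions: no leads are moved between the reference reading and the unknown reading.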

The Fluke 1281 also incorporates two techniques to remove sources of error in resistance measurements. In the True Ohms mode, it switches the measurement current on and off periodically as it compensates for thermal errors, and in the low-current resistance mode, it divides the current by a factor of 10 to reduce errors in measurement of components that are susceptible to self-heating.
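The offset-compensation idea behind a True Ohms-style measurement reduces to simple arithmetic: thermal EMFs in the leads add the same stray voltage whether the test current is on or off, so reading both states and subtracting removes the error. A minimal sketch, with invented values:

```python
# Offset-compensated resistance: R = (V_on - V_off) / I.
# The thermal-EMF term appears in both readings and cancels in the difference.

def offset_compensated_ohms(v_current_on, v_current_off, test_current):
    """Resistance from two voltage readings, current on and current off."""
    return (v_current_on - v_current_off) / test_current

R_true = 100.0       # ohms, the value being measured
I_test = 1e-3        # 1-mA test current
v_thermal = 5e-6     # 5-uV thermal EMF in the leads (illustrative)

v_on = R_true * I_test + v_thermal   # reading with current applied
v_off = v_thermal                    # reading with current switched off

print(offset_compensated_ohms(v_on, v_off, I_test))  # thermal error removed
print(v_on / I_test)                                 # naive result, 5 mOhm high
```

The same numbers show why the low-current mode matters for self-heating parts: dividing the test current by 10 cuts the dissipated power, I²R, by a factor of 100.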

Agilent Technologies recommends a 90-day calibration interval for critical applications and at least a one-year cycle on every instrument. The company suggests using Fluke’s MET/CAL® Software with the Agilent 6½-digit 34401 or 8½-digit 3458A DMM. The company policy and the recommendation to the user: “You must adjust.” Even if an instrument is within limits, it must be adjusted to the center of the band.

Fluke recommends using one of its precision multifunction calibrators, such as the 5720 or the 4808. Each can be used manually or automated with the MET/CAL Calibration Management Software.

National Instruments provides accuracy specifications for 24-hour, 90-day, and one-year operation. For in-house calibration, the company’s Calibration Executive Software automates the process using the Fluke 5720 Precision Source. NI also suggests that a DMM be returned after each year of use; returned units are sent to Verizon-ERS, an NI global partner, for precision calibration to a traceable standard.

Like other manufacturers, Signametrics provides detailed calibration procedures with each DMM. The company adds a reminder: if the instrument is used for a limited subset of functions, you may choose to restrict the calibration to those functions and label the calibration records accordingly.

Keithley Instruments has developed a way to minimize the number of precision sources needed for voltage and resistance calibration. “Using our 28-bit analog-to-digital converter, linear front-end amplifier, and ladder calibration scheme, you can calibrate the 0.1-VDC to 1,000-VDC ranges of our 2750 with just two sources,” noted Chris Miller, the DMM product manager. “In a similar manner, we use just four precision resistors to calibrate the 1-Ω to 100-MΩ resistance ranges.”
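The two-source step that such a scheme rests on is two-point calibration of a linear range: with a sufficiently linear front end, one zero-level source and one full-scale source determine a range’s offset and gain, and the ladder then carries those points to the other ranges. The sketch below shows only the two-point arithmetic, with invented error values; it is not Keithley’s actual procedure.

```python
# Two-point calibration of a linear range: a zero reading fixes the offset,
# a full-scale reading fixes the gain, and every other reading on the range
# is corrected from those two constants.

def two_point_cal(raw_zero, raw_full, full_scale_value):
    """Return (offset, gain) that map raw readings to corrected values."""
    gain = full_scale_value / (raw_full - raw_zero)
    return raw_zero, gain

def corrected(raw_reading, offset, gain):
    return (raw_reading - offset) * gain

# Simulated range with a small offset and gain error.
true_offset, true_gain = 0.002, 1.0004
raw = lambda volts: volts * true_gain + true_offset

offset, gain = two_point_cal(raw(0.0), raw(10.0), 10.0)
print(corrected(raw(5.0), offset, gain))   # mid-scale reading, now corrected
```

Because only two points per range are needed, the source count stays small; linearity of the front end is what justifies interpolating everything in between.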

This Year’s DMMs

New products with especially useful features are offered by DMM manufacturers this year. The characteristics of each are summarized in the DMMs comparison chart that accompanies this article.


Published by EE-Evaluation Engineering
All contents © 2001 Nelson Publishing Inc.
No reprint, distribution, or reuse in any medium is permitted
without the express written consent of the publisher.

August 2001

