Test & Measurement


Why calibrate test equipment?

3 May 2006

You are serious about your electrical test instruments. You buy top brands, and you expect them to be accurate. You know some people send their digital instruments to a metrology lab for calibration, and you wonder why.

After all, these are all electronic - there is no meter movement to go out of balance. What do those calibration folks do, anyhow - just change the battery?

These are valid concerns, especially since you cannot use your instrument while it is out for calibration. But, let us consider some other valid concerns. For example, what if an event rendered your instrument less accurate, or maybe even unsafe? What if you are working with tight tolerances and accurate measurement is key to proper operation of expensive processes or safety systems? What if you are trending data for maintenance purposes, and two meters used for the same measurement significantly disagree?

What is calibration?

Many people do a field comparison check of two meters, and call them 'calibrated' if they give the same reading. This is not calibration. It is simply a field check. It can show you if there is a problem, but it cannot show you which meter is right. If both meters are out of calibration by the same amount and in the same direction, it will not show you anything. Nor will it show you any trending - you will not know your instrument is headed for an 'out of cal' condition.

For an effective calibration, the calibration standard must be more accurate than the instrument under test. Most of us have a microwave oven or other appliance that displays the time in hours and minutes. Most of us live in places where we change the clocks at least twice a year, plus again after a power outage. When you set the time on that appliance, what do you use as your reference timepiece? Do you use a clock that displays seconds? You probably set the time on the 'digits-challenged' appliance when the reference clock is at the 'top' of a minute (eg, zero seconds). A metrology lab follows the same philosophy. They see how closely your 'whole minutes' track the correct number of seconds. And they do this at multiple points on the measurement scales.

Calibration typically requires a standard with at least 10 times the accuracy of the instrument under test. Otherwise, you are calibrating within overlapping tolerances, and the tolerance band of your standard can render an 'in cal' instrument 'out of cal' or vice versa. Let us look at how that works.

Suppose two instruments, A and B, are each specified to measure 100 V to within 1%, so any reading between 99 V and 101 V is in tolerance. At 100 V input, A reads 99,1 V and B reads 100,9 V - both in tolerance. But if you use B as your standard, A will appear to be out of tolerance. However, if B is accurate to 0,1%, then the most B will read at 100 V is 100,1 V. Compare A against that and A is in tolerance. You can also see that A is at the low end of the tolerance range. Adjusting A to bring that reading up should keep A from giving a false reading as it experiences normal drift between calibrations.
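The arithmetic above can be sketched in a few lines of code - a minimal illustration only, using the 1% and 0,1% figures from the example (the function name is invented here):

```python
def in_tolerance(reading, nominal, tolerance_pct):
    """True if a reading falls within +/- tolerance_pct of the nominal value."""
    return abs(reading - nominal) <= nominal * tolerance_pct / 100.0

# Both A and B are specified to 1% at a 100 V input.
reading_a, reading_b = 99.1, 100.9
print(in_tolerance(reading_a, 100.0, 1.0))  # True: 99,1 V is inside 99-101 V
print(in_tolerance(reading_b, 100.0, 1.0))  # True: 100,9 V is inside 99-101 V

# Using B itself as the 'standard' makes A look out of tolerance...
print(in_tolerance(reading_a, reading_b, 1.0))  # False: a 1,8 V gap exceeds 1%

# ...but against a 0,1% standard reading at most 100,1 V, A is in tolerance.
print(in_tolerance(reading_a, 100.1, 1.0))  # True
```

This is the practical point of the 10:1 rule - with a sufficiently accurate standard, the standard's own uncertainty no longer decides whether the instrument under test passes or fails.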

Calibration, in its purest sense, is the comparison of an instrument to a known standard. Proper calibration involves use of a NIST-traceable standard - one that has paperwork showing it compares correctly to a chain of standards going back to a master standard maintained by the National Institute of Standards and Technology.

In practice, calibration includes correction. Usually, when you send an instrument for calibration, you authorise repair to bring the instrument back into calibration if it was 'out of cal.' You will get a report showing how far out of calibration the instrument was before, and how far out it is after. In the minutes and seconds scenario, you would find the calibration error required a correction to keep the device 'dead on,' but the error was well within the tolerances required for the measurements you made since the last calibration.

If the report shows gross calibration errors, you may need to go back to the work you did with that instrument and take new measurements until no errors are evident. You would start with the latest measurements and work your way toward the earliest ones. In nuclear safety-related work, you would have to redo all the measurements made since the previous calibration.
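Working backwards through past measurements, as described above, can be organised as a simple loop. This is only a sketch under assumed conditions: records are stored newest-first, a freshly calibrated instrument is available for re-measurement, and the record layout is invented for illustration.

```python
def reverify(records, remeasure, tolerance_pct=1.0):
    """Re-check stored measurements, newest first, until no error is evident.

    records:    list of (point_id, stored_reading) tuples, newest first
    remeasure:  function taking a point_id and returning a fresh reading
                from a freshly calibrated instrument
    Returns the point_ids whose stored readings appear to be in error.
    """
    suspect = []
    for point_id, stored in records:
        fresh = remeasure(point_id)
        error_pct = abs(stored - fresh) / fresh * 100.0
        if error_pct <= tolerance_pct:
            break  # this measurement agrees; earlier ones are trusted
        suspect.append(point_id)
    return suspect

# Example: the two newest stored readings came from the faulty meter.
records = [("bus-3", 470.0), ("bus-2", 468.5), ("bus-1", 480.2)]
fresh_values = {"bus-3": 480.1, "bus-2": 479.8, "bus-1": 480.0}
print(reverify(records, fresh_values.get))  # ['bus-3', 'bus-2']
```

Note that this stop-at-first-agreement logic matches the general case only; for nuclear safety-related work, as the article notes, every measurement since the previous calibration must be redone regardless.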

Causes of calibration problems

What knocks a digital instrument 'out of cal?' First, the major components of test instruments (eg, voltage references, input dividers, current shunts) can simply shift over time. This shifting is minor and usually harmless if you keep a good calibration schedule, and this shifting is typically what calibration finds and corrects.

But, suppose you drop a current clamp - hard. How do you know that clamp will accurately measure, now? You do not. It may well have gross calibration errors. Similarly, exposing a DMM to an overload can throw it off. Some people think this has little effect, because the inputs are fused or breaker-protected. But, those protection devices may not trip on a transient. Also, a large enough voltage input can jump across the input protection device entirely. This is far less likely with higher quality DMMs, which is one reason they are more cost-effective than the less expensive imports.

Calibration frequency

The question is not whether to calibrate - we can see that is a given. The question is when to calibrate. There is no 'one size fits all' answer. Consider these calibration frequencies:

* Manufacturer-recommended calibration interval: Manufacturers' specifications will indicate how often to calibrate their tools, but critical measurements may require different intervals.

* Before a major critical measuring project: Suppose you are taking a plant down for testing that requires highly accurate measurements. Decide which instruments you will use for that testing. Send them out for calibration, then 'lock them down' in storage so they are unused before that test.

* After a major critical measuring project: If you reserved calibrated test instruments for a particular testing operation, send that same equipment for calibration after the testing. When the calibration results come back, you will know whether you can consider that testing complete and reliable.

* After an event: If your instrument took a hit - something knocked out the internal overload or the unit absorbed a particularly sharp impact - send it out for calibration and have the safety integrity checked, as well.

* Per requirements: Some measurement jobs require calibrated, certified test equipment - regardless of the project size. Note that this requirement may not be explicitly stated but simply expected - review the specs before the test.

* Monthly, quarterly, or semi-annually: If you do mostly critical measurements and do them often, a shorter time span between calibrations means less chance of questionable test results.

* Annually: If you do a mix of critical and non-critical measurements, annual calibration tends to strike the right balance between prudence and cost.

* Every two years: If you seldom do critical measurements and have not exposed your meter to an event, a longer calibration interval can be cost-effective.

* Never: If your work requires just gross voltage checks (eg, 'Yep, that is 480 V'), calibration seems like overkill. But what if your instrument is exposed to an event? Calibration allows you to use the instrument with confidence.

One final note

While this article focuses on calibrating DMMs, the same reasoning applies to your other handheld test tools, including process calibrators. Calibration is not a matter of 'fine-tuning' your test instruments. Rather, it ensures you can safely and reliably use instruments to get the accurate test results you need. It is a form of quality assurance. You know the value of testing electrical equipment, or you would not have test instrumentation to begin with. Just as electrical equipment needs testing, so do your test instruments.


