Calibration with confidence -
the assurance of temperature accuracy
Taylor Instrument / Consumer-Industrial
Products / Sybron Corporation
Arden, North Carolina 28704
Highly sensitive temperature
devices, particularly those with multi-digit electronic displays, give the
illusion of accuracy. However, knowledge of true temperature -- the real
concern of measurement accuracy -- is only indirectly related to sensitivity
or precision. To assure temperature accuracy, it is necessary to maintain
a temperature reference standards capability. This must include equipment
and procedures that permit calibration of operating devices with temperature
standards in a way that insures minimum uncertainty. For most requirements
the creation and maintenance of such capability is neither expensive nor
difficult, but lack of understanding often results in expense and inaccuracy.
Equipment and procedures are discussed that permit calibration with confidence
at three levels of accuracy: uncertainty levels of +/- 1.0 degrees Celsius,
+/- 0.1 degrees Celsius, and +/- 0.01 degrees Celsius, respectively.
Subject index: Calibration
I. INTRODUCTION
This paper outlines temperature
instrument calibration fundamentals that apply to "daily use" conditions
in laboratory and industry. In style, language, and content, therefore,
it differs from the majority of papers on temperature measurement.
Most technical papers are
written to advance knowledge in a given field, and are written primarily
to benefit the few working actively in, and most familiar with, that field.
In contrast, this paper is
written to restore to general understanding a knowledge of long-standing
calibration fundamentals that are familiar to experienced professionals
in the field, but are not generally understood by many who have a "need
to know."
The task of assuring accuracy
in temperature measurement is critically important. Safety or health would
be compromised, equipment damaged or product wasted in many processes if
the temperature were incorrect. And no matter how precise the measurement
or careful the operator, if the device is not calibrated correctly, the
result is wrong.
II. DEFINITION OF TERMS
The assurance of temperature
accuracy begins with an understanding of four key concepts -- "Accuracy,"
"Precision," "Reference" and "Standards" -- and the relationships among them.
"Precision" in temperature measurement
has to do with detecting very small changes, and also with the ability
to repeat measurements again and again with similar results. It may or
may not imply knowledge of a correct temperature.
"Accuracy," on the other hand,
refers to a knowledge of true temperature, and implies confidence in the
similarity of measurements in one location and another. For example, a
heat sterilization temperature may be determined in the laboratory, and
then monitored in the manufacturing area. Instruments used in either or
both locations may indicate temperature to a small fraction of a degree.
If either or both, however, are not correctly calibrated or are subject
to drift, there may be failure of the process because of inaccuracy. Precision
often brings a false sense of accuracy.
"Standards" of temperature are the key to assurance of accuracy. Two types are important
-- primary standards and secondary, or reference, standards. The most widely
accepted primary standards are those used to define the International Practical
Temperature Scale -- IPTS-68. (ref.1)
"Reference" is a term used in
two ways in temperature measurement. First, it is used to describe the
process of comparing the reading of one instrument with another -- most
commonly the indication of an instrument being calibrated with the "known"
temperature of a primary standard material or thermometer. Second, it is
used as a term describing a thermometer itself -- a "master reference thermometer"
or "secondary reference thermometer." This is a high-accuracy instrument
-- commonly a specially-made mercury-in-glass thermometer -- used to calibrate
temperature devices under "daily use" conditions in laboratory and industry.
Either way, the term "reference" refers to the comparison process
by which correct calibration is assured.
III. BASIC APPROACH
Most temperature measurement
involves use of a measuring instrument of some type, usually a thermometer.
Assurance of accuracy of that instrument involves basically a two-step process.
Details of the methods and equipment
needed to accomplish these two procedures with confidence depend on the
level of uncertainty required. We will consider in this paper the equipment
and procedures needed to calibrate to three uncertainty levels: an uncertainty
(or maximum expected error) of +/- 1 degree Celsius, of +/- 0.1 degree
Celsius, and of +/- 0.01 degree Celsius. However, calibration at all three
levels involves the same basic two-step approach:
(1) Compare -- under conditions as
close as possible to actual operation -- the indication of an operating instrument
with a "working standard," a master reference thermometer whose accuracy
is known with very small uncertainty.
(2) Periodically check the accuracy
of the master reference thermometer in accordance with the manufacturer's
instructions -- either by reference to a primary standard of temperature
such as an ice bath, or by having the instrument re-calibrated at NIST
or a respected testing laboratory.
IV. CALIBRATION FUNDAMENTALS
There are four fundamental
considerations that are most important in assuring good calibration procedure:
Insure that conditions of installation
of the sensing element approximate actual use conditions as closely as
possible. Degree of immersion, ambient temperature, shielding and housing
(protective shield or other installation accessory) all may affect the
heat flux around the sensing element and thus influence its calibration.
Much calibration work is done by using a rapidly-agitated liquid bath as
an approximation of actual use conditions. Such baths are the least expensive
way to provide a stable, uniform, easily regulated temperature transfer
medium. They may or may not closely simulate actual sensing element heat
flux conditions. For instance -- consider a sensing element in
a metal-shielded housing, with a substantial heat flux through the housing
to cooler surroundings; a rapidly-agitated water bath may easily supply
that heat flux, and hence produce a higher-than-normal reading. On the other
hand, oil has much poorer heat transfer capability than water or steam
due to its insulating properties, and hence may supply less heat to maintain
the heat flux, resulting in a lower-than-normal reading.
Insure that the equipment used
for calibration, and the surroundings and procedures, contribute the smallest
error that is possible. This usually includes having a relatively large
mass of liquid medium, agitated vigorously to insure good heat transfer
and minimum temperature gradient; insulation to aid in temperature stability;
and a sensitive proportioning temperature control system to minimize fluctuation.
Depending on temperature range and conditions, the equipment need not be
sophisticated or expensive. For example -- for calibration between ambient
and, say, 140 degrees F, a large, insulated food/beverage container provided
with a kitchen food mixer and simple paddle for agitation, and with the
temperature controlled by manually opening and closing a hot water faucet
-- can become, in the hands of a skilled operator, a precision calibration
bath useful for calibration at uncertainty levels less than 0.1 degrees C.
The master reference thermometer
must have an accuracy such that its level of uncertainty is a small fraction
of the allowable calibration error desired -- preferably on the order of
one-tenth to two-tenths. This means that for calibration of thermometers or temperature
control devices to within a maximum expected error of +/- 1 degree C, the
reference thermometer should have a maximum error of no more than a few
tenths of one degree; for the calibration error to be less than +/- 0.1
degree C, the reference thermometer must have a maximum error of no more
than a few hundredths of a degree, and so forth. Equally important is the
long-term stability of the master reference standard. It must be able to
be used with confidence for a practical period of time between its own
calibration checks, and with reasonable certainty that it is not subject
to short-term variations in calibration.
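This one-tenth-to-two-tenths rule reduces to a one-line calculation. The sketch below is illustrative only; the function name and the default ratio are assumptions, not from the paper:

```python
# Sketch of the "one-tenth to two-tenths" rule for choosing a master
# reference thermometer.  The function name and default ratio are
# illustrative assumptions, not from the paper.
def required_reference_uncertainty(target_uncertainty_c, ratio=0.2):
    """Largest acceptable reference-thermometer uncertainty (deg C)
    for a given target calibration uncertainty (deg C)."""
    return target_uncertainty_c * ratio

# For a +/- 1 deg C calibration, the reference must be good to a few
# tenths; for +/- 0.1 deg C, to a few hundredths.
```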
The above three fundamental considerations
are all involved with the process of comparing a temperature sensing instrument
to a master reference standard thermometer. The fourth has to do with the second step,
that of insuring that the master reference instrument is itself continuing
to be accurate. This assurance of the accuracy of the standard thermometer
is again done by comparison. In this instance, however, the comparison
is usually done by referencing its indications to a primary standard or
near equivalent. The most commonly-used of these are the triple point of
water, or for most laboratory and industrial use, its near equivalent,
the ice point. It is commonly understood that an ice point may have uncertainties
on the order of 0.01 deg C. However, James L. Thomas of the National Bureau
of Standards performed an exhaustive test in 1939 which indicated that, with care,
the ice point could be realized with an uncertainty little greater than that
of the triple point of water. (ref 2) Moreover, an ice bath is so much easier to
prepare and use than any of the standard fixed points that it has become
the common choice for reference standard calibration. However, as with
any procedure in high-accuracy work, care must be taken. It is therefore
appropriate to describe procedures that will insure minimum error.
V. REALIZATION OF ICE POINT
The basic steps required to
insure ice-point accuracy are:
Insuring water purity
For most purposes, ice made
from ordinary culinary water is sufficient. However, since most dissolved
minerals affect the freezing point, it is common to use only ice and water
that has been demineralized. For an ice point with less than 0.01 deg C
uncertainty, only distilled water, and ice made from distilled water should
be used and the container should be of carefully-cleaned glass or stainless
steel. As little as 12 ppm of some salts can cause a 0.01 deg C reduction
in the ice point.
Insuring minimum heat flux
To insure that the sensor
being tested is unaffected by ambient conditions, it should be placed in
the center of a relatively large mass of ice and water (normally two liters
or more), well away from the walls of the container, and the container
should be insulated to minimize melting of ice. The sensing element being
calibrated should be immersed adequately to minimize heat transfer through
its housing (remembering the rule that calibration conditions should approximate actual use conditions as closely as possible).
To guard against temperature
rise due to insufficient ice, and to insure against poor heat transfer
due to air in the bath, the following procedure is recommended:
-- Fill the container with crushed or chipped ice.
-- Fill the container with water to an overflow condition.
-- Add more ice until ice is tightly packed to the bottom of the container,
allowing water to overflow.
-- Insert the sensor to be calibrated and allow the temperature to reach
equilibrium (normally 5 minutes or more).
-- If the test continues more than a few minutes, add more ice periodically,
as before, insuring that ice is packed tightly to the bottom of the container
each time. The goal is to insure that at all times the sensor is in contact
with an ice/water mixture over its entire surface.
VI. CALIBRATION PROCEDURES
Application of fundamentals discussed
above to the calibration of specific temperature sensing elements will
vary somewhat, depending on the level of accuracy required. It is uneconomical
and unnecessary to take the time and care needed for extremely precise
calibration, when not required by the needs of the process being monitored,
or when the sensor has substantial built-in inaccuracy. The important consideration
is the amount of inaccuracy (or, more properly, the level of uncertainty)
that is permissible. For convenience, we will discuss procedures for three
levels of uncertainty: +/- 1.0 deg C, +/- 0.1 deg C, and +/- 0.01 deg C.
A. Calibration within +/- 1.0 deg C
For many uses where an
uncertainty of the order of +/- 1 deg C is acceptable, thermometers and
controllers are purchased having specifications that claim inaccuracies
no greater than that amount. The instruments are then used for extended
periods of time without calibration -- often, in fact, until breakage or
major malfunction occurs. If, in fact, an accuracy of +/- 1 deg C is important,
this is a dangerous practice, since few instruments will remain in calibration
for extended periods unless specifically made for long-term stability.
Even many glass thermometers, generally accepted as "correct unless broken,"
are no longer regularly made with the expensive glass annealing and aging
steps that insure the necessary stability.
The simplest calibration procedure
for such instruments is to make a periodic ice point check, if 0 deg C
is included in the instrument range, and/or to compare desired readings
with that of a high-quality mercury-in-glass thermometer such as the ASTM
precision series, ASTM 62C through 70C (or F). These reference thermometers
have scale graduations, in the moderate ranges, of 0.1 deg C or
0.2 deg F, and hence are within the accuracy range (an order of magnitude
more accurate than the instrument to be calibrated) needed for such service.
B. Calibration within +/- 0.1 deg C
In order to insure that
routine temperature measurements with operating instruments are accurate
to within +/- 0.5 deg C to +/- 1.0 deg C, it is necessary for the instrument
itself to be calibrated to an uncertainty of no more than +/- 0.1 deg C.
Since this is the accuracy range most commonly needed in industrial use,
the calibration procedures will be described in more detail than those
for the other two levels.
Equipment: Care must be
taken in selecting and using equipment for calibration at this level of
uncertainty, since the reference thermometer, temperature controller and
other items must introduce errors of no more than a few hundredths of a
degree. The following items are required:
Reference Standard Thermometer:
One of two types of instrument is commonly used: a high-accuracy mercury/glass
thermometer accompanied by a signed certificate of calibration with corrections
to the nearest 1/5 of a graduation division, or a precision platinum resistance
probe with a high-accuracy indication system, also accompanied by a NIST-traceable
calibration record. Since there is a cost difference of between 10:1 and
50:1 between the two instruments, the mercury/glass thermometer is most
commonly chosen.
Ice bath: The same ice
bath can be used as described above, as long as care is taken to avoid
contamination of the water or ice. One additional piece of equipment is
needed, a 10X microscope and stand, to allow reading of the mercury/glass
thermometer without parallax and to permit careful interpolation to at
least the nearest 1/5 of a graduation division.
Temperature bath: There
are three important criteria in good bath construction: first, that the
heating/cooling elements be isolated from the test area; second, that the
bath be well insulated to minimize heat transfer load and controller stabilization
needs; and third, that agitation be adequate. As a rule of thumb, on all baths
except those at low temperatures where the medium is highly viscous, adequate
agitation is insured when the liquid surface has the appearance of water
at a "rolling boil." Also, in order to insure stability, most
well-designed baths have a minimum exposed surface area. If this is not
possible, a well-insulated cover should be made to cover all but the minimum
exposed surface area.
Temperature controller:
Since the advent of solid-state electronics, vast improvement in proportioning
controllers has come about. The best for calibration bath purposes have
a visual indicator -- a flickering lamp that indicates control status (off
when temperature is above the control point, on when below, and flickering
intermittently when at the control point). For control temperatures below ambient,
it is common to install a throttle-able refrigeration system for gross
control (continuous operation) and an electric heater with a sensitive controller
to override for fine control. As noted under "A." above, manual control
can also be used if calibration is done infrequently and the cost of an
automatic controller is not justified; in that case, proportioning action
must be simulated by a variable resistance unit that allows varying heat
input rates rather than "on-off" control.
Procedures: The actual calibration
procedure for achievement of less than +/- 0.1 deg C uncertainty is quite
simple -- still following the "BASIC APPROACH" described at the beginning
of this paper. The major effort centers around extra precautions taken
to insure that each error and uncertainty is less than a few hundredths
of a degree, so that the sum of all uncertainties is less than one tenth.
The degree of difficulty in achieving this result depends on the temperature.
It is not difficult -- with proper equipment and training -- in the range from
0 deg C through 90 deg C, more difficult in the ranges 0 deg C to -40 deg
C and 90 deg C to 200 deg C, and extremely difficult outside those ranges
due to equipment limitations.
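Keeping each error within a few hundredths of a degree so that the sum of all uncertainties stays under one tenth amounts to maintaining a simple worst-case budget. A minimal sketch, in which the component names and magnitudes are assumptions for illustration, not values from the paper:

```python
# Hypothetical uncertainty budget for calibration to +/- 0.1 deg C.
# Component names and magnitudes are illustrative, not from the paper.
budget_deg_c = {
    "reference thermometer correction": 0.02,
    "bath temperature gradient": 0.02,
    "bath short-term fluctuation": 0.01,
    "scale reading / interpolation": 0.02,
    "immersion and heat-flux differences": 0.02,
}

# Worst-case (linear) sum of the component uncertainties.
total = sum(budget_deg_c.values())
assert total < 0.1, "trim components until the worst-case sum is under 0.1"
```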
Greatest attention will be
given to procedures using the most dependable and economical components:
(a) a rapidly-agitated liquid bath or baths for temperature comparison,
and (b) a master reference standard thermometer or set of thermometers
-- mercury-in-glass units built to ASTM Precision-series standards
but calibrated and certified accurate to the nearest 1/5 of the smallest
scale division, with certification directly traceable to NIST. Comments
are in two groups corresponding to the two steps of the Basic Approach:
comparison of the thermometer to be tested with the reading of the master
reference thermometer, and calibration check of the master reference thermometer.
(a) Comparison of thermometer
to be calibrated with master reference thermometer in agitated liquid bath.
This involves primarily attention to details that could influence the accuracy
of results, including:
Periodic check of temperature
bath to insure negligible temperature gradients.
Understanding bath temperature
control system and adjusting to insure negligible short-term fluctuation.
Learning technique of taking
readings on slowly rising temperature to minimize effects of mechanical
hysteresis in the mercury/glass thermometer.
Understanding time response and
thermal lag of instruments to be sure that enough stabilization time is allowed.
Learning the technique of interpolation
of mercury/glass thermometer scales so that readings of both instruments
can be made consistently to the nearest 1/5 (and eventually 1/10) of the
smallest graduation division.
Checking skill of technician
and dependability of equipment by making multiple tests and by comparing
one person's results with another with the same equipment.
Assuring consistent immersion
of sensor, and consistent ambient conditions that both simulate actual
operating conditions as exactly as possible (or if not possible, determining
a reliable correction factor to apply to calibration results).
Understanding the relative stability
of each sensor to be calibrated, so that recalibration cycle is timed properly;
and keeping calibration records to support timing decisions.
Taking care to properly apply
calibration corrections from the calibration certification of the master
reference thermometer to the test readings.
Insuring adequate lighting for taking readings.
Taking adequate precautions to
insure against parallax errors in reading both reference and test thermometers.
(b) Calibration check
of master reference thermometer: If the master thermometer is a high-accuracy
mercury-in-glass unit that has been properly made and certified, this calibration
check is primarily a matter of making a periodic ice point check under
carefully-controlled conditions (described below); and recalculating calibration
corrections if necessary. Normally, such a thermometer can be used for
decades without needing to be returned to the factory or laboratory for
recalibration. If a platinum resistance thermometer is used as a master
reference, it should be completely recalibrated (at least at all temperatures
needed for use) once per year or oftener.
The continued use of mercury-in-glass
thermometers for the majority of applications as master reference standards
is due to this unique feature -- the fact that if proper records are maintained
and procedures followed, the accuracy of the thermometer can be known with
confidence for several decades without the need for a full recalibration.
This is true of few other temperature devices. The following explanation
might help understand this unique feature:
a. All measurement
devices are subject to change with time and usage. This includes the resistance
elements of platinum resistance thermometers and bridges as well as the
glass of mercury-in-glass thermometers. The important criterion is to be
able to measure and know the magnitude of these changes.
b. The ideal way to know how
much change has occurred in a device is to compare it periodically with
something that does not change -- a "primary standard."
c. This brings us to a pair
of interesting phenomena that combine to provide the unique capability
of the high-accuracy mercury-in-glass thermometer as a master reference
standard:
(1) Mercury-in-glass thermometers have been made for over a hundred years,
and during that time manufacturing techniques have been developed and tested
that have been time-proved to assure a remarkable capability: that
essentially all measurable change that will affect the temperature indication
will occur in the bulb of the thermometer. It is possible, then, if the
temperature representing the freezing point of water (the ice point) is
included within the thermometer scale, that a careful calibration check
at that one temperature will, in effect, provide a calibration check of
the entire scale, since there will be no relative change of indication
of one part of the scale over another.
(2) It is relatively easy
and inexpensive to realize the temperature of freezing water to a level
of uncertainty of a few thousandths of a degree in any laboratory or office.
Thus, a temperature instrument that needs only an ice point check to assure
its accuracy over its entire temperature scale can be recalibrated indefinitely
by simply making ice point checks and applying any correction needed to
all other temperatures indicated by the instrument.
d. To permit this simple calibration
check, mercury-in-glass thermometers made for use as master reference standards
include the following features:
(1) An auxiliary
"ice point" scale if 0 deg C is not included in the range.
(2) Unusual care in manufacturing -- up
to 75 or more manufacturing steps, including aging and annealing operations,
compared with 20 or fewer steps in making "laboratory" glass thermometers.
(3) An individually-graduated
scale etched into the glass surface. Each individual graduation may be
spaced slightly differently from the adjacent graduation to exactly match
variations in the glass bore diameter.
(4) A signed certificate of
calibration resulting from a retest of the thermometer at a number of points
throughout the scale range, under extremely carefully controlled conditions,
using a reference thermometer kept in calibration through a high-level
recalibration and Measurement Assurance Program as described under "Calibration
within +/- 0.01 deg C" in the main body of this paper. Any corrections
noted on the certificate should then be applied to appropriate readings
of the thermometer, with interpolation between certification points.
For a greater understanding of thermometry practice using mercury-in-glass
thermometers, refer to NBS Monograph 150 (ref 3); for greater
understanding of high-accuracy thermometry using platinum resistance elements,
see NBS Monograph 126. (ref 4)
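Applying certificate corrections with interpolation between certification points can be sketched as simple linear interpolation. The certification points and correction values below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical certificate: corrections (deg C) at certification points.
# A positive correction is added to the observed reading.
cert_points = [0.0, 25.0, 50.0, 75.0, 100.0]        # deg C, as read
corrections = [+0.02, +0.01, -0.01, -0.02, -0.03]   # deg C, from certificate

def corrected(reading):
    """Linearly interpolate the certificate correction and apply it."""
    if not cert_points[0] <= reading <= cert_points[-1]:
        raise ValueError("reading outside certified range")
    for lo, hi, c_lo, c_hi in zip(cert_points, cert_points[1:],
                                  corrections, corrections[1:]):
        if lo <= reading <= hi:
            frac = (reading - lo) / (hi - lo)
            return reading + c_lo + frac * (c_hi - c_lo)
```

For example, a reading of 0.0 deg C with these values would be corrected to 0.02 deg C, while a reading midway between two certification points receives the average of their corrections.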
For a high accuracy ice point
check of a mercury/glass master reference thermometer, the following should
be observed:
-- Insure that
only demineralized water and ice are used, that the bath is kept full of
ice and water, and that precautions are taken to insure minimum heat flux and
complete stability, as described above under "Realization of Ice Point."
-- Use a 10X microscope, carefully
aligned to insure that the microscope axis is perpendicular to the axis
of the thermometer. This insures against parallax error, and allows accurate
interpolation of mercury column height to 1/10 of the smallest graduation division.
-- Also using a 10X or 20X
microscope, examine the bulb and bore of the thermometer to insure that
there is no evidence of "air" in the bulb or mercury column, and no droplets
of mercury separated from the column.
-- Keep adequate records to
gradually gain confidence in the stability of the master reference thermometer.
A good plan is to check the ice point at least every four months until
a shift of less than 0.2 of the smallest division occurs between checks,
then extend to an annual check. However, if annual checks show a change
of 0.2 division or more, return to four-month checks until again stabilized.
-- Whenever a careful ice
point check shows shift in calibration of more than 0.2 of a division,
the calibration certificate should be amended to add the correction to
all calibration values. (For mercury-in-glass thermometers only.) This
can be done as a result of over 100 years of experience that verifies that
essentially all change occurs in the glass bulb of the thermometer, and
its magnitude is determined by the ice point check. Readings at all other
scale points will have, therefore, shifted the same amount as the ice point.
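The two record-keeping rules above -- the four-month versus annual check cadence, and the uniform amendment of certificate corrections when the ice point shifts -- can be sketched together. The function names and the example values are illustrative assumptions; only the 0.2-division threshold and the intervals come from the paper:

```python
# Sketch of the record-keeping rules above.  Function names and example
# values are illustrative, not from the paper.

def next_check_interval_months(shift_divisions):
    """Ice-point shift since the last check, in units of the smallest
    scale division: under 0.2 division permits annual checks; 0.2 or
    more returns the thermometer to four-month checks."""
    return 12 if abs(shift_divisions) < 0.2 else 4

def amend_corrections(corrections_deg_c, ice_point_shift_deg_c):
    """Mercury-in-glass only: essentially all change occurs in the bulb,
    so the whole scale shifts together.  If the thermometer now reads
    high at the ice point, every certified correction decreases by the
    same amount."""
    return [round(c - ice_point_shift_deg_c, 4) for c in corrections_deg_c]

# A thermometer that drifted 0.3 division goes back on four-month checks,
# and its certificate corrections are all offset by the observed shift.
interval = next_check_interval_months(0.3)
amended = amend_corrections([+0.02, +0.01, -0.01, -0.02], +0.01)
```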
C. Calibration within +/- 0.01 deg C
This is the level of accuracy
required to perform initial calibration and recalibration of the master
reference thermometers described under "B." immediately above. Since this
level of accuracy requires a calibration uncertainty of no more than a
few thousandths of one degree, unusual care must be taken. Mercury/glass
thermometers cannot be used, due to their lack of resolution as well as
mechanical variations. The thermometric standard commonly used is a precision
platinum-resistance element used with a precision potentiometric bridge.
Newer systems such as quartz thermometers and electronic digital indicating
devices are available, but do not have the long-term performance record
of the platinum resistance element and bridge combination.
Actual calibration procedures
are similar to those described under "B." above except that greater care
is taken at each step, and long-term experience in calibration techniques
is required to minimize errors. However, to insure the continued accuracy
of the master reference thermometer used for such calibrations requires
sophisticated equipment and procedures. Basically, the resistance element
as well as the precision bridge are trouble-free, extremely stable instruments.
They are both, however, subject to small changes with time, and these changes
can affect the output value at one portion of the range while not affecting
it in other areas. This requires a regular recalibration schedule for both
the bridge and resistance elements. At this accuracy level, interlaboratory
correlation becomes important as part of a Measurement Assurance Program.
Such a program is planned
to assure confidence that uncertainty levels of no more than a few thousandths
of a degree are maintained. Beyond that basic element, however, the program
includes a system of checks and double checks to virtually eliminate the
possibility of error due to equipment failure or operator mistake. This
program includes most, or all, of the following steps:
-- Periodic (oftener
than once per year) checks of working bridges and resistance elements against
a master bridge and element.
-- Calibration check of master
bridge by use of a standard resistor on an annual or more frequent basis.
-- Calibration check of standard
resistor by independent testing agency--annually until fully stabilized,
then every three to five years.
-- Round-robin interlaboratory
comparison tests of resistance elements.
-- Periodic check of both
working systems and the master calibration standard system against primary
standards -- not only the triple point of water, but others according to need:
the freezing point of zinc, the freezing point of tin, and the boiling
point of oxygen.
VII. SUMMARY
In summary, it is possible to
have confidence that temperature-measuring instruments are accurate by
following a simple two-step process: First, comparison under controlled
conditions of an operating temperature device with a master reference standard
thermometer; and second, periodically checking the accuracy of the master
thermometer by appropriate means.
A calibration program offering
assurance of accuracy to a level of uncertainty of less than 0.1 deg C
can be developed at low cost, based on the use of carefully-made and calibrated
mercury-in-glass thermometers as master reference standards. This accuracy
and economy is possible because of the simplicity of a high-accuracy calibration
check at the temperature of freezing water (the "ice point"), and the
property of a mercury-in-glass thermometer that an ice point check insures
that the magnitude of calibration change is known throughout the entire
temperature range of the thermometer.
REFERENCES
1. For a complete discussion of IPTS-68, see the authorized text in Metrologia 5,
35 (1969).
2. Thomas, J. L., "Reproducibility of the ice point," in Temperature, Its
Measurement and Control in Science and Industry, New York, Reinhold Publishing
Co., 1941.
3. Wise, J., NBS Monograph 150, U.S. Department of Commerce, National Bureau
of Standards, January 1976.
4. Furukawa, G. and Plumb, H., NBS Monograph 126, U.S. Department of Commerce,
National Bureau of Standards, April 1973.
5. For a description of considerations in an effective Measurement Assurance
Program, see Furukawa, G. T., "A Measurement Assurance Program -- Thermometer
Calibration," unpublished ASTM Technical Talk, June 25, 1980. Available from
Dr. Furukawa, U.S. Department of Commerce, National Bureau of Standards.