Food Safety Magazine

FSM eDigest | April 21, 2015

Checking Field Thermometer Accuracy

By Robert W. Powitz, Ph.D., M.P.H., R.S., C.F.S.P.

In the absence of a National Institute of Standards and Technology (NIST)-traceable, dry-well thermometer calibrator, conventional lore recommends using an ice bath to validate electronic thermometers or calibrate mechanical ones. Presumably, the ice/water mixture will be at 32 °F (0 °C). This is not always the case. A mixture made from distilled, reverse-osmosis, or de-ionized water will come to 32 °F, or close enough; surface or well waters, however, may vary widely in total dissolved solids (TDS), which affects the temperature of the mixture. Conventional wisdom tells us that the higher the salt content or TDS, the lower the melting point of the ice. The freezing temperature of “pure” water versus highly mineralized potable well or surface water can vary by as much as ±4.5 °F. Add to this the manufacturers’ accuracy claims for the thermometers themselves, which can be as loose as ±2 °F in the case of bi-metal dial thermometers, and the combined variance of the ice bath and the thermometer can produce an error as large as ±6.5 °F. This does not instill much confidence in thermometer accuracy verification, particularly when an errant thermometer is used as an enforcement tool. There is a better way.
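The error stacking described above can be sketched as a few lines of arithmetic. This is a minimal illustration, assuming a worst case in which the ice-bath freeze-point spread and the thermometer's rated tolerance add in the same direction; the ±4.5 °F and ±2 °F figures come from the article, not from any standard.

```python
# Worst-case error when verifying a thermometer in an ice bath made
# from mineralized (high-TDS) water, assuming both error sources add.
freeze_point_spread_f = 4.5    # +/- degrees F: ice-bath variation due to TDS
thermometer_tolerance_f = 2.0  # +/- degrees F: bi-metal dial accuracy claim

worst_case_error_f = freeze_point_spread_f + thermometer_tolerance_f
print(f"Worst-case verification error: +/-{worst_case_error_f} degrees F")
# prints: Worst-case verification error: +/-6.5 degrees F
```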

Here is the logic. The temperatures of most concern to the regulatory community are below 41 °F and above 135 °F. First, would it not be more prudent to do a two-point validation or calibration than a single point at some approximate temperature? Second, would it not make more sense to compare the thermometers under test against a temperature standard, rather than worry about the TDS of the water/ice mixture and its freeze-point conversion factor?

Begin with a “temperature standard” thermometer, which lets you rapidly check working thermometers with reasonable certainty of their accuracy. A “temperature standard” thermometer is a liquid-in-glass, general-purpose laboratory thermometer built to NIST specifications; a convenient temperature range is 0–220 °F. You will also need two inexpensive 16-ounce insulated travel tumblers.

To conduct the validation/calibration, simply fill one tumbler with cold tap water and the other with hot tap water. Immerse the liquid-in-glass thermometer in either tumbler along with the probe of the electronic or mechanical thermometer under test. Let both thermometers equilibrate…a few minutes will do…then compare the reading of the standard thermometer against that of the thermometer being validated. Repeat with the other tumbler and record your results. It’s that simple, fast and accurate.
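For those who log their checks electronically, the two-point comparison above can be recorded as a simple offset calculation. This is a hedged sketch, not part of the article's procedure: the readings and the ±2 °F acceptance limit are hypothetical examples (chosen to match a typical dial-thermometer accuracy claim), and your jurisdiction or SOP may specify a different tolerance.

```python
# Sketch of recording a two-point thermometer check: compare the unit
# under test against the liquid-in-glass standard in each tumbler.
TOLERANCE_F = 2.0  # assumed acceptance limit (hypothetical)

def check_point(standard_f: float, test_f: float) -> tuple[float, bool]:
    """Return the test thermometer's offset from the standard and pass/fail."""
    offset = test_f - standard_f
    return offset, abs(offset) <= TOLERANCE_F

# Hypothetical readings after both thermometers equilibrate in each tumbler:
cold_offset, cold_ok = check_point(standard_f=48.2, test_f=49.0)   # cold tap water
hot_offset, hot_ok = check_point(standard_f=128.6, test_f=127.1)   # hot tap water

print(f"Cold point offset: {cold_offset:+.1f} F, pass={cold_ok}")
print(f"Hot point offset: {hot_offset:+.1f} F, pass={hot_ok}")
```

A thermometer that drifts outside the tolerance at either point would be recalibrated (if mechanical) or taken out of service.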

Forensic sanitarian Robert W. Powitz, Ph.D., M.P.H., R.S., C.F.S.P., is principal consultant and technical director of Old Saybrook, CT–based R.W. Powitz & Associates, a professional corporation of forensic sanitarians who specialize in environmental and public health litigation support services to law firms, insurance companies, governmental agencies and industry. Dr. Powitz is a member of the Editorial Advisory Board of Food Safety Magazine and can be reached at [email protected] or through his website at

> Categories: Testing and Analysis: Laboratory Management, Methods, Physical