Food Safety Magazine

SANITARIAN’S FILE | October/November 2006

Infrared Thermometry

By Robert W. Powitz, Ph.D., MPH

I bought my first infrared (IR) thermometer in 1981. It was an expensive and heavy portable analog unit—and its replacement battery cost more than my first car. I used it for finding hot and cold spots in large baking ovens; measuring the operating temperature of animal crematoria and low-pressure boilers; and identifying unmarked pipes and finding worn-out bearings on ventilation units, among a zillion other uses—all at the upper end of the temperature scale (>200°F). The old unit served me quite well for 15 years, and still does on occasion. In the early 1990s, an inexpensive, lightweight, hand-held infrared thermometer was introduced to the foodservice industry that could read temperatures in the range practical for food safety applications. This new unit had a temperature measurement range from approximately 0°F to 525°F (–18°C to 275°C) with an accuracy of ±3.6°F (±2°C). I’ve never looked back since. The new IR thermometers changed temperature monitoring in a radical way.

Although it took a lot of trial and error to understand its capabilities and limitations (and a large dose of self-training to use this new tool correctly), the IR thermometer eventually became a permanent and essential instrument in my food safety quality control tool kit. I can honestly say that I am far more comfortable monitoring critical temperatures using the IR thermometer as a screening tool than with thermocouple and thermistor thermometers alone.

The new IR thermometer has proved an ideal tool for regulators. It allows temperature screening where foods are too hot to touch or difficult to reach; where the food is moving too fast on a conveyor belt; where temperature measurements need to be taken in rapid succession; and where foods may be subject to unusual radiant temperature influences from the environment. It also lets us scan cooling systems, refrigerated display cases, truck interiors and storage areas with ease. In a food plant, proper application of the IR thermometer enables us to monitor temperatures to ensure that processes are operating consistently and under optimum conditions. Through this application, we have been able to demonstrate improved product quality, increased productivity and reduced downtime by rapidly finding and fixing temperature anomalies and tightening temperature specifications. Not bad for a small and relatively inexpensive device.

Caution: IR Use and Abuse
There are two features of the IR thermometer that make it ideal for quality control and regulatory screening. First, it does not come in contact with the food and does not require decontamination between applications. Even so, periodically wiping down the thermometer’s exterior surface is a good sanitary practice. Second, heat is not drawn away from the food being measured, as it can be with direct contact thermometry. If used correctly, the IR thermometer yields a true surface temperature and thereby improves the repeatability of measurements. We are therefore able to screen large lots and areas for temperature discrepancies, and once baseline conditions are established, we are able to verify any off-specification findings with smaller, statistically significant lot sizes using our electronic probe thermometers. In short, the infrared thermometer provides greater statistical accuracy and minimizes bias when used in conjunction with our validated regulatory temperature measuring tools. It gives the regulator greater versatility in temperature screening, while significantly reducing both error and the risk of inspector-mediated cross-contamination.

However, with the ease of its use, we soon learned about the ease of its abuse. Instead of being used as an ideal screening device, in some instances the IR thermometer devolved into an enforcement tool. This has resulted in horror stories of retail food operations discarding perfectly good food upon the detection of an “off” temperature by a local inspector, or ordering unnecessary repairs to properly functioning refrigeration units. And this was just the tip of the iceberg.
Although many of these complaints have abated, they have not entirely disappeared. Occasionally, I am still called upon to intercede when situations become untenable. For this reason, we’ll review here some of the basics of the infrared thermometer and suggestions for its proper use in our industry. I am relying heavily on three sources from which this information is distilled. My gratitude and thanks go to the engineers and technical writers at Omega Engineering, Inc., Raytek (now a Fluke Company), and the joint committee on infrared thermometers that authored Underwriters Laboratories Standard 2333: Standard for Infrared Thermometers.

Understanding the Infrared Thermometer
Understanding the basic operating principles of the IR thermometer helps us work within the capabilities and limitations of the unit.

The human eye can see wavelengths from about 0.4 to 0.7 microns. The infrared spectral range is an invisible portion of the electromagnetic spectrum from 0.7 to 1,000 microns. Anything warmer than absolute zero (0 Kelvin) emits energy somewhere within this range. When an infrared thermometer is aimed at an object, this energy passes through the unit’s optical system and is converted to an electrical signal at the detector. The converted signal is displayed as a temperature (°F or °C). The principle of the IR thermometer resembles that of the human eye. In general, a wavelength, as it passes through the eye, is interpreted as a color. In some instances, color can be used to determine temperature, as with infrared spectra.
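To see why everyday objects radiate in the infrared rather than the visible band, it helps to look at Wien’s displacement law, which relates a black body’s temperature to its peak emission wavelength. The following sketch is an illustration from standard physics, not from any thermometer manufacturer’s documentation:

```python
WIEN_B = 2897.8  # Wien's displacement constant, in micron-Kelvins


def peak_wavelength_microns(temp_c):
    """Wavelength (in microns) at which an ideal black body at the
    given Celsius temperature radiates most strongly."""
    return WIEN_B / (temp_c + 273.15)


# Refrigerated food at 5 C peaks near 10.4 microns; boiling water near
# 7.8 microns. Both lie far beyond the visible 0.4-0.7 micron band,
# which is why the detector, not the eye, must do the "seeing."
```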

Infrared energy is also transmitted through objects from other sources and reflected off object surfaces. Infrared thermometers sense not only the energy emitted by an object, but also the reflected and transmitted energy, which they translate into the temperature reading as well. Because of these possible interferences, the infrared thermometer can only be used as a screening tool.

The accuracy of the IR thermometers used in food safety is largely determined by two important concepts: emissivity and distance-to-spot size ratio. Emissivity is the ratio of radiation emitted by a surface to the radiation emitted by a perfect black body at the same temperature. Another way to describe it is as an object’s ability to emit or absorb infrared radiation. A perfect black body neither reflects nor transmits energy and has an emissivity of 1.0. Water and most organic materials such as foods have an emissivity of 0.95, or close to it. Therefore, most of the IR thermometers used in our industry have a preset, fixed emissivity of 0.95. While the temperature taken of a food may be accurate, readings from a preset unit may vary on stainless steel or any shiny glass, plastic or ceramic surface. In other words, even though foods stored in hotel pans have come to temperature equilibrium, the temperature detected on the food surface versus the hotel pan surface may vary considerably. To compensate, the area being measured can be covered with masking tape, a self-adhering black spot or flat black paint. After the tape, spot or paint has come to equilibrium with the surface, the area temperature can be accurately measured.
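The emissivity effect above can be sketched numerically. The model below is a simplified grey-body calculation, not the algorithm of any particular instrument: the detector sees emitted plus reflected energy, and the unit converts that total back to a temperature assuming its preset emissivity of 0.95.

```python
def indicated_temp_c(true_temp_c, surface_emissivity,
                     ambient_c=22.0, preset_emissivity=0.95):
    """Rough estimate of what a fixed-emissivity IR thermometer displays.

    Simplified grey-body model (an assumption for illustration): the
    detector sees the surface's own emission plus the room's radiation
    reflected off it, and the instrument converts that back to a
    temperature using its preset emissivity. Real instruments apply
    more refined corrections.
    """
    t_obj = true_temp_c + 273.15   # convert to Kelvin
    t_amb = ambient_c + 273.15
    # Emitted plus reflected radiance (Stefan-Boltzmann constant cancels)
    radiance = (surface_emissivity * t_obj**4
                + (1 - surface_emissivity) * t_amb**4)
    t_indicated = (radiance / preset_emissivity) ** 0.25
    return t_indicated - 273.15
```

Running this for a 5°C food surface (emissivity 0.95) gives a reading within a few degrees of the true value, while a polished pan wall at the same 5°C (emissivity near 0.1) reads close to the 22°C room instead; that is exactly the discrepancy the masking-tape trick corrects.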

Distance-to-spot size ratio is the relationship between the distance of the measuring device from the surface and the portion of the surface being measured. The target area diameter increases proportionally as the distance from the thermometer to the surface increases. The unit takes an average temperature of the target area, so a larger area may result in less accurate measurements if temperatures vary across a given surface. There are distance limitations as well. Remember that the IR thermometer will also measure reflected and transmitted scattered light energy from sources near the target. I’ve found that once I get much beyond 3 to 4 feet from the target, the IR loses significant accuracy (measurements taken of walls, ceilings and other large surfaces excepted). There is a good rule of thumb to follow: To get the best results, the distance to the object should not be greater than the size of the object.

Most of the IR thermometers we regularly use have a distance-to-spot size ratio of 6:1. This means that if you hold the IR thermometer 6 inches from the target, you are measuring a spot with a 1-inch diameter; at 12 inches, the spot being measured is 2 inches in diameter, and so on. Units with laser or light aiming devices that denote the exact area being measured have completely taken the guesswork out of the distance-to-spot size ratio. The laser or light aiming is not a part of the IR technology; it only identifies the spot where the temperature is being measured. The beauty of these devices is: What you see is what you measure.
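The 6:1 arithmetic above is simple division, and it can be sketched as a one-line helper (the function name is mine, for illustration only):

```python
def spot_diameter(distance, ratio=6.0):
    """Diameter of the measured spot at a given distance from the
    target (any length unit), for a thermometer with the stated
    distance-to-spot size ratio (6:1 for most foodservice units)."""
    return distance / ratio


# A 6:1 unit held 6 inches away averages a 1-inch spot;
# at 12 inches, a 2-inch spot.
```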

There are limitations with the IR thermometer that must be kept in mind when using these tools. Here are the most important:

• Do not attempt to measure a temperature through glass or plastic film. The thermometer will measure the temperature of the glass or plastic, not the object beyond.

• Do not attempt to measure temperatures through dust, smoke, fog or steam. The temperature might be from the particles in the air rather than from the object you wish to measure. Keep the lens of the IR thermometer clean for the same reason.

• The IR thermometer will only measure surface temperatures and not internal temperatures. Normally, foods heat and cool from the outer surface to the interior. Therefore, a surface temperature reading may give a false indication of the interior temperature.

• Compensate for the differences of emissivity between foods and shiny surfaces by applying masking tape, a black spot or black paint to the shiny surface and taking the temperature measurement when the tape, spot or paint comes to equilibrium.

• Be cautious in measuring temperatures of foods, particularly where the background area is hotter than the food being measured. The IR thermometer tends to measure the reflected background energy, which may interfere with accurate readings.

• IR thermometers will not function properly if they are subjected to an abrupt temperature change. If the IR thermometer will experience a temperature difference of 20°C (36°F) or more, wait 20 minutes before taking the reading. For instance, when working in a warm kitchen, “condition” the thermometer by placing it in the walk-in refrigerator for 20 minutes before taking any readings there.

• Change the battery of the IR thermometer regularly and well before the end of its life. Some units show fewer problems, particularly with thermal shock, when a fresh battery is in place. None of the IR thermometers made for our industry work consistently well in sub-freezing temperatures, regardless of thermal conditioning or battery strength.

• The temperature of the air cannot be measured with the IR thermometer; only the temperature of a solid object can be measured. However, the ambient temperature can be estimated by measuring an object in the area that is in equilibrium with the ambient air temperature.

• The IR thermometers used in our industry cannot be calibrated. They can, however, be validated in much the same way as estimating the ambient temperature. The procedure is rather simple: Affix a black spot or masking tape to a file cabinet or other piece of furniture in the room. When the spot or tape has come to equilibrium, take a temperature measurement. By comparing the reading of the IR thermometer to that of a temperature-standard (reference) thermometer at equilibrium, the accuracy of the IR unit can be validated.
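If you log these comparison readings, the pass/fail judgment is just the manufacturer’s stated accuracy applied to the difference. A hypothetical helper, using the ±2°C accuracy figure quoted earlier as the default tolerance:

```python
def within_spec(ir_reading_c, reference_c, tolerance_c=2.0):
    """True if the IR reading agrees with the reference thermometer
    to within the stated accuracy (+/-2 C for typical foodservice
    units; the default here is an assumption, so substitute your
    unit's published figure)."""
    return abs(ir_reading_c - reference_c) <= tolerance_c
```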

Well, there you have it. As a final suggestion, practice with the infrared thermometer as often as possible. And at the same time, use an electronic probe thermometer to compare readings. Comparing the two will give you a sense of the IR thermometer’s limitations as well as its capabilities within that environment. The greater your experience working with these units, the greater your level of accuracy and comfort will be.

Forensic sanitarian Robert W. Powitz, Ph.D., MPH, RS, CFSP, is principal consultant and technical director of Old Saybrook, CT-based R.W. Powitz & Associates, a professional corporation of forensic sanitarians who specialize in environmental and public health litigation support services to law firms, insurance companies, governmental agencies and industry. For more than 12 years, he was the Director of Environmental Health and Safety for Wayne State University in Detroit, MI, where he continues to hold the academic rank of adjunct professor in the College of Engineering. He also served as Director of Biological Safety and Environment for the U.S. Department of Agriculture at the Plum Island Animal Disease Center at Greenport, NY. Among his honors, Powitz was the recipient of the New Jersey Environmental Health Association’s 2006 Harry R.H. Nicholas Award, which recognized Dr. Powitz’s outstanding and dedicated service to the advancement of public and environmental health in New Jersey. He is the first to hold the title of Diplomate Laureate in the American Academy of Sanitarians.

Dr. Powitz welcomes reader questions and queries for discussion in upcoming columns, and feedback or suggestions for topics you’d like to see covered can be sent to him directly at or through his website at