Microbial Detection – Taking It to the Limits
By Margaret D. Hardin, Ph.D.
As the food industry demands more and more testing, whether for customers, regulators or self-monitoring, the concept of detection becomes increasingly important. Testing often includes food product testing and environmental monitoring for pathogens, indicators and/or spoilage organisms. During a recent visit to a small processor, the owner of the company asked several questions about a laboratory report the company had just received from its third-party testing laboratory.
The questions included the following: What does "less than 10 coliforms" mean, and why is the result less than 10 coliforms in one sample but less than three in another? What is the difference between a result of less than 10 and a negative result? Why are some results (for pathogens) reported as negative in 25 g and others as negative in 375 g? The concerns this processor expressed all led back to the product (matrix) being tested, how much product was in the sample (10, 25 or 375 g), how many organisms were present in the sample, which detection method was used, what it was detecting and how much it could detect. For the sake of discussion here, microbial detection will be defined as the ability to recover and determine the existence or presence of microorganisms.
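The reporting conventions behind the processor's first question come down to simple arithmetic. The sketch below is a hypothetical illustration, not part of the laboratory report discussed above: it assumes the common practice of plating 1 ml of a 1:10 dilution, and the "less than three" figure comes from the lower bound of a standard three-tube most-probable-number table rather than from direct calculation.

```python
# Hypothetical illustration: why a plate count with no colonies is reported
# as "<10 CFU/g" rather than "negative." The lowest countable outcome is a
# single colony, so the reporting floor is whatever one colony would
# represent per gram of the original sample.

def plate_count_floor(dilution_factor, volume_plated_ml):
    """CFU/g represented by one colony on the lowest dilution plated."""
    return dilution_factor / volume_plated_ml

# Plating 1 ml of a 1:10 dilution: one colony = 10 CFU/g, so a clean plate
# is reported as "<10 CFU/g" -- the organism could still be present below
# that level. An MPN series with all tubes negative is instead reported as
# "<3 MPN/g," because 3 is the lower bound of the three-tube MPN table.
print(f"Plate count reporting floor: <{plate_count_floor(10, 1.0):.0f} CFU/g")
```

A "negative" pathogen result is different again: it means the organism was not detected anywhere in the enriched sample (25 or 375 g), not that it was counted at zero.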
Because microorganisms are so small, detecting their presence in a food sample is challenging at best, even when they are present at levels of thousands to millions of cells per gram or milliliter of a food product or rinsate, or on an equipment surface. The detection and quantification of microorganisms in food and food processing begins with the ability of the method to move (or remove) the microbes from the food (or surface) to the detection system. The methods by which microorganisms are removed or detached from a food or surface are quite varied. A sample may be taken by swabbing or sponging a food or food equipment surface, excising a thin layer of the product surface or rinsing the food product or piece of equipment in or with a buffered diluent. Frequently used methods for removing (detaching) microbes from food product or swab samples include blending, stomaching (macerating) and rinsing. Whichever method is used for sampling and detaching microorganisms, it is important to remember that the result generally represents only a fraction of the level actually present. Beyond any method's ability to fully detach or remove microbes from a sample, additional questions arise as to what happens to the organisms as the sample is shaken, stomached or blended. Do the microbes reattach to other portions of the food sample, or are they driven more deeply into the sample, or into a sampling sponge or swab, by this mechanical action?
Another consideration is the dilution effect of the extraction method. For instance, when taking an excised surface sample, is the sample taken at 1–2 mm or at 10 mm deep? If the microbes are expected to be on the surface, then minimizing the depth of the sample (unless the surface is very irregular) and maximizing the surface area is important, to reduce the amount of food material that is unlikely to contain the organisms of concern and that serves only to dilute the sample. If the surface is rinsed, or diluent is added to facilitate blending or stomaching, the amount of diluent (100, 250 or 2,000 ml) must be balanced against the size of the sample and the expected effect of the dilution (sample-to-diluent ratio of 1:1, 1:5, 1:10, etc.) on recovery and detection of microbes in the final sample. These are important factors to consider when validating a method, evaluating a new method or changing methods, because the performance of the method is also a reflection of the sampling and extraction technique. If a sample is shipped to the laboratory performing the analysis, storage and transportation of the sample must also be taken into consideration to limit their effects on the organism(s), the food product and the final results. This includes factors such as sample transport media, product temperature and time in transit.
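To make the dilution effect concrete, here is a minimal sketch, with assumed numbers rather than article-supplied ones, of how the sample-to-diluent ratio shifts the level a single colony represents:

```python
# Assumed scenario: `volume_plated_ml` of the initial sample-diluent mixture
# is plated directly, and 1 g of sample contributes roughly 1 ml of volume
# to the mixture (a simplification).

def effective_floor(sample_g, diluent_ml, volume_plated_ml=1.0):
    """CFU/g represented by a single colony plated from the initial mixture."""
    total_ml = sample_g + diluent_ml
    return total_ml / (sample_g * volume_plated_ml)

# A 25 g sample in 225 ml of diluent (the common 1:10 preparation):
print(f"25 g in 225 ml:   one colony = {effective_floor(25, 225):.0f} CFU/g")
# The same 25 g sample rinsed in 2,000 ml pushes the floor roughly 8x higher:
print(f"25 g in 2,000 ml: one colony = {effective_floor(25, 2000):.0f} CFU/g")
```

The larger rinse volume may recover organisms from more surface area, but each milliliter plated then carries proportionally less of the sample, which is exactly the balance the paragraph above describes.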
The laboratory methods used in food microbiology include qualitative presence/absence tests, often used to detect pathogens or indicators such as the end products or byproducts of chemical or biochemical reactions, and quantitative tests used to enumerate viable cells. Historically, detection of microorganisms in foods has ranged from traditional methods to newer technologies. Traditional approaches can be as simple as focusing on a microorganism in a microscope (the direct microscopic count), or they may involve extracting cells from the food and identifying or quantifying them by culturing on media, using phenotypic characteristics or byproducts to identify the organisms. More recent technologies instead extract cellular components of a particular organism, such as DNA or RNA, and amplify them to detectable levels. Traditional methods for the detection, identification and enumeration of microorganisms in foods and the food environment depend on the use of appropriate pre-enrichment, enrichment and culture media that suppress background flora and thereby allow the target organisms to thrive.
Qualitative methods provide a presence-or-absence result that indicates microbial contamination of a sample of a given size, such as 10 ml, 25 or 375 g, or 100 cm². The resulting limit of detection is based on the sensitivity of the method, which may vary with food type, incubation time and temperature, amount of sample and level of the organism in the product. Quantitative methods provide a numerical result, indicating the number of viable microbes present in the sample. Here, the limit of detection can be affected by the method (such as most probable number or direct plate count), the media used, the dilution factor, the incubation time and temperature, and the stress and associated recovery of the organism. While traditional methods can be slow and labor-intensive, they are more specific for the identification of the organism, whereas rapid methods usually target an alternative analyte, such as a DNA or RNA sequence unique to the organism. The advantages of rapid methods such as PCR are numerous, including increased sensitivity, speed of detection and identification of microorganisms from numerous and varied sample matrices. Compared with traditional culture methods, the detection time required for the assay can be reduced from days or weeks to hours. In addition, these methods are particularly applicable to target organisms expected to be present in low numbers. For any method, the intrinsic properties of a food product, such as pH, water activity and ingredients such as salt or spices, as well as the chemicals and sanitizers used in food environments, may interfere with the test and must be considered. More recently, the questions of sublethally injured cells, their role in foodborne illness and the appropriate methodology for resuscitating these cells have also come to light.
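Since the most probable number method is mentioned above, a brief sketch may help show how tube results become an estimate. Laboratories report MPN values from standard published tables; the code below instead uses Thomas's simple approximation, and the tube pattern and volumes are illustrative assumptions, not values from the article.

```python
import math

# Thomas's approximation to the most probable number (MPN):
#   MPN per ml ~ P / sqrt(N * T)
# where P = number of positive tubes, N = ml of sample in negative tubes,
# and T = ml of sample in all tubes. Official results come from published
# MPN tables; this approximation only illustrates the estimate's logic.

def mpn_thomas(positive_tubes, ml_in_negative_tubes, ml_in_all_tubes):
    return positive_tubes / math.sqrt(ml_in_negative_tubes * ml_in_all_tubes)

# Assumed three-tube series at 10, 1 and 0.1 ml of sample per tube, with a
# 3-2-0 pattern of positive tubes across the three dilutions:
positives = 3 + 2 + 0
ml_negative = 0 * 10 + 1 * 1 + 3 * 0.1   # sample volume in the negative tubes
ml_total = 3 * 10 + 3 * 1 + 3 * 0.1      # sample volume in all nine tubes
print(f"Approximate MPN: {mpn_thomas(positives, ml_negative, ml_total):.2f} per ml")
```

Because the estimate rests on which tubes at each dilution show growth, its precision is inherently limited, which is one reason the method's limit of detection depends so heavily on the dilution scheme chosen.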
Limits of Detection
Whatever the method and whatever the matrix, the bottom line is the limit of detection of the method and the associated results of the analysis. The limit of detection of a particular method, that is, its ability to recover and determine the presence of microorganisms, is often described as the level of organisms recoverable from a sample of a particular size. The limit of detection needed for a specific process or food type varies with the reason for taking the sample in the first place. If the goal is to meet a zero-tolerance standard, as is expected for many ready-to-eat food processes, selecting a method with the lowest limit of detection may be necessary. However, the "how low can you go" approach is confounded because the performance of a method, and its practical limit of detection for the microbes actually present in a sample, may be substantially influenced by the effectiveness of sampling and extraction of the organism(s) of concern, the matrix itself, the recovery method and its sensitivity, and the laboratory or analyst conducting the analysis. The limit of detection is particularly important when trying to quantify the cleanliness of an environment or the eradication of a particular organism from a food, such as in the validation of a lethality or antimicrobial process, or when quantifying and characterizing a potential health risk; even so, the limitations of microbiological testing must be considered when applying the results. An insufficient number of samples, the non-homogeneous distribution of microorganisms throughout most food matrices and environments, and the influence of non-random sampling can all lead to incorrect conclusions. In addition, while the results of microbiological testing often identify outcomes, they do not necessarily reveal the associated causes or controls of contamination.
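The sampling caveat above can be put in rough numbers. The following is a minimal sketch assuming independent samples and a uniform 1 percent prevalence, both simplifications that real, clustered contamination violates:

```python
# Probability that at least one of n independent samples detects
# contamination present in a fraction `prevalence` of units:
#   P(detect) = 1 - (1 - prevalence)^n
# Real contamination is rarely distributed this uniformly, which tends to
# make actual detection probabilities worse than this simple model suggests.

def detection_probability(prevalence, n_samples):
    return 1 - (1 - prevalence) ** n_samples

for n in (5, 30, 60):
    print(f"{n:>2} samples at 1% prevalence: "
          f"{detection_probability(0.01, n):.0%} chance of detection")
```

Even 60 samples leave a better-than-even chance of missing contamination at this prevalence, which is why a string of negative results cannot, on its own, demonstrate that a process is in control.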
Researchers and test manufacturers are continuously searching for improved tools that are faster, more accurate and more sensitive than ever before. Traditional methods of microbial detection tend to be labor-intensive, taking days or even weeks before microbial colonies or their byproducts are visibly detected. In many cases, this does not include the time required for pre-enrichment and selective enrichment from a food matrix or the time it takes to select and differentiate the organism from background flora on a plating medium. Time to results may be further delayed if additional subculturing or biochemical confirmation is needed. Yet these long, drawn-out methods, with all their limitations, remain the standards by which all other methods are measured and validated. However low you go in microbial detection, microbiological testing is still an invaluable tool for establishing baseline data, monitoring raw materials and ingredients, verifying control of Hazard Analysis and Critical Control Points (HACCP) systems, and monitoring and assessing control of the process and the environment.
Margaret D. Hardin, Ph.D., is vice president of technical services at IEH Laboratories and Consulting Group. She is a member of the Editorial Board for Food Safety Magazine, the International Journal of Food Microbiology and the Journal of Food Protection. She has served as a member of the National Advisory Committee on Microbiological Criteria for Foods and the National Advisory Committee for Meat and Poultry Inspection.
Food Microbiology Testing Methods
Food plants must make many microbiological testing decisions regarding sampling locations, test frequency and analytical methods to ensure that their products meet label claims, are safe and wholesome, and satisfy regulatory and customer requirements.
According to Strategic Consulting Inc.’s latest market research report, Food Micro, Fifth Edition: Microbiology Testing in the U.S. Food Industry, the total volume of microbiology testing in the U.S. increased by 14.4% in the past 2 years. The test volume for routine/indicator organisms went up by just over 10% between 2008 and 2010. During that same 2-year period, pathogen testing increased by more than 30%, driven by a number of important factors including regulations, recalls and increased customer requirements.
For pathogen analysis, the use of traditional methods continues to decline; in 2010, they accounted for just 11% of test volume. Antibody-based methods were the leading analytical method, followed by molecular-based methods. This shift away from traditional methods for pathogen analysis is forecast to continue over the coming 5-year period, along with an increase in the use of molecular methods.
For more information on Food Micro, Fifth Edition, visit www.strategic-consult.com, email firstname.lastname@example.org or call 802.457.9933.