Last fall, I attended the 28th Annual Rapid Methods Workshop at the University of Wisconsin-River Falls. Dr. Purnendu C. Vasavada, the driving force behind the program, brings people from all over the world to discuss rapid microbiological methods, food safety and quality issues, and future trends.

[Illustration: one part per million — 1 sugar cube dissolved in 2,700 liters (730 gallons)]

A key point covered in the workshop reminded me how long I have been involved with the food industry. Back in college when we learned about detection methodologies, the sensitivity of most chemical methods was in the parts per million range. As the years passed, chemical methodologies improved and sensitivities increased to the parts per billion and parts per trillion range and beyond. In other words, zero is vanishing. 

In 1992, Dr. Jack Francis of the University of Massachusetts prepared a monograph for the Council for Agricultural Science and Technology called "Food Safety: The Interpretation of Risk." The monograph used simple means to show people the difference between parts per million and parts per billion. Francis used a sugar cube dropped into a tank truck and into a tanker ship to differentiate the two concepts. (See illustration above.) Technology now allows us to detect the equivalent of a sugar cube in a ship; at parts per trillion, we can find that single sugar cube in ten tanker ships.
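Francis' comparison can be checked with back-of-the-envelope arithmetic. The sketch below assumes a sugar cube of roughly 2.7 grams and that a liter of water weighs 1 kilogram; the cube's weight is my illustrative assumption, not a figure from the monograph.

```python
# Back-of-the-envelope check of the sugar-cube analogy.
# Assumed: a sugar cube weighs about 2.7 g; 1 L of water weighs 1,000 g.

CUBE_G = 2.7                  # grams of sugar in one cube (assumed)
TANK_TRUCK_L = 2_700          # liters, per the first illustration
TANKER_SHIP_L = 2_700_000     # liters, per the second illustration

def concentration(solute_g, water_l):
    """Mass fraction of solute in water (grams of solute per gram of water)."""
    return solute_g / (water_l * 1_000)

print(concentration(CUBE_G, TANK_TRUCK_L))   # ~1e-06: one part per million
print(concentration(CUBE_G, TANKER_SHIP_L))  # ~1e-09: one part per billion
```

Scaling the same cube up by another factor of one thousand gives parts per trillion, which is where modern chemical methods now operate.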

California’s Proposition 65 was originally designed to protect the public, but it has evolved into what some call the fair labor act for lobbyists, lawyers and laboratories. Is exposure to parts per trillion of a chemical really hazardous? According to California law, it is if that compound is deemed a mutagen or carcinogen. Francis’ monograph reports that analysts have been able to detect at both the attogram and milliattogram (zeptogram) levels, or 10⁻¹⁸ and 10⁻²¹ grams. He notes that Avogadro’s number is 6 × 10²³ molecules per mole. Given these limits, sensitivity was approaching the molecular level in 1992.
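Avogadro's number makes it easy to see why attogram-level detection means counting individual molecules. A minimal sketch, assuming a hypothetical analyte with a molar mass of 300 g/mol (a typical mid-sized organic compound; the value is mine, not from the monograph):

```python
# How many molecules are in an attogram (1e-18 g) or a zeptogram
# (1e-21 g, one thousandth of an attogram) of analyte?

AVOGADRO = 6.022e23   # molecules per mole
MOLAR_MASS = 300.0    # g/mol, assumed illustrative value

def molecules(mass_g):
    """Number of molecules in a given mass of the analyte."""
    return mass_g / MOLAR_MASS * AVOGADRO

print(round(molecules(1e-18)))  # attogram: on the order of 2,000 molecules
print(round(molecules(1e-21)))  # zeptogram: roughly 2 molecules
```

At the zeptogram level the method is, quite literally, detecting a handful of molecules, which is what "sensitivity approaching the molecular level" means in practice.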

[Illustration: one part per billion — 1 sugar cube dissolved in 2.7 million liters (730,000 gallons)]

Based on discussions at the Rapid Methods Workshop, it appears that we may be headed in the same direction in microbiology. Vendors and researchers from academia and government presented their systems and research on a range of topics, but the emphasis was on faster, more efficient and more effective methods.

Let’s look at sampling. The standard for the industry when sampling surfaces is the swab or sponge. Microbiologists realize that these techniques do not remove everything. One vendor at the workshop spoke about a system that removes two to three times as many organisms from a surface as swabs and sponges do. Others described methods that allow more efficient recovery of organisms, specifically pathogens, from complex substrates; these were more effective than blending or use of the Stomacher. The use of antigen-antibody systems to selectively isolate bacteria, including pathogens, allows detection of low numbers of organisms in food systems. Combining enhanced recovery methods with these systems lets microbiologists recover extremely low numbers of organisms from complex food matrices. The ability to recover microorganisms from surfaces or foods has increased significantly.

What happens if a food processor, contract laboratory or regulator moves away from traditional swabs and sponges, adopts more effective and sensitive technologies, and uncovers pathogens where they have never been found before? What happens if detection improves from finding one pathogen in 25 grams of sample to finding one pathogen in 500 or 1,000 grams? Is that product now unsafe? Improved technologies create serious questions that must be addressed by industry, academia and government.
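The shift in that question can be made concrete by expressing the detection limit per gram of sample. A sketch, using the 25-, 500- and 1,000-gram figures from the paragraph above:

```python
# Detection limit expressed as organisms per gram of sample tested.
def limit_per_gram(organisms, sample_g):
    """Smallest detectable contamination level, in organisms per gram."""
    return organisms / sample_g

today = limit_per_gram(1, 25)        # 0.04 organisms per gram
tomorrow = limit_per_gram(1, 1_000)  # 0.001 organisms per gram

print(today / tomorrow)  # roughly a 40-fold more sensitive test
```

A 40-fold jump in sensitivity does not change the product; it changes what we can see in the product, which is exactly why "zero" keeps vanishing.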