A three-year collaboration between academia and industry to develop a vision-based automated inspection system for commercial bakeries is moving to the front burner.

Hamburger buns are big business in commercial bakeries, and maintaining consistency on a high-speed line is a challenge. Customers like McDonald's and Burger King are demanding buns with evenly distributed sesame seeds and uniform color, size and texture; suppliers who fail to meet tighter tolerances are in danger of losing the business. That reality is fueling demand for an in-line quality assurance system to flag out-of-spec product and either suggest or automatically make adjustments.

While a commercial at-line vision system exists, engineers at Georgia Tech Research Institute (GTRI) began work in 2001 on an in-line unit to provide real-time feedback on quality dimensions. To develop such a unit, GTRI forged a partnership with Bake-Tech, Tucker, Ga., and Flowers Bakery, with support from Thinkage for conveying equipment and Rockwell Automation for PLC controls. To advance the inspection science, the team set out to devise a system incorporating machine vision, algorithmic computations and intelligent controls that could communicate with ovens, proofers, depositors and other upstream equipment when product began to drift out of spec. Bake-Tech is fabricating a commercial prototype that features sanitary design, is washdown ready and is rugged enough to withstand an industrial environment. Installation is scheduled this month in Flowers Bakery's Villa Rica, Ga., plant.

Advancing the inspection science to in-line quality control requires high-speed computations and supervisory controls that can use imaging data to modify the process. Mechanical, electrical and computer engineering expertise comes into play. Heading up the effort are project leaders Bonnie Heck, a professor at Georgia Tech's School of Electrical and Computer Engineering, and Doug Britton, an electrical engineer and researcher at GTRI. Food Engineering spoke with Britton at the GTRI lab in Atlanta as developers were completing work on a lab-scale unit.

Doug Britton, research engineer at Georgia Tech Research Institute
FE: QA vision systems have been used in automotive manufacturing and other industries for decades. What has delayed their deployment in food?

Britton: Vision inspection of output from a discrete manufacturing process can be calibrated to tolerances of thousandths of an inch, in which case identifying a unit that is out of spec is simple. The natural and acceptable variability in food is much greater, and ambiguity is a challenge for an automated system. For example, there's an acceptable amount and distribution pattern for seeds on a bun, but those seeds are in different places on each bun. The notion of a nonuniform product complicates things, though we've overcome the issue with grapefruit and chicken inspection systems. We recently licensed a vision-based poultry inspection system that we benchmarked at 200 birds per minute to Gainco. In the bun project, we're processing product at even higher rates.

FE: How much higher?

Britton: This system has been designed to inspect between 600 and 1,000 buns per minute, fast enough to provide 100 percent inspection on even the fastest production line. We call the demonstration video Blazing Buns. Greater computing power is the key. Two years ago, you couldn't get a 2 gigahertz processor. Now, 4 gigahertz CPUs are available. When we designed an inspection system for grapefruits five years ago, we needed four CPUs and eight cameras for the trial. Today, one CPU is able to process the video signal.
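
For a sense of the processing budget those rates imply, the back-of-the-envelope arithmetic below (a Python sketch, not GTRI code) splits the upper design rate across the two camera lanes described later in the interview; the assumption that buns are spread evenly between the two lanes is ours.

    # Rough processing-budget arithmetic for the inspection rates quoted above.
    # The line rate and two-camera split come from the interview; the even split
    # of buns between lanes is assumed for the example.
    BUNS_PER_MINUTE = 1_000          # upper end of the quoted design rate
    CAMERAS = 2                      # each covers half of the 26 in. belt

    buns_per_second = BUNS_PER_MINUTE / 60
    buns_per_camera = buns_per_second / CAMERAS

    print(f"Total rate: {buns_per_second:.1f} buns/s")
    print(f"Per camera lane: {buns_per_camera:.1f} buns/s")
    print(f"Budget per bun, one CPU: {1000 / buns_per_second:.0f} ms")
    print(f"Budget per bun, per lane: {1000 / buns_per_camera:.0f} ms")

At 1,000 buns per minute, that works out to roughly 60 milliseconds of processing time per bun if a single CPU handles both camera streams, or about 120 milliseconds per bun in each lane.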

FE: How many cameras does Buns of Fire use?

Britton: We use two digital cameras, each of which captures product across a 13 in. wide area of a 26 in. conveyor belt. We're using IEEE 1394 cameras; the standard is better known by Apple's trade name, FireWire. IEEE 1394 defines a serial bus similar to USB, but it runs at much higher speeds and is not centered on a PC operating system. The bandwidth is ideal for digital video cameras.
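
The interview does not say which driver or SDK the team uses to pull frames from the FireWire cameras (Point Grey ships its own), so the fragment below is only a generic illustration of a two-camera capture loop, written here with OpenCV; the device indices and the inspect() placeholder are assumptions.

    import cv2

    # One capture object per camera; each camera watches a 13 in. half of the belt.
    cameras = [cv2.VideoCapture(0), cv2.VideoCapture(1)]

    try:
        while True:                              # runs until interrupted
            for lane, cam in enumerate(cameras):
                ok, frame = cam.read()           # grab the latest frame from this lane
                if not ok:
                    continue                     # skip a dropped frame and keep running
                # hand the frame to the inspection code for this belt lane, e.g.:
                # inspect(frame, lane)
    finally:
        for cam in cameras:
            cam.release()                        # free the devices on shutdown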

We purchased the cameras from Point Grey Research in Vancouver for about $800 each. Three years ago, comparable cameras would have cost $2,000. Point Grey provides excellent service response. The company was founded by computer engineers from the University of British Columbia. We're university researchers, too, and we like to work with people who will help us understand the technology and let us get into it.

FE: How does the system interpret the camera's digital signal?

Britton: The image is processed using code written in-house that incorporates a number of image-processing algorithms; that's central to our intellectual property. It's not based on a training paradigm in which the system looks for discoloration, blisters, foreign objects and other defects. A drop of grease doesn't look much different from a sesame seed to the camera, for instance, but a bun with grease on it has to be rejected. The image-processing algorithms have to make the distinction between foreign objects and seeds in real time. The general shape and diameter of 4 in. and 5 in. buns also are calculated to determine variance.
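
GTRI's algorithms are proprietary, so the sketch below is not their method. It only illustrates, with generic OpenCV 4.x operations, the kinds of measurements Britton describes: fitting the bun's outline to estimate diameter, then counting pixels conspicuously brighter (seed-like) or darker (grease- or blemish-like) than the crust. The pixels-per-inch calibration, the brightness offsets and the assumption that the bun is lighter than the belt are all invented for the example.

    import cv2
    import numpy as np

    PIXELS_PER_INCH = 50.0   # assumed calibration; depends on camera height and lens

    def inspect(frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # 1. Separate the bun from the (assumed darker) belt and fit a circle to it.
        _, bun_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(bun_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None                                  # no product in view
        bun = max(contours, key=cv2.contourArea)
        (_, _), radius_px = cv2.minEnclosingCircle(bun)
        diameter_in = 2.0 * radius_px / PIXELS_PER_INCH  # compare against the 4 in. or 5 in. spec

        # 2. Inside the bun, count pixels well above crust brightness (seed-like)
        #    and well below it (grease- or blemish-like).  The real system has to
        #    make that distinction far more robustly, and in real time.
        on_bun = bun_mask > 0
        crust = cv2.mean(gray, mask=bun_mask)[0]
        seed_pixels = int(np.count_nonzero((gray > crust + 40) & on_bun))
        dark_pixels = int(np.count_nonzero((gray < crust - 60) & on_bun))

        return {"diameter_in": diameter_in, "seed_pixels": seed_pixels, "dark_pixels": dark_pixels}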

One challenge in the project is getting enough out-of-spec product into the lab to refine the algorithms. The bakery has to maintain production and can't just burn 1,000 buns for us. When we move the system to the bakery's line, we'll be able to address the issue of sample size.

Doug Britton monitors readings of hamburger-bun quality dimensions while product on the conveyor line in the background passes under the digital cameras.
FE: How are system controls configured?

Britton: This originally was envisioned as a system of feedback loops that make recommendations to the operator on what needs to be changed: adjustments to the oven, proofer, the variable speed controls on conveyors and so on. The next step was to develop mid-level controls that would automatically adjust low-level set points by feeding back information on bun quality from the vision inspection.
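
As a minimal sketch of that mid-level idea, the fragment below trims an oven set point in proportion to the error in a crust-color score coming back from the vision system. The color score, the gain and the clamp limits are illustrative assumptions, not values from the GTRI controls.

    # Trim a low-level set point from vision feedback (illustrative values only).
    TARGET_COLOR = 0.60                  # desired crust-color score (assumed scale 0..1)
    GAIN_DEG_F = 25.0                    # set-point change per unit of color error (assumed)
    MIN_TEMP, MAX_TEMP = 380.0, 440.0    # safety clamp on the oven set point (assumed)

    def trim_oven_setpoint(current_setpoint_f, measured_color):
        """Return a new oven set point given the latest vision measurement."""
        error = TARGET_COLOR - measured_color              # positive means buns are too pale
        new_setpoint = current_setpoint_f + GAIN_DEG_F * error
        return min(max(new_setpoint, MIN_TEMP), MAX_TEMP)  # never leave the safe band

    # Pale buns (0.52 against a 0.60 target) nudge a 410 F set point up to 412 F.
    print(trim_oven_setpoint(410.0, 0.52))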

The highest level is intelligent controls and modeling using data from a plant-wide monitoring system. Data are collected from the vision system, oven PLCs, photo detectors for oven loads, an infrared camera that measures the surface temperature of buns exiting the oven and an oven M.O.L.E., a device with thermocouples that rides through the oven on a pan to record bun and ambient temperatures. A couple of approaches are being tried to model all the data: a physical model that simulates the oven's thermodynamics, and a data-driven model that combines the temperature profiles with the physical model and uses neural networks and fuzzy inference systems to calculate optimum set points. A lot of work remains to be done.
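
To give a flavor of the rule blending behind a fuzzy inference system, the stripped-down sketch below (an illustration, not the GTRI model) turns an exit-surface temperature into a recommended set-point trim; the membership breakpoints and trim sizes are invented for the example.

    def too_cool(temp_f, low=200.0, high=212.0):
        """Degree (0 to 1) to which the measured exit temperature is 'too cool'."""
        return min(max((high - temp_f) / (high - low), 0.0), 1.0)

    def too_hot(temp_f, low=218.0, high=230.0):
        """Degree (0 to 1) to which the measured exit temperature is 'too hot'."""
        return min(max((temp_f - low) / (high - low), 0.0), 1.0)

    def recommended_trim(exit_temp_f):
        # Rule 1: the more 'too cool', the more the oven set point rises (up to +5 F).
        # Rule 2: the more 'too hot', the more it drops (down to -5 F).
        return 5.0 * too_cool(exit_temp_f) - 5.0 * too_hot(exit_temp_f)

    print(recommended_trim(205.0))   # about +2.9 F: the reading is mostly 'too cool'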

This level of supervisory control of an oven has never been tried before, and we don't yet have a system we'd feel comfortable leaving alone in the bakery to supervise the process. For now, we're still modeling the decisions a baker would make and letting him verify the recommended action. Ideally, 10 years from now this would be completely automated.

FE: So many factors impact bread products beyond bake time and temperature. Do you feel comfortable you understand and can control those variables?

Britton: We've made progress, but baked goods are difficult. Sugar content, flour moisture, mixing conditions, even the temperature and humidity levels of the bakery's ambient air impact finished goods. When you add in the fact that there's an eight-minute delay in adjustments to the oven and each batch only lasts 15 minutes, predicting outcomes becomes a huge challenge.
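
The timing squeeze is easy to quantify. In the sketch below, the eight-minute oven response and the 15-minute batch come from Britton's numbers; the two minutes assumed for recognizing a drift is ours.

    BATCH_MINUTES = 15.0
    OVEN_RESPONSE_MINUTES = 8.0      # delay before an oven adjustment takes hold
    DETECTION_MINUTES = 2.0          # assumed time to be sure the product is drifting

    correctable = BATCH_MINUTES - DETECTION_MINUTES - OVEN_RESPONSE_MINUTES
    print(f"Minutes of the batch that can still benefit: {correctable:.0f}")
    print(f"Share of the batch: {correctable / BATCH_MINUTES:.0%}")

Under those assumptions, only about the last third of a batch could benefit from a purely reactive correction.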

FE: System reliability is critical, of course, but how affordable is the inspection system likely to be?

Britton: It's not our mission to establish system cost; companies that license the technology will do that. But one of the stated objectives of the project was to design a system that could screen all the buns on a high-speed line at an affordable cost. This system doesn't require any special conveying equipment or a special background; it can be added to any production line.

The lab system runs on Windows for convenience, but we're looking at using a Linux system for the commercial version to minimize licensing fees and lower system costs. The limiting factor with Linux, however, is the lack of tools.

Cost was a consideration in other decisions, too. LED lighting could be used for the cameras, but LEDs add cost and can be difficult to replace. We used compact fluorescents because replacement bulbs are readily available, and they represent a $10 solution for maintenance purposes. The human eye doesn't notice it, but fluorescent tubes lose their light intensity over time. We want to encourage proper maintenance by driving down the cost of performing it.
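
One generic way a vision system can ride out that gradual dimming (the interview does not say whether GTRI does this) is to keep a white reference patch in each camera's view and rescale every frame so the patch reads as it did when the tubes were new; the patch location and target brightness below are assumptions.

    import numpy as np

    REFERENCE_BRIGHTNESS = 230.0     # patch reading recorded when the lamps were new (assumed)

    def normalize_frame(gray_frame, patch_rows, patch_cols):
        """Rescale a grayscale frame so the reference patch matches its day-one reading."""
        patch_now = float(gray_frame[patch_rows, patch_cols].mean())
        gain = REFERENCE_BRIGHTNESS / max(patch_now, 1.0)    # guard against a dead frame
        return np.clip(gray_frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)

    # e.g. corrected = normalize_frame(frame, slice(0, 20), slice(0, 20))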