Robotic motion is becoming commonplace in end-of-line packaging systems, but applications in the processing area of food and beverage plants remain few and far between. Development challenges are particularly acute in protein processing: beef, pork and poultry. Natural raw-material variation is accentuated with livestock, and sanitary and food-safety requirements put added stress on automation equipment. Fortunately for the industry, some technical centers exist to tackle nettlesome projects like robotic processing. One example is the Food Processing Technology Division of Atlanta’s Georgia Tech Research Institute (GTRI).
An intelligent deboning system for poultry is on the brink of validation, the culmination of several years of research and development. The project represents a public-private effort, with Pilgrim’s Pride Corp., Tyson Foods Inc. and Wayne Farms LLC collaborating with the university-affiliated researchers. Falling prices for robotic arms and powerful FPGA (field-programmable gate array) computer processors help make robotic deboning feasible, but modeling the motion of cutting required a dedicated team of highly trained engineers.
Heading the intelligent deboning research team is Gary McMurray, principal research engineer and chief of the Food Processing Technology Division at GTRI. A graduate of the Georgia Institute of Technology with BS and MS degrees in mechanical engineering, McMurray has been with GTRI for 20 years, developing new technology for food processing, biomedical devices and unmanned systems. Before joining the institute, he served as president and founder of Quanta, a robotic technology firm that successfully developed a NASA design for the teleoperation of Space Station Robotic Platforms. The technology later was licensed for commercial systems.
FE: When did the intelligent deboning project get underway?
McMurray: Some of the initial work goes back at least 10 years, when the poultry industry was complaining about yield and ergonomic issues associated with the cutting process. None of the early solutions were practical, however, and it took the team several years to conclude that the issues were serious enough, and the solutions complex enough, that equipment companies could not solve them on their own. Equipment companies do a fantastic job, but they deal primarily with fixed automation solutions: there is some manual input to adapt to significant product variations, and then the machine just runs. This challenge required adaptive equipment that could use sensor inputs to change the trajectory of a blade cutting through biomaterial.
FE: How much of the deboning process did you seek to automate?
McMurray: We’re focused on making the initial incision, cutting from the clavicle to the shoulder joint, through the shoulder joint and down along the scapula. The cut from the clavicle to the shoulder and the cut along the scapula are important to maximizing yield.
On a manual cone line, each worker makes one cut, and by the end of the line, the job is done. Line speed is one bird every two seconds. We have to make the initial incision in less than one-tenth of a second and identify where the meat, tendon and bone are quickly enough to complete the cut through the shoulder in one-seventh of a second.
FE: Are high-throughput plants still dependent on manual processes for deboning?
McMurray: Automation solutions have been developed, but all came with a cost in yield. In some cases, the additional loss was 5 percent, depending on the worker’s skill. Processors were willing to tolerate those losses in good times, when labor scarcity made automation a more attractive alternative. But labor is readily available today, so the focus is back on yield.
Breast meat commands the highest commercial price, of course, and each 1 percent loss in yield represents $2 to $3 million for a plant. In Georgia, there are about 20 processing plants, so reducing yield loss would represent a significant savings for such a commodity product.
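The dollar figures McMurray cites can be turned into a rough statewide estimate. The sketch below uses only the numbers from the interview (a $2 to $3 million cost per plant for each 1 percent of yield loss, and roughly 20 Georgia plants); it is back-of-the-envelope arithmetic, not a published industry calculation.

```python
# Rough yield-loss arithmetic using the figures quoted in the interview.
# Assumption: each 1% yield loss costs a plant $2-3M per year,
# and Georgia has about 20 processing plants.
LOSS_PER_POINT_LOW = 2e6   # dollars per plant per 1% yield loss (low end)
LOSS_PER_POINT_HIGH = 3e6  # dollars per plant per 1% yield loss (high end)
PLANTS = 20

def statewide_cost(yield_loss_pct: float) -> tuple[float, float]:
    """Return (low, high) annual dollar cost across all plants."""
    return (yield_loss_pct * LOSS_PER_POINT_LOW * PLANTS,
            yield_loss_pct * LOSS_PER_POINT_HIGH * PLANTS)

low, high = statewide_cost(5.0)  # the 5% loss cited for early automation
print(f"${low / 1e6:.0f}M to ${high / 1e6:.0f}M per year")  # → $200M to $300M per year
```

At the 5 percent loss attributed to early automation, that works out to $200 to $300 million per year across the state's plants, which is why even a fraction of a point of recovered yield matters.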
The human benefits shouldn’t be minimized. Making this cut is one of the worst ergonomic tasks in the plant.
FE: Do you rely on a vision system to identify the point of incision and blade trajectory?
McMurray: We’re trying to find some very specific points on the bird, and that requires a lot more technology than standard image-processing techniques. The system’s image-processing cell locates three key points on the bird; an algorithm then predicts the initial cutting point and the internal structure of the joint from those points. The position of the joint varies by plus or minus 6 mm from bird to bird, a range of almost half an inch. Additionally, the bird is mounted on a cone that can be rotated or tilted, so the range of possible positions is even greater.
Once the blade penetrates the meat, distinguishing between meat, bone and tendon is critical to avoid creating a food safety issue with bone chips. It would be relatively simple to establish a force threshold of, say, 5 lbs., and if that level is exceeded, you know you’re in bone. However, by that time you could be so deep into the bone that bone chips are almost inevitable. Our goal is to detect bone at first contact. At that point, an algorithm is required to guide the blade around the bone. In lab trials, we barely scratch the bone.
In effect, we have engineered an individual motion-control mechanism that provides the flexibility to adapt to every bird that comes down the line and account for its variation, as well as the unique placement of the bird on the cone.
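The distinction McMurray draws, triggering on the first contact with bone rather than on an absolute force level, can be illustrated with a simple sketch. GTRI has not published its actual detection algorithm; the stand-in below watches the rate of rise of a sampled blade-force signal, and the sample rate, example values and slope threshold are all assumptions for illustration.

```python
# Illustrative sketch: flag bone contact from a sampled blade-force signal.
# A fixed absolute threshold (e.g., 5 lb) fires only after the blade is deep
# in bone; watching the force's rate of rise can flag the first contact.
# Sample rate, slope threshold and signal values are assumed, not GTRI's.

def first_contact_index(forces, dt=0.001, slope_threshold=200.0):
    """Return the first sample index where force rises faster than
    slope_threshold (lb/s), or None if no such rise occurs."""
    for i in range(1, len(forces)):
        if (forces[i] - forces[i - 1]) / dt > slope_threshold:
            return i
    return None

# Meat offers a gentle rise; bone contact shows a sharp jump.
signal = [0.2, 0.3, 0.4, 0.5, 1.5, 3.0]   # lb, sampled at 1 kHz
print(first_contact_index(signal))         # → 4, the 0.5 -> 1.5 jump
```

Note that the jump is flagged while the absolute force (1.5 lb) is still far below a 5 lb threshold, which is the point of detecting bone at first contact rather than after penetration.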
FE: Were all these challenges being attacked simultaneously or sequentially?
McMurray: We actually were divided into two teams, one dealing with the cutting issues and the second team focused on image processing. Wayne Daley, a GTRI colleague, led the image processing team. They came up with the neat algorithm that enables us to predict the three key points on the bird to within 3 mm and provides the nominal trajectory of the blade as it moves through the joint. We’re now looking for a signature of what the bone looks like to trigger the force algorithm that guides the blade around the bone. Initial tests indicate we can accomplish this.
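The interview does not describe how the three key points map to a predicted cut point, so the details of Daley's algorithm remain unpublished here. As a purely hypothetical illustration of the general idea, one could express the cut point in barycentric coordinates of the three key points, averaging the weights over hand-labeled training birds; every name and value below is an assumption.

```python
import numpy as np

# Hypothetical sketch, NOT the GTRI algorithm: predict the initial cut
# point as a fixed barycentric combination of three visual key points,
# with the combination weights averaged over hand-labeled training birds.

def fit_weights(keypoints: np.ndarray, cut_points: np.ndarray) -> np.ndarray:
    """keypoints: (n, 3, 2), three 2-D key points per bird; cut_points: (n, 2)."""
    weights = []
    for K, c in zip(keypoints, cut_points):
        # Solve c = w0*k0 + w1*k1 + w2*k2 with w0 + w1 + w2 = 1.
        A = np.vstack([K.T, np.ones(3)])   # (3, 3): x-row, y-row, ones-row
        b = np.append(c, 1.0)
        weights.append(np.linalg.solve(A, b))
    return np.mean(weights, axis=0)

def predict_cut_point(weights: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Apply learned weights to a new bird's key points: (3,) @ (3, 2) -> (2,)."""
    return weights @ K
```

Because the weights are relative to the key points rather than to the frame of the camera, a scheme like this would tolerate the bird being shifted on the cone, though the real system must also handle rotation, tilt and the joint's internal variation.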
FE: What’s the next development step?
McMurray: By the end of June, we expect to have perfected and proved the three cornerstones of the work: tendon prediction, bone detection and force control. Then we can start tackling system design and factors such as monitoring the sharpness of the blade. After that, we can talk to industry about the next step to commercialization.
FE: Is the cell sufficiently robust to survive high-pressure washdown?
McMurray: There are robots being marketed as washdown-ready. They’ll survive a garden hose, but they will not survive a 1,000 psi washdown. We have developed a parallel robot with two degrees of freedom that can survive washdown in a meat or poultry processing plant, but to date, none are being used commercially. In the system’s final form, it will withstand high-pressure washdown.
FE: Can the algorithms deal with birds that deviate significantly from the norm: carcasses missing a wing, for example?
McMurray: Right now we’re focused on the core technology, rather than trying to address all the variations that might occur. When we reach the point where it is performing in a commercial plant, the solution might be to kick out the exceptions and process those manually. Even if they have to cut a few birds manually, most plants could live with that, provided robotics improved overall yield and reduced ergonomic issues.
FE: How useful was your experience with NASA in developing food industry robotics?
McMurray: NASA paid for my graduate school, but the space program always wants the man to be there, controlling the robot. That’s part of a different world now.
But NASA work taught me how to think about these types of challenges and to go after problems that no one has been able to solve before. There are a lot of smart people at an institution like GTRI, and if you get them focused on a particular challenge, you can solve some pretty amazing things.