Tech update: Don't fence them in
Some robots find new freedom as they become aware of their surroundings and act accordingly.
For good reasons, robots have been kept behind safety fences as they perform jobs that are potentially dangerous or back-breaking for humans. Whether it’s spray painting, slicing animal carcasses or stacking 1.5-liter cases of water bottles on pallets, these robotic applications are typically fenced to protect workers from being knocked in the head by a robot doing its job, being hit by product dropped from the robot’s grippers or entering a dangerous environment.
Today, some robots are becoming “un-caged,” and they have found new freedom, working collaboratively with humans. This is largely possible due to new sensors and vision systems that provide sensory inputs, as well as high-performance processors and safety systems that allow robots and humans to work more closely together. However, some kinks still need to be worked out, such as robotic performance issues (typically collaborative applications are not as fast as fixed-robotic systems) and the robots’ ability to survive washdown situations.
The changing role of robots
Until the last couple of years, the role of robots on assembly and packaging lines was pretty well defined, hadn’t changed very much and, in many applications, was not expected to change any time soon. “Robots are integrated to minimize human intervention and improve the productivity of the packaging process,” says Eric Aasen, vertical product line manager, Bosch Packaging Technology. “For the time being, we do not perceive a trend toward robot-human interaction solutions for our key industries.”
While Stäubli Robotics does offer a collaborative robot with a full SIL3 safety-rated system, Bob Rochelle, food & packaging segment manager, agrees the role of robots is limited. “To date, we are not seeing a major increase in demand for collaborative robotics in the food and beverage segment,” Rochelle admits.
However, a gradual change is in the works, and suppliers are getting ready for it. Aasen notes, “In general, the factory of the future requires machines and robots that can flexibly take over from stationary manufacturing systems along the entire production chain. At the same time, the ability to interact directly with humans is important.”
Bosch’s APAS family of mobile production assistants has been specifically developed for smart factories as defined in Industry 4.0. All three models collaborate directly with human beings and support technical staff with critical process steps and monotonous or messy jobs. The APAS assistants have been tested in many Bosch facilities throughout the world and are also available for external customers.
FANUC America showed its Model CR-35iA collaborative six-axis robot at PACK EXPO 2015. The robot works alongside humans and can move a 35kg (77-lb.) payload without the need for safety fences. “The highly sensitive robot gently stops if it comes into contact with an operator, allowing it and the human to work side by side,” says Greg Bell, product manager.
In April 2015, ABB introduced its YuMi collaborative, dual-arm robot, which can operate in very close collaboration with humans. Intended for assisting with small parts assembly, the robot can pause its motion within milliseconds if it senses a collision with a coworker. The robot has a lightweight, rigid magnesium skeleton covered with a floating plastic casing wrapped in soft padding to absorb impacts.
In 2012, Rethink Robotics created a two-arm collaborative robot named Baxter. Subsequently, it has developed a one-arm, high-performance collaborative robot called Sawyer, according to Jim Lawton, chief product and marketing officer. While Baxter was designed for material handling, packaging and other tasks where two arms are required, Sawyer can be used for machine tending, circuit board testing and other precise tasks.
“Schneider Electric uses Baxter to test circuit breakers, moving and orienting parts from the assembly line into testing cells and triggering the testing process,” Lawton explains. Baxter also collaborates with production associates to audit product quality for industrial and OEM applications.
This fall, GE will deploy its first Sawyer robot in a Hendersonville, NC lighting plant, where it will be on a production line positioning parts into a light fixture as a GE employee completes the assembly. “In the food and beverage industry, our robots are often packaging or handling parts on the same product line as human employees,” Lawton says. The robots handle changes and normal fluctuations that are inherent on most lines, for example, a part that is not consistently placed in the same location.
What about food and beverage?
“Today’s collaborative robots are relatively slow and not sanitary, so they are used at the end of lines or in environments that are not subject to washdown,” states Craig Souser, president of JLS Automation. “But they don’t easily handle bottles, insert items into carriers or add a non-food item into a case. Collaborative robots with 3-D sensing could address [these applications].”
Collaborative applications require a human to be in the area, so a standard robot should be employed where guarding and higher speeds or heavier payloads are involved, according to John Schwan, QComp Technologies Inc. VP of sales & marketing.
Depending on the application, the seeming disadvantages of a collaborative robot might actually be its advantages. For example, collaborative robots readily learn repetitive motions because they do not work at high speeds, says Laura Studwell, Omron Robotics food, beverage & packaging industry marketing manager. “Robotic integration on the processing side has increased as robots offer higher IP ratings for food safety as well as grippers that can handle delicate applications.”
QComp is a preferred provider for the ABB YuMi robot. The robot isn’t used directly in regulated food applications, but according to Schwan, “the food industry is in general an untapped market for collaborative robots. Many applications are being considered in this area.”
“The key is to find the right applications,” says Ken McLaughlin, JMP Automation general manager, automation. “There are, however, a myriad of variants between traditional fenced robotic applications and purely collaborative robotic applications. More powerful safety software and sensor technology allow a traditional robot to be used in ways that often achieve the same benefit as that of a purely collaborative robot. Things like fenceless systems and close proximity through robot safety software allow traditional robots to do what they never could in the past.”
“The other paradigm is a better understanding of the strengths and weaknesses of both robotics and humans,” adds McLaughlin. “Robots are consistent and reliable, but unable to think and improvise like a person. The best manufacturing processes are designed to incorporate the consistency and throughput of robotics and automation, combined with people to think, problem solve and handle any variability that comes up.”
“We’ve seen applications that involve collaborative robots, which involve slower cycle times, lower precision and payloads, and not a lot of reach,” notes Kevin Ackerman, JMP Automation controls specialist. Applications such as dessert decorating and gluing have been predominant. “The key to recognizing whether a collaborative robot will benefit an application is asking a simple question: ‘Does the robot need to move while a person is beside it?’”
When it comes to end-of-line packaging and palletizing, collaborative robots aren’t likely to be used due to the required production rates, says Mark Langenfeld, Spiroflow robotic automation specialist. “However, interest in collaborative robots for low-speed machine loading and unloading applications is growing.”
Rethink Robotics’ Lawton believes collaborative robots are a good match for several food industry packaging and material handling applications. “We have customers that leverage our robots to package and count cups, package food products and more. We work together with quite a few contract packagers in this space, for whom the robots’ ability to move from one job to the next quickly and easily is paramount.”
The types of arms used in palletizing cells don’t lend themselves to using collaborative robotic applications, says Matt Wicks, Intelligrated vice president, product development, manufacturing systems. “This is due to the speeds and weights required for these applications. With that said, low-speed item picking and packing do lend themselves to collaborative applications. They have the potential to bridge the gap between a manual process and a highly automated solution.”
Another application for collaborative robots is what might be called “go fetch.” For example, an automated guided vehicle (AGV) can retrieve goods or ingredients from an AS/RS warehouse and deliver them to operations personnel, says Don Heelis, Cimcorp Automation Ltd. sales manager, tire and AGV systems. The company’s autonomous delivery and manipulation AGV, also known as ADAM (Autonomous Delivery and Manipulation), works alongside and with employees on the manufacturing floor.
Improving robot brains and technology
While most hardware currently used in systems is commodity (e.g., a quad-core Intel Core i7 CPU, memory and video or network controllers), there is no question that software plays the biggest role in allowing robots to perform faster, adapt to their surroundings and operate more safely around people.
“The software is the big breakthrough area that allows these units to be safe and simple to deploy,” says JLS Automation’s Souser. “This [breakthrough] should spill over into guarded applications, simplifying the engineering effort required for development/commissioning.”
The advances in components and software to comply with safety levels like SIL/PL allow the robots to be designed to work more closely with humans, says Stäubli’s Rochelle. “For example, our system includes a safe zone around the robot and tooling that prevents crashes and protects against damaging the robot tooling, other machine components and the products being processed.”
Improved robot safety systems, along with increased reliability of safety scanner technology, have opened up a huge “semi-collaborative” subset of applications, says JMP Automation’s McLaughlin. “In these applications, a traditional industrial robot can often be applied in ways that offer the same benefits as a purely collaborative robot, but with a higher payload and speed at a lower cost.”
“Advancements in software development have significantly changed how we design robotic systems,” says Spiroflow’s Langenfeld. “This software development has enabled us to produce smaller, safer automation cells and safe zones in the robot work area. An example of this would be a palletizing system with a pallet on either side of the robot. We can create a safe zone that will allow an operator to enter the robot cell and remove one pallet while the robot palletizes on the other.”
A seemingly unlikely source of intelligence and programming comes from the open source community. “Open source robotics software such as ROS [robot operating system] and ROS-Industrial has provided the framework for many of the recent robotic advances,” says Intelligrated’s Wicks. “It has established an effective way for the various pieces of robotic software to interface in a standardized way and common environment. Universities and research organizations have been able to advance the technology. These advances are now starting to become more mainstream in the form of integrated machine vision and advanced motion control.”
Other enhancements also have improved performance. For robotic applications where speed is the major criterion (e.g., pick and place), decreasing the weight of robotic modules translates into increased pick speeds, says Bosch’s Aasen. Lighter robotic machine modules, designed with 60 percent fewer components, can be placed in a series to form high-volume picking lines with speeds up to 450 products per minute. Hygienic, open-frame designs with fewer parts not only provide better visibility of moving elements, they also help with maintaining cleanliness and meeting food safety regulations.
For truly mobile robots, a source of reliable energy is needed, and that must be provided by batteries. “Battery technology advancements driven by the automotive industry are benefiting robots,” says Cimcorp’s Heelis. “Smart batteries that can handle a wide range of charging profiles and maintain life capacity over several years give flexibility to robot control systems. Fleets of robots can replenish battery charges and be kept charged by using ‘opportunity charging’ when idling/loading/unloading.” This obviates the need for robots to be charged offline every few hours.
Improving a robot’s sense of space and place
LiDAR (light detection and ranging), which typically uses lasers and is capable of 3-D distance determinations, and high-speed vision systems allow a robot to see its surroundings. “These systems can augment a robot’s collaborative status,” says Rochelle. Depending on the application, these technologies also may improve the robot’s ability to recognize product and make decisions on it, he adds.
“Performing a 3-D scan of the robot work area is an important emerging technology,” says Spiroflow’s Langenfeld. “When we combine this with the ability to create safe zones, we can give robots the intelligence to monitor a human working or moving in the work area. If the human gets too close, the robot can slow down and stop if necessary.”
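The slow-then-stop behavior Langenfeld describes can be sketched as a speed-scaling rule keyed to the closest detected person's distance from the robot. This is a minimal illustration; the thresholds and the linear ramp are assumptions, not values from any vendor or safety standard:

```python
def speed_scale(distance_m, stop_dist=0.5, slow_dist=1.5):
    """Return a speed multiplier (0.0 to 1.0) based on the closest
    detected human's distance from the robot, in meters.

    Inside stop_dist the robot halts; between stop_dist and
    slow_dist the speed ramps up linearly; beyond slow_dist the
    robot runs at full speed. Thresholds are illustrative.
    """
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= slow_dist:
        return 1.0
    return (distance_m - stop_dist) / (slow_dist - stop_dist)
```

A motion controller would evaluate this every scan cycle against the 3-D sensor's nearest-obstacle reading and multiply the programmed trajectory speed by the result.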
Applying a complement of sensors, in combination with advanced algorithms for path planning and obstacle avoidance, allows robots to share pathways and working areas safely with humans, says Cimcorp’s Heelis. “Plus, advancements being made in sensors will further enhance a robot’s awareness of its position and orientation in three-dimensional space.”
“Sensors allow eliminating or minimizing traditional machine fencing, providing better visualization on the plant floor,” says JMP Automation’s Ackerman. “Plant floor real estate is better utilized because equipment can be packed closer together. And, most importantly, more intuitive systems make it easier for a person to work with a robot and quickly troubleshoot issues.”
Advances in vision technology have been a key ingredient in making robots safer, says Kyle Kidwell, technical marketing engineer, machine vision technology, Keyence Corporation of America. “Vision systems can be used for part location as well as in setting up pick-and-place applications to speed up part sorting and packaging.
“Vision systems also can convert the pixel world provided by a camera into the real-world coordinate system of the robot,” adds Kidwell. “This is critical as vision systems can help account for the offset of the robot’s end-of-arm tooling for accurate visual guidance.”
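The pixel-to-real-world conversion Kidwell mentions amounts to a calibrated coordinate transform. A minimal sketch, assuming a fixed overhead camera and a planar work surface, fits a 2-D affine map from three calibration points whose pixel and robot coordinates are both known; a real system would also fold in lens distortion and the end-of-arm tooling offset:

```python
def fit_affine(pixel_pts, world_pts):
    """Fit a 2-D affine map (pixel -> robot coordinates) from three
    non-collinear calibration point pairs, solved via Cramer's rule.
    Returns (a, b, tx, c, d, ty) such that:
        X = a*u + b*v + tx
        Y = c*u + d*v + ty
    """
    (u1, v1), (u2, v2), (u3, v3) = pixel_pts
    det = u1 * (v2 - v3) - v1 * (u2 - u3) + (u2 * v3 - u3 * v2)
    if abs(det) < 1e-12:
        raise ValueError("calibration points are collinear")

    def solve(w1, w2, w3):
        # Cramer's rule for a*u + b*v + t = w at the three points
        a = (w1 * (v2 - v3) - v1 * (w2 - w3) + (w2 * v3 - w3 * v2)) / det
        b = (u1 * (w2 - w3) - w1 * (u2 - u3) + (u2 * w3 - u3 * w2)) / det
        t = (u1 * (v2 * w3 - v3 * w2) - v1 * (u2 * w3 - u3 * w2)
             + w1 * (u2 * v3 - u3 * v2)) / det
        return a, b, t

    (X1, Y1), (X2, Y2), (X3, Y3) = world_pts
    a, b, tx = solve(X1, X2, X3)
    c, d, ty = solve(Y1, Y2, Y3)
    return a, b, tx, c, d, ty

def pixel_to_world(params, u, v):
    """Map a camera pixel (u, v) to robot coordinates (X, Y)."""
    a, b, tx, c, d, ty = params
    return a * u + b * v + tx, c * u + d * v + ty
```

In practice the three calibration pairs come from jogging the robot's tool tip to marked points that the camera can also locate; once fitted, every detected part position can be handed to the robot in its own frame.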
High-speed machine vision can be integrated with robotics to allow a robot to be aware of its environment and adapt accordingly, enabling advanced functions for robotic motion, says Intelligrated’s Wicks. As people or objects interact with the robot, machine vision makes it aware of their presence and allows it to adapt its motion paths to accomplish the assigned task.
“Improved robot controls and vision systems aid in the collaborative area,” says QComp’s Schwan. “The human hand and eye are very difficult to replicate in automation in a cost-effective way. However, incremental steps have been made in improving 3-D image processing and inspection tools, allowing some products to be handled robotically. Force feedback on robots or proximity sensors and area scanners allow robots to be placed in areas without fences since the scanners provide feedback to slow or stop the robot if the designated area is breached.”
“At Rethink, we’ve incorporated advanced, force-sensing capabilities that prompt our robots to stop whenever they make human contact during operation,” says Lawton. “This feature is made possible by high-resolution force detection throughout their joints, coupled with our elastic actuators, which have some built-in flexibility and ‘give’ like a human arm. In addition, Rethink’s patented robot positioning system [RPS] technology enables Baxter and Sawyer to work in real-world manufacturing environments. The robots can adjust to slight bumps and misplaced parts, without stopping or needing to reprogram.”
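Force-based contact detection of the kind Lawton describes can be sketched as a per-joint comparison of measured torque against model-predicted torque. The threshold and the simple any-joint rule below are illustrative assumptions; Rethink's actual implementation is proprietary:

```python
def contact_detected(measured_torques, expected_torques, threshold=2.0):
    """Flag unexpected contact when any joint's measured torque
    deviates from the dynamics-model prediction by more than a
    threshold (in newton-meters). The threshold is illustrative;
    a real controller would tune it per joint and filter noise.
    """
    return any(abs(m - e) > threshold
               for m, e in zip(measured_torques, expected_torques))
```

On a real arm this check would run at the servo rate, and a positive result would trigger an immediate protective stop.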
Teaching the robot
Teaching or programming robots has become much easier for system integrators and end-users. “Kuka Robotics has provided significant advancements in this area,” says Spiroflow’s Langenfeld. Its robots can be completely programmed through the use of a PLC, allowing the average plant maintenance person to program and troubleshoot a system. “When it comes to machine loading applications, most collaborative robots are programmed in a fashion similar to the first industrial robots offered in the ‘60s and ‘70s,” Langenfeld continues. “It is a simple method: The operator moves the robot to a position, presses a record button, moves the robot to the next position, flips a switch on or off to control a device and presses the record button again. It is very simple.”
This programming process is what QComp’s Schwan calls “lead-through.” “With this, the user can move the robot manually, hit a button to teach that point, open or close a gripper, etc. This is helpful in very simple, non-precision tasks.”
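The lead-through workflow Langenfeld and Schwan describe (jog, record, toggle the gripper, repeat) can be sketched as a simple waypoint recorder. The class and its pose format are hypothetical; a real controller would stream these steps out as motion commands rather than just store them:

```python
class LeadThroughTeacher:
    """Minimal sketch of 'lead-through' programming: the operator
    jogs the robot to a pose, presses record, optionally toggles
    the gripper, and repeats. Playback returns the taught program.
    """

    def __init__(self):
        self.steps = []

    def record_pose(self, pose):
        # pose: e.g. (x, y, z) of the tool center point, captured
        # when the operator presses the record button
        self.steps.append(("move", pose))

    def record_gripper(self, close):
        # Log a gripper open/close action at this point in the cycle
        self.steps.append(("gripper", "close" if close else "open"))

    def playback(self):
        # A real cell would send each step to the motion controller;
        # here we simply return the stored sequence.
        return list(self.steps)
```

Usage mirrors the operator's actions: `record_pose` at the pick point, `record_gripper(True)`, `record_pose` at the place point, `record_gripper(False)`, then `playback` runs the cycle.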
But programming a robot is just a piece of the total system pie. Sometimes, an expert is needed because some integration is involved, Schwan continues. “Factors not often considered are the integration of the robot with other equipment or material handling conveyors, vision targeting or inspections, changing SKUs, etc. Tying these pieces together, typically on a PLC, keeps this [end-user programming] from being practical. What happens with an error? How does the system get reset or recover from errors? These are things a non-programmer may not be able to resolve.”
“Robots have really evolved into simple, proven technologies,” says Darryl King, JMP Engineering branch manager, Cambridge. “Anyone who can operate an iPhone or Xbox can program a robot. Often, our systems are even recipe based, allowing an operator to enter parameters and set up the robot without ever picking up a pendant.”
“You don’t need a degree in robotics to make a smart collaborative robot work for you,” contends Rethink Robotics’ Lawton. “The best person to train the robot is one who actually does the work.” Rethink uses a train-by-demonstration model where employees without an engineering or programming background show Baxter or Sawyer how to perform a task on the factory floor. “This [training module] opens up a whole new opportunity for small and mid-sized manufacturers that don’t have the budget for a system integrator or IT team to deploy traditional robots,” concludes Lawton.
Robots for everyone?
But are robots affordable for small or mid-sized food and beverage processors? “They are absolutely affordable for these processors,” says King. “Over the past 10 years, the reliability of robotics has dramatically improved. At the same time, the cost has been drastically reduced. Many operations use a loaded cost of $15/hour for labor. We are competitive in these applications, with ROIs in the two-year range.”
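King's two-year ROI claim can be sanity-checked with simple arithmetic. The $15/hour loaded labor rate comes from the article; the cell cost, shift schedule and working days below are illustrative assumptions:

```python
def payback_years(cell_cost, labor_rate=15.0, hours_per_shift=8,
                  shifts_per_day=2, days_per_year=250):
    """Rough payback estimate: cell cost divided by the annual labor
    cost it displaces. The $15/hour loaded rate is from the article;
    the schedule and any cell cost passed in are assumptions.
    """
    annual_savings = (labor_rate * hours_per_shift
                      * shifts_per_day * days_per_year)
    return cell_cost / annual_savings
```

At $15/hour, a two-shift operation displaces about $60,000 of labor per year, so a hypothetical $120,000 cell would pay back in roughly two years, consistent with the ROI range King cites.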
According to Stäubli’s Rochelle, the most important concerns when purchasing a robot are whether it will fit in with your food safety requirements and meet government regulations. “Robots in the food processing industry cannot just be automotive models with FDA-approved paint, food-grade lubrication or a cover. They must be designed with cleaning and sanitation in mind.” In addition, processors must consider their applications’ attributes or specific needs and look for robots with design criteria for food and beverage as part of their DNA.