Robots can pack things very quickly, which is why they have become indispensable in distribution centers. But they are still not very smart, and humans, who can anticipate and think for themselves, can sometimes do the work much faster. The problem is that fewer and fewer people want to do this kind of work. That is why TU/e (Eindhoven University of Technology), together with two other universities, a knowledge institute and four companies, is starting a four-year research project into a packaging robot capable of replacing a human being for these kinds of tasks.
Robots are very good at performing routine tasks: you can program them to pick up objects and place them in a box or on a conveyor belt. There is one serious drawback, however. Because robots cannot foresee the effect of collisions, their arms currently stop just before picking up a package or placing it somewhere. This prevents an item from bumping into something too hard and being damaged.
People are much better at anticipating the effect of a collision, and that’s why they’re considerably faster. They are able to put an item on a conveyor belt in one swift, fluid movement. Or even throw it, without damaging the item, the conveyor belt or themselves.
Impact-aware learner robots
Researchers from TU/e, together with colleagues from the Technical University of Munich (TUM), the École Polytechnique Fédérale de Lausanne (EPFL), and the French research institute CNRS, are exploring how to create an impact-aware logistics robot. The idea is that these new robots can reliably predict the effect of a hard or soft collision with their environment or with other objects, and then capitalize on that collision to pick up and place items more efficiently. Alessandro Saccon, research leader on behalf of TU/e, expects this will enable the robots to work 10% faster.
The researchers want to accomplish this in a number of ways:
- by designing advanced models that can predict what will happen when the robot, the package or the conveyor belt collides with each other;
- by using artificial intelligence that teaches the robot how to move quickly without causing any damage;
- and by developing advanced sensors that give the robot ears and eyes so that it will know what is happening when a collision occurs.
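To give a sense of what the first of these means, here is a minimal, purely illustrative sketch (not the project's actual models) of the kind of physics an impact-aware planner has to evaluate: a 1-D rigid impact described by a coefficient of restitution, used to check whether a planned release velocity keeps the contact impulse below a damage threshold. All function names and the impulse limit are hypothetical, chosen for this example.

```python
# Illustrative 1-D impact model: a package of mass m hits a fixed
# surface (e.g. a conveyor belt) at speed v_in. The coefficient of
# restitution e characterizes the surface: e=0 is perfectly plastic
# (no bounce), e=1 perfectly elastic.

def post_impact_velocity(v_in: float, restitution: float) -> float:
    """Package velocity immediately after impact (sign flips on rebound)."""
    return -restitution * v_in

def contact_impulse(mass: float, v_in: float, restitution: float) -> float:
    """Impulse transferred during the impact, in N*s:
    J = m * (1 + e) * |v_in| (change in momentum across the impact)."""
    return mass * (1.0 + restitution) * abs(v_in)

def is_safe_drop(mass: float, v_in: float,
                 restitution: float, impulse_limit: float) -> bool:
    """Hypothetical planner check: does the planned release velocity
    keep the transferred impulse below a damage threshold?"""
    return contact_impulse(mass, v_in, restitution) <= impulse_limit

# Example: a 2 kg parcel released at 0.5 m/s onto a padded belt (e = 0.2)
# transfers 2 * 1.2 * 0.5 = 1.2 N*s of impulse.
print(contact_impulse(2.0, 0.5, 0.2))  # → 1.2
```

In practice the project's models are far richer (3-D contact, friction, compliant grippers, and learned corrections), but the same basic question — how hard is this impact, and is that acceptable? — sits underneath.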
Packing robots play a major role in logistics
“Impact-aware robots can play a major role in logistics,” says Saccon. “E-commerce is booming, plus it’s becoming increasingly difficult to find employees in a tight labor market. Moreover, work in distribution centers is often mind-numbing and bad for your health. Impact-aware robots can therefore offer a solution so that employees are freed up for other types of work.”
In this project, the researchers work closely with four companies specializing in logistics, automation, robot production and physical simulation: Vanderlande and Smart Robotics, both from the Netherlands, Munich-based Franka Emika, and the Swedish company Algoryx.
Impact Aware Manipulation
The research project is called I.AM, which stands for Impact Aware Manipulation. The European Union is supporting the project to the tune of € 4.4 million. More information about the project can be found on I.AM’s official site.
Cover photo: The I.AM research project examines three scenarios for the application of impact-aware robots, each more complex than the last. In scenario 1, a package is thrown onto a conveyor belt (TOSS). In scenario 2, a package is placed in a crate (BOX). And in scenario 3, a heavy package is removed from a pallet (GRAB). (Illustration: Visueeltjes)
Innovation Origins is an independent news platform with an unconventional revenue model. We are sponsored by companies that support our mission: to spread the story of innovation.