When the robot reads from your facial features what you are thinking
13 May 2020
Thomas Roszak is a maintenance engineer in a huge warehouse in Hatfield. Robots were nothing new to him, but the fact that he had been chosen to work with ARMAR-6 filled him with pride, and he longed for the day he would finally face this wonder machine.
He wanted to know as early as possible what was in store for him; he wanted to be prepared. But he was told to be patient. A few hints were dropped, though not enough for a clear picture. Among other things, he was told that he would not have to teach the robot anything; rather, the robot would show him what it had learned. That fired the technician's imagination, but he still could not piece it together. Then suddenly the new colleague stood before him, smiling, with a friendly voice. Roszak was speechless: it was just like a human.
Then a manager explained the assignment for the two of them: repair and maintenance of the automatic sorting and packaging system that the online supermarket uses to compile customers' food orders. This often involves heavy physical labour. Roszak was strong, but the iron plate he had to carry to the other hall gave him a headache: it weighed several hundredweight.
Roszak was surprised when the robot offered to take over the transport. How did the machine know what needed doing? It explained: "I have been watching you, and I could tell from your facial features what you were thinking about."
It was a sensation; there had never been anything like it before, Roszak was sure of it: a robot that could interpret human facial expressions and then discuss them with its human partner. But back to the iron plate. "I'll take care of that," said ARMAR-6, and the human was amazed once again, because its movements were so quick. For the maintenance technician, one conclusion was inescapable: this was proof that in the future humans and robots will be able to solve difficult tasks together in a meaningful way.
ARMAR-6 was developed at the Karlsruhe Institute of Technology (KIT). The goal was to build a collaborative robot in which robotic arms and hands, vision systems and speech recognition are brought together with the help of AI, so that it no longer merely reacts to commands but anticipates how it can help its human partner.
For this purpose, the robot's AI was trained by observing humans performing tasks. "I would say that 80% of the robot's capabilities were actually learned through human demonstration," confirmed Prof. Tamim Asfour of KIT. It turned out that the AI could be taught much faster this way than by programming it from scratch with explicit instructions, Asfour explained.
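The article does not describe KIT's actual learning pipeline, but the idea of learning a motion from human demonstrations can be sketched in a toy form: record a few demonstrated trajectories, time-normalize them, and average them into one reference motion. All names and the averaging approach here are illustrative assumptions, not ARMAR-6's real method.

```python
import numpy as np

def resample(trajectory: np.ndarray, n_points: int) -> np.ndarray:
    """Linearly resample a (T, D) trajectory to n_points time steps."""
    t_old = np.linspace(0.0, 1.0, len(trajectory))
    t_new = np.linspace(0.0, 1.0, n_points)
    return np.stack(
        [np.interp(t_new, t_old, trajectory[:, d]) for d in range(trajectory.shape[1])],
        axis=1,
    )

def learn_from_demonstrations(demos: list[np.ndarray], n_points: int = 50) -> np.ndarray:
    """Average several time-normalized demonstrations into one reference trajectory."""
    return np.mean([resample(d, n_points) for d in demos], axis=0)

# Three noisy human demonstrations of the same 2-D reaching motion.
rng = np.random.default_rng(0)
base = np.linspace([0.0, 0.0], [1.0, 0.5], 40)
demos = [base + rng.normal(0.0, 0.01, base.shape) for _ in range(3)]
reference = learn_from_demonstrations(demos)
print(reference.shape)  # (50, 2)
```

Real demonstration-learning systems use far richer representations (for example, movement primitives), but the principle is the same: the robot's behaviour is extracted from recorded human motion rather than hand-coded.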
The team also wanted the robot to recognize speech and react naturally. Sebastian Stüker of KIT summarized what was then demonstrated as follows: "The breakthroughs in areas such as natural language interfaces and task comprehension will lead to a better acceptance of robots by humans and enable easier, more natural use."
To ensure that the robot can work side by side with a human, the team equipped it with a vision system in which five cameras track the movements of the human worker and identify objects such as tools.
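One way such a multi-camera setup can be combined is to fuse detections from all cameras and keep, per object label, the one with the highest confidence. This is a minimal sketch under that assumption; the five-camera count comes from the article, but the data structures and fusion rule are hypothetical, not ARMAR-6's actual vision system.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: int     # which of the five cameras produced this detection
    label: str         # e.g. "drill", "hand"
    confidence: float  # detector score in 0.0 .. 1.0
    position: tuple    # (x, y, z) in a shared workspace frame

def fuse_detections(detections: list[Detection]) -> dict[str, Detection]:
    """Keep the highest-confidence detection per object label."""
    best: dict[str, Detection] = {}
    for det in detections:
        if det.label not in best or det.confidence > best[det.label].confidence:
            best[det.label] = det
    return best

frames = [
    Detection(0, "drill", 0.62, (0.40, 0.10, 0.9)),
    Detection(3, "drill", 0.91, (0.41, 0.11, 0.9)),
    Detection(1, "hand", 0.88, (0.20, 0.30, 1.1)),
]
fused = fuse_detections(frames)
print(fused["drill"].camera_id)  # 3
```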
It was also trained to recognize how and when to apply its strength appropriately. "One of the requirements in the design of the arms was that they should detect collisions with the human body in time and stop immediately," Asfour added.
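The "detect a collision and stop immediately" requirement can be sketched as a simple control-loop check: if measured joint torques deviate from the expected torques by more than a threshold, treat it as unexpected contact and halt. The threshold value and the interfaces here are illustrative assumptions, not the arm's real safety logic.

```python
TORQUE_THRESHOLD_NM = 5.0  # assumed per-joint deviation that counts as a collision

def detect_collision(expected_torques, measured_torques, threshold=TORQUE_THRESHOLD_NM):
    """Return True if any joint's measured torque deviates beyond the threshold."""
    return any(
        abs(measured - expected) > threshold
        for expected, measured in zip(expected_torques, measured_torques)
    )

def control_step(expected, measured):
    """One cycle of the safety check: stop on suspected contact, else continue."""
    if detect_collision(expected, measured):
        return "STOP"
    return "CONTINUE"

print(control_step([1.0, 2.0, 0.5], [1.2, 2.1, 0.4]))  # CONTINUE
print(control_step([1.0, 2.0, 0.5], [1.2, 9.0, 0.4]))  # STOP
```

In a real arm this check runs at the control frequency, so contact is detected within milliseconds and the motion is stopped before significant force builds up.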
Experts such as Prof. Nathan Lepora, head of the tactile robotics group at Bristol Robotics Laboratory, were also full of praise: "This is a huge challenge. They are trying to reproduce skills that currently only humans possess: working alongside other humans. It required expertise in building humanoid robots, in online learning, where the robot teaches itself to do tasks, and in sophisticated computer vision to navigate and interact with the environment."