MIT: Robot hand for the potato chip world
New York, 3 June 2020
Two new tools built by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) can be considered a breakthrough in the field of soft robotics: robots that use soft, flexible materials instead of traditional rigid components.
Soft robots of this kind are inspired by living organisms and offer numerous advantages thanks to their versatility: they can handle objects far more gently than their rigid counterparts. Until now, however, they have lacked the ability to perceive which objects they are interacting with. The MIT researchers set out to solve this problem by equipping their robots with various sensors, cameras and software that enable them to “see and classify” a range of objects.
Soft robot hands have sensorized skins that allow them to pick up a range of objects, from delicate ones, like potato chips, to heavy ones, like milk bottles.
The first robot was built in 2019 in joint research at MIT and Harvard University, where a team developed a cone-shaped robot gripper. It worked by folding in on itself, similar to a Venus flytrap, and could pick up awkwardly shaped objects of up to 100 times its own weight.
By adding tactile sensors, the robot was able to recognize what it was picking up and adjust the applied pressure accordingly. The sensors identified the 10 objects used in the experiment with an accuracy of more than 90 percent.
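The article does not describe the classification method the team used, but the idea of recognizing an object from tactile readings and then choosing a grip pressure can be illustrated with a minimal sketch. Everything here is hypothetical: the sensor values, the object labels, the nearest-centroid classifier and the pressure table are illustrative assumptions, not details from the MIT study.

```python
import math

# Hypothetical training data: each grasp yields a vector of tactile
# sensor readings (fragile objects press back weakly, heavy ones strongly).
# Labels and values are illustrative, not from the MIT experiments.
TRAINING = {
    "potato_chip": [[0.10, 0.20, 0.10, 0.15], [0.12, 0.18, 0.11, 0.14]],
    "milk_bottle": [[0.90, 0.85, 0.95, 0.90], [0.88, 0.90, 0.92, 0.87]],
}


def centroid(samples):
    """Average the sample vectors component-wise."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]


CENTROIDS = {label: centroid(s) for label, s in TRAINING.items()}


def classify(reading):
    """Return the label whose centroid is closest (Euclidean) to reading."""
    return min(CENTROIDS, key=lambda label: math.dist(reading, CENTROIDS[label]))


def grip_pressure(label):
    """Hypothetical pressure policy: a light touch for fragile objects."""
    return {"potato_chip": 0.05, "milk_bottle": 0.80}[label]
```

A fresh tactile reading such as `classify([0.11, 0.19, 0.10, 0.15])` would map to `"potato_chip"`, and `grip_pressure` would then select a correspondingly gentle grip.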
“Unlike many other soft tactile sensors, ours can be quickly manufactured and retrofitted into grippers, and they demonstrate sensitivity and reliability,” said Josie Hughes of MIT, lead author of a paper on the sensors. “We hope this soft sensing method will prove useful for a variety of applications in manufacturing environments, such as packaging and lifting,” Hughes said.
A second robot used an innovative “GelFlex” finger, which uses a tendon-driven mechanism and a series of sensors to provide “more nuanced, human-like senses”.
The team now plans to fine-tune the scanning algorithms and introduce more complex finger configurations, such as twisting.