Robots Learn to Use the Force
Soft robotic grippers use ML to learn how to use just the right amount of force to grasp and manipulate virtually any tool.
Advances in robotics are profoundly reshaping the world around us in countless ways. Medical robots are helping surgeons to be more precise and reduce patient blood loss, and are also helping with post-op disinfection to improve patient outcomes. In manufacturing settings, robots are optimizing workflows to produce products at higher speeds and with greater consistency than was possible in the past. And even in our homes, robotic vacuums take care of our chores so that we have more time to do what we would like to do. But one common thread that runs through all of these use cases is that each robot is purpose-built for a very specific task. Even with all of the innovations brought forth in recent decades, we are still a long way away from a general purpose robot like Rosie the Robot from The Jetsons that can perform virtually any task requested of it.
One capability missing from present-day robots is the ability to grasp and manipulate arbitrary tools in the wild. Doing so is a prerequisite for performing a wide variety of tasks in a human-like manner, so a team of researchers at MIT partnered with a group at the Toyota Research Institute to work toward this important goal. They recently published their results on SEED, or Series Elastic End Effectors in 6D, a framework that pairs soft bubble grippers with a learning algorithm to exert precisely the right amount of force on a tool for its proper use. Continued development of this system could one day lead to a robot capable of manipulating any tool it happens across.
SEED utilizes a PicoFlexx IR depth camera mounted inside the robot's bubble gripper. While a tool is gripped, the position of the contact patch is estimated with a background subtraction algorithm. The system maps how the grippers deform over a six-dimensional space and uses this information to estimate the relative pose of the tool. Drawing on past experience, a learned model maps tool positioning to force measurements, and these measurements are then used to adjust gripper pressure to grip the tool just right, in a way that even Goldilocks would approve of.
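To make the sensing idea concrete, here is a minimal sketch, not the authors' implementation, of how a contact patch can be pulled out of depth images by background subtraction and how a compliant gripper's deformation can stand in for a force reading. The function names, the deformation threshold, and the single linear stiffness value are all illustrative assumptions.

```python
import numpy as np

def contact_patch(depth_ref: np.ndarray, depth_now: np.ndarray,
                  threshold_m: float = 0.002) -> np.ndarray:
    """Background subtraction: pixels where the bubble surface has moved
    more than `threshold_m` meters toward the camera count as contact."""
    deformation = depth_ref - depth_now          # positive where the bubble is pushed in
    return deformation > threshold_m             # boolean contact mask

def estimated_normal_force(depth_ref: np.ndarray, depth_now: np.ndarray,
                           mask: np.ndarray, stiffness_n_per_m: float = 300.0) -> float:
    """Series-elastic idea: infer force from how much the compliant bubble
    deforms, here with a single assumed linear stiffness (illustrative only)."""
    if not mask.any():
        return 0.0
    mean_deflection = float(np.mean((depth_ref - depth_now)[mask]))
    return stiffness_n_per_m * mean_deflection

# Toy usage with synthetic depth images (in meters).
ref = np.full((120, 160), 0.10)                  # undeformed bubble surface at 10 cm
now = ref.copy()
now[40:80, 60:100] -= 0.004                      # a 4 mm indentation from a grasped tool
mask = contact_patch(ref, now)
print("contact pixels:", int(mask.sum()))
print("estimated normal force (N):", round(estimated_normal_force(ref, now, mask), 2))
```

The real system estimates a full six-dimensional deformation and tool pose rather than a single scalar deflection, but the principle is the same: the softness of the gripper turns geometry seen by the camera into a usable force signal.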
The team implemented their system on a robotic arm and put it through its paces in a series of trials. In one scenario, the robot performed the deceptively difficult task of squeegeeing liquid across a flat surface. The force applied to the squeegee must change rapidly in real time as it glides over the surface: with too much or too little pressure, the tool is either ineffective or falls out of the gripper. SEED performed quite well on this task, whereas baseline methods struggled to get it right.
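As a rough illustration of the kind of real-time force regulation the squeegee task demands, here is a minimal sketch, assumed rather than taken from the paper, of a simple proportional-integral loop that nudges the commanded force toward a target each control cycle. The gains, limits, and the toy first-order "contact dynamics" are hypothetical placeholders, not the authors' controller.

```python
def pi_force_step(target_n: float, measured_n: float, integral: float,
                  kp: float = 0.8, ki: float = 0.2, dt: float = 0.01):
    """One PI update: returns (new force command in newtons, updated integral term)."""
    error = target_n - measured_n                  # too little force -> positive error
    integral += error * dt
    command = target_n + kp * error + ki * integral
    return max(0.0, min(15.0, command)), integral  # clamp to an assumed safe range

# Toy closed-loop run: a first-order lag stands in for the squeegee/surface contact.
target, measured, integral = 5.0, 0.0, 0.0
for step in range(200):
    command, integral = pi_force_step(target, measured, integral)
    measured += 0.1 * (command - measured)         # contact force slowly follows the command
print(f"measured force after 200 steps: {measured:.2f} N (target {target} N)")
```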
The researchers also applied their technique to writing with a pen on paper and tightening a screw with a screwdriver, and SEED showed itself to be very capable in all of these tests. There is still a long path ahead, however. At present, the system requires that a tool be cylindrical in shape, for example, and a number of other limitations remain to be worked out. It will be a good while yet before general purpose gripping robots become a reality, but this research is a significant step in the right direction.