SHeaLDS Give This Soft "Starfish" Robot the Ability to Detect Cuts — and Heal Them in a Minute
By detecting changes in light intensity along fiber-optic waveguides, this robot reacts to damage — and heals itself automatically.
Researchers at Cornell University have developed a soft robot capable of detecting when and exactly where one of its limbs is damaged, and then healing it in order to continue its mission as quickly as possible.
"Our lab is always trying to make robots more enduring and agile, so they operate longer with more capabilities," explains Rob Shepherd, associate professor of mechanical and aerospace engineering, of the team's work. "The thing is, if you make robots operate for a long time, they’re going to accumulate damage. And so how can we allow them to repair or deal with that damage?"
The first problem to solve in addressing that question: figuring out a way for a robot not only to detect that it has been damaged but to locate precisely where that damage is. The solution: fiber-optic sensors, made from stretchable materials and integrated into the robot's body. When a limb is cut, the light traveling through the affected waveguide changes in intensity, and the location of that change reveals where the damage occurred.
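To make the idea concrete, here is a minimal sketch of that localization logic: compare each waveguide segment's measured light intensity against an undamaged baseline and flag the segments whose intensity has dropped sharply. The limb names, segment layout, and threshold below are hypothetical illustrations, not the Cornell team's implementation.

```python
# Hypothetical sketch: flagging damage from light-intensity drops along
# stretchable fiber-optic waveguides. Names, thresholds, and the per-segment
# readout are illustrative assumptions only.

BASELINE = {  # expected intensity per waveguide segment when undamaged
    "leg_1": [0.95, 0.94, 0.96],
    "leg_2": [0.93, 0.95, 0.94],
}

DAMAGE_THRESHOLD = 0.15  # intensity drop treated as evidence of a cut


def locate_damage(readings: dict[str, list[float]]) -> list[tuple[str, int]]:
    """Return (limb, segment_index) pairs whose intensity dropped sharply."""
    damaged = []
    for limb, segments in readings.items():
        for i, intensity in enumerate(segments):
            drop = BASELINE[limb][i] - intensity
            if drop > DAMAGE_THRESHOLD:
                damaged.append((limb, i))
    return damaged


if __name__ == "__main__":
    # A cut in the middle segment of leg_1 scatters light and lowers intensity.
    sample = {"leg_1": [0.95, 0.60, 0.95], "leg_2": [0.93, 0.95, 0.94]}
    print(locate_damage(sample))  # -> [('leg_1', 1)]
```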
Combined with a polyurethane urea elastomer featuring hydrogen bonds, which rapidly heal the material when it is punctured, and disulfide exchanges for increased strength, the result is the team's SHeaLDS system (Self-Healing Light Guides for Dynamic Sensing): strong, flexible, and able to repair damage without human intervention.
To prove the concept, the team built a soft robot resembling a four-legged starfish. One leg of the "starfish" was punctured six times; each time, the robot detected the damage and repaired itself within around a minute, autonomously adjusting its gait to favor its undamaged limbs while the healing took place.
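That gait adjustment can be pictured as a simple reweighting: once a limb is flagged as damaged, its share of the actuation effort is reduced and redistributed across the healthy limbs until healing completes. The sketch below is purely illustrative and is not the published controller.

```python
# Hypothetical sketch of gait reweighting while a limb heals: zero out the
# actuation duty of any limb flagged as damaged and renormalize the remaining
# effort over the healthy limbs.

def adjust_gait(duty: dict[str, float], damaged: set[str]) -> dict[str, float]:
    """Zero the duty of damaged limbs and renormalize over healthy limbs."""
    healthy = {limb: d for limb, d in duty.items() if limb not in damaged}
    total = sum(healthy.values()) or 1.0
    adjusted = {limb: 0.0 for limb in damaged}
    adjusted.update({limb: d / total for limb, d in healthy.items()})
    return adjusted


print(adjust_gait({"leg_1": 0.25, "leg_2": 0.25, "leg_3": 0.25, "leg_4": 0.25},
                  damaged={"leg_1"}))
# -> leg_1 gets 0.0; legs 2-4 each carry roughly a third of the effort
```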
"[These materials] have similar properties to human flesh," Shepherd explains of the SHeaLDS' restorative capabilities. "You don’t heal well from burning, or from things with acid or heat, because that will change the chemical properties. But we can do a good job of healing from cuts."
The team's next step: integrating the sensors with an on-device machine learning system that can interpret the data not only to detect damage but also to gather information about the robot's environment, using the "skin" to "feel" its way around.
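As a rough, forward-looking illustration of that idea, a very small classifier could map a pattern of intensity changes to an event label, distinguishing a cut from ordinary contact. The data, labels, and nearest-neighbor approach below are made up for illustration and do not describe the system the team plans to build.

```python
# Forward-looking, hypothetical sketch: label a vector of waveguide intensity
# drops by its nearest toy example. All data here is fabricated for
# illustration only.

import math

# Toy reference examples: (intensity-drop pattern, event label)
EXAMPLES = [
    ([0.02, 0.03, 0.02], "no_contact"),
    ([0.10, 0.12, 0.09], "gentle_touch"),
    ([0.40, 0.05, 0.04], "cut"),
]


def classify(pattern: list[float]) -> str:
    """Return the label of the closest example by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EXAMPLES, key=lambda ex: dist(ex[0], pattern))[1]


print(classify([0.35, 0.06, 0.05]))  # -> 'cut'
print(classify([0.09, 0.11, 0.10]))  # -> 'gentle_touch'
```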
The team's work has been published under open-access terms in the journal Science Advances.