The Future of Robots Is in Good Hands

Meta has released a suite of tools to support artificial touch perception and bring embodied AI and improved human-robot interaction to all.

Nick Bild
7 months ago • Robotics
Meta has developed a complete solution for artificial touch perception (📷: Meta)

The latest and greatest artificial intelligence (AI)-based tools are quite impressive. They can guide drones through a flight around town, reason with a seemingly human-level understanding of language, and create beautiful art on request. Yet all of these cutting-edge systems are still lacking something: a complete understanding of the world around them. Many of these algorithms have no sensor inputs, leaving them blind to their surroundings. Those that do largely rely on computer vision alone to capture environmental data.

Vision is an important and dense source of information about the world, but it does not paint a complete picture. It is often said that the sense of touch is even more important than vision in understanding the world. It is especially difficult to interact with the things around us if we cannot feel them. From working with tools to flipping a switch or pressing a button, we really need tactile feedback for consistently good results. Unfortunately, artificially reproducing the sense of touch, at least in any way that even remotely resembles human-level capabilities, has proven to be extremely difficult, and there are few options available.

Researchers at Meta recently teamed up with industry partners, including GelSight Inc. and Wonik Robotics, to develop a complete artificial touch perception solution. Their system comprises a number of hardware and software components intended to bring precision touch sensing to the masses.

Central to this effort is Meta Digit 360, an artificial fingertip sensor based on GelSight technology that mimics the human sense of touch in impressive detail. Designed to capture the minute forces and textures that characterize human tactile perception, Digit 360 incorporates over 18 sensing features, registering even the smallest changes in spatial detail. By gathering data from the environment, such as texture, pressure, and even vibration, Digit 360 allows AI systems to react to stimuli in a way that resembles natural reflexes.
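Meta has not published Digit 360's software interface in this article, but the reflex idea can be sketched in a few lines of Python. Everything below, from the TactileReading fields to the slip thresholds, is a hypothetical stand-in rather than the sensor's real API: a control loop polls the fingertip and tightens the grip when shear or vibration spikes suggest an object is starting to slip.

```python
import random
from dataclasses import dataclass

# Hypothetical reading format; Digit 360's real output is not documented here.
@dataclass
class TactileReading:
    pressure: float   # estimated normal force (arbitrary units)
    shear: float      # tangential force magnitude
    vibration: float  # high-frequency micro-vibration energy

def read_fingertip() -> TactileReading:
    """Simulated poll; a real driver would decode the sensor's gel image."""
    return TactileReading(
        pressure=random.uniform(0.0, 1.0),
        shear=random.uniform(0.0, 0.5),
        vibration=random.uniform(0.0, 0.2),
    )

def reflex_adjust(grip: float, r: TactileReading) -> float:
    """Reflex-style policy: tighten the grip on signs of incipient slip."""
    if r.shear > 0.4 or r.vibration > 0.15:
        return min(grip + 0.1, 1.0)  # clamp to the actuator's limit
    return grip

grip = 0.3
for _ in range(100):  # one hundred control-loop ticks
    grip = reflex_adjust(grip, read_fingertip())
print(f"final grip command: {grip:.2f}")
```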

Meta is also launching Meta Digit Plexus, a standardized platform that integrates various tactile sensors across a robot hand's fingertips, fingers, and palm. This platform creates a cohesive tactile experience, similar to how our brain processes touch across our hand to inform motor actions. Plexus provides a comprehensive hardware-software ecosystem that facilitates data collection, control, and analysis, opening new opportunities for touch perception research.
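Plexus's actual interface is likewise not detailed in this article, so the following is a loose illustration of the idea only: a hypothetical HandSensorBus pools readings from named tactile patches across a hand into one synchronized observation, the software analogue of the brain integrating touch across the whole hand.

```python
from typing import Dict, List

class HandSensorBus:
    """Illustrative stand-in for a Plexus-like platform: it polls every
    tactile patch on the hand and returns one synchronized snapshot."""

    def __init__(self, locations: List[str]):
        self.locations = locations

    def poll(self) -> Dict[str, float]:
        # A real platform would read each sensor over its own transport
        # and timestamp the samples; zeros stand in for pressure values.
        return {loc: 0.0 for loc in self.locations}

bus = HandSensorBus(["thumb_tip", "index_tip", "middle_tip",
                     "ring_tip", "pinky_tip", "palm_center"])
snapshot = bus.poll()  # one hand-level observation per control tick
print(snapshot)
```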

To tie these hardware systems together, the team also released Meta Sparsh, a general-purpose encoder designed for vision-based tactile sensing. Unlike traditional models, which require task- and sensor-specific customization, Sparsh works across a variety of tactile sensors and tasks by leveraging self-supervised learning. This approach allows Sparsh to learn from unlabeled data, making it scalable and adaptable to different contexts.
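Sparsh's actual architecture and training recipe are beyond the scope of this article, but the core self-supervised idea can be shown with a toy masked-autoencoding loop in PyTorch: hide part of each unlabeled tactile frame and train the model to reconstruct what it could not see. The shapes, model, and hyperparameters below are illustrative assumptions, not Sparsh's real design.

```python
import torch
import torch.nn as nn

# Toy masked autoencoder: a generic illustration of self-supervised
# learning on unlabeled tactile frames.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 256), nn.ReLU())
decoder = nn.Linear(256, 32 * 32)
opt = torch.optim.Adam(
    [*encoder.parameters(), *decoder.parameters()], lr=1e-3
)

for step in range(100):
    frames = torch.rand(16, 1, 32, 32)             # unlabeled tactile frames
    mask = (torch.rand_like(frames) > 0.5).float()  # hide half the pixels
    recon = decoder(encoder(frames * mask)).view_as(frames)
    # score reconstruction only on the pixels the model could not see
    loss = ((recon - frames) ** 2 * (1 - mask)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once pretrained this way, the encoder's features can, in principle, be reused across sensors and downstream tasks with little or no labeled data, which is what makes the approach scalable.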

Combined with PARTNR, a benchmarking framework designed for assessing human-robot collaboration, these innovations lay the foundation for robots that are not only dexterous but also socially aware. By simulating both physical perception and collaborative intelligence, Meta's recent advancements are primed to unlock a future where robots can serve as intuitive partners in human environments, from homes to workplaces.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.