The Ground Sounds Ruff

TRACEPaw is a robot dog leg that uses novel sensing methods to simply and inexpensively recognize terrain types and measure contact forces.

Nick Bild
1 year ago · Robotics
This robot dog leg uses ML and simple sensors to understand its surroundings. (📷: Autonomous Robots Lab)

Creating robot arms and legs with a human-like ability to sense their surroundings, particularly through the sense of touch, involves the integration of complex and costly sensing equipment. To engineer these components, inspiration is drawn from the intricate sensory mechanisms present in the human body, where touch receptors play a vital role in perceiving and interacting with the environment. Replicating this capability in robots necessitates an amalgamation of cutting-edge technologies, each contributing to a holistic sensory experience that bridges the gap between the artificial and the organic.

Among the most important components are tactile sensors, devices designed to mimic the function of the mechanoreceptors in human skin. These sensors are strategically embedded within the robot's limbs, covering various surfaces to capture a diverse range of tactile information. Micro-electromechanical systems (MEMS) are commonly employed to create arrays of sensitive elements that can detect pressure, vibration, temperature, and texture. These sensors generate a stream of data as the robot interacts with its environment, conveying information about the texture of objects, the force applied during grasping, and the subtleties of surface irregularities.

Unfortunately, integrating and interpreting these diverse data types can be very challenging, and the costs associated with such a setup can easily keep hobbyists and smaller businesses from deploying advanced robotic systems. Solutions to this dilemma have been proposed, such as GelSight sensors, which replace many of the complex and costly sensors with machine learning that infers sensory information from simpler hardware. Recently, the Autonomous Robots Lab team extended the GelSight concept, taking it to a new level of simplicity with a smart paw for robot dogs called TRACEPaw.

TRACEPaw (Terrain Recognition And Contact force Estimation Paw) does exactly what the name implies — it recognizes terrain and estimates contact forces. But it does so in a very simple way, leveraging only the tiny Arduino Nicla Vision development board for all sensing and computation.

Capping the bottom of the foot is a hemispherical silicone pad that comes into contact with surfaces the robot encounters. As the robot steps, the pad deforms. The Nicla Vision's camera captures these deformations from the inside, and the images are fed into a machine learning algorithm trained to produce an accurate measure of contact force from the shape and severity of the deformation. The Nicla Vision's microphone is leveraged to recognize a variety of terrain types: as the robot walks, sounds are captured and classified by a second machine learning algorithm that can determine whether it is walking on, for example, dirt or gravel.
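To see why a single interior image contains enough information to recover force, consider an idealized model: Hertzian contact theory for an elastic hemisphere relates the size of the flattened contact patch to the normal load. The sketch below is a minimal illustration in plain Python, not the team's actual model (their learned regressor also handles the silicone's non-ideal behavior); the modulus and geometry values are assumed.

```python
import math

def hertz_contact_force(e_star, pad_radius, contact_radius):
    """Estimate the normal force on a hemispherical elastic pad from the
    radius of its flattened contact patch (idealized Hertzian contact).

    e_star         -- effective elastic modulus of the pad material, in Pa
    pad_radius     -- undeformed radius of the hemisphere, in m
    contact_radius -- radius of the contact patch, in m (the quantity a
                      camera viewing the pad from inside could measure)
    """
    # Hertz geometry: contact_radius^2 = pad_radius * indentation depth
    depth = contact_radius ** 2 / pad_radius
    # Hertz load-indentation relation: F = (4/3) * E* * sqrt(R) * d^(3/2)
    return (4.0 / 3.0) * e_star * math.sqrt(pad_radius) * depth ** 1.5
```

A larger contact patch means a deeper indentation and a steeply rising force, which is exactly the kind of monotonic relationship a small neural network can learn directly from pixels, without the closed-form model's simplifying assumptions.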

All of the processing, including running the machine learning algorithms, takes place directly on the Nicla Vision’s microcontroller, so no wireless connection to the cloud is required. This also eliminates the latency that remote processing would introduce, which would otherwise slow down the robot’s reaction times.
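A terrain classifier light enough for a microcontroller can be sketched as follows: reduce each audio frame to the fraction of its energy falling in a few frequency bands, then assign the label of the nearest per-terrain centroid. This is a hedged illustration in Python with NumPy, not the pipeline the team actually runs on the Nicla Vision; the terrain names and frame length are assumptions.

```python
import numpy as np

def band_energy_profile(frame, n_bands=8):
    """Reduce an audio frame to the fraction of its spectral energy in
    each of n_bands equal-width frequency bands."""
    power = np.abs(np.fft.rfft(frame)) ** 2
    energies = np.array([band.sum() for band in np.array_split(power, n_bands)])
    return energies / (energies.sum() + 1e-12)  # epsilon guards silent frames

def fit_centroids(labeled_frames):
    """Average the feature vectors of each terrain's training frames."""
    return {
        label: np.mean([band_energy_profile(f) for f in frames], axis=0)
        for label, frames in labeled_frames.items()
    }

def nearest_centroid(features, centroids):
    """Return the terrain label whose centroid is closest to the features."""
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))
```

Footsteps on gravel are broadband and impulsive while softer ground concentrates energy at low frequencies, so even these crude features can separate classes; a trained model on richer features will naturally do better, but the same fit-once, match-per-frame structure keeps inference cheap enough for a microcontroller.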

The Autonomous Robots Lab team has released their source code, training data, and machine learning models, and has documented the build in the form of a wiki. This information, in conjunction with the low cost of the Nicla Vision development board, makes TRACEPaw a viable option even for a hobbyist on a small budget, as long as they are willing to do some DIY work.
