MIT's Mini Cheetah Gets a Gait-Modifying Control System Upgrade, Based on Visual Cues

Equipped with a depth-sensing camera, this upgraded Mini Cheetah can anticipate gaps and other obstacles — and adjust its gait accordingly.

MIT's Mini Cheetah, the quadrupedal robot that has inspired any number of builds, has received an upgraded control system — allowing it to leap over gaps in uneven terrain in real time.

MIT's biomimetic Mini Cheetah robot is a fascinating project, and one that has inspired a wide range of spin-off designs — from the 3D-printed Baby Cheetah and its follow-up to the Champ — and has even triggered experiments in using the same brushless DC motor design for CNC applications.

The Mini Cheetah itself, meanwhile, now has a new control system — and it gives the robot brand-new capabilities for getting around. "Stepping in a gap is difficult to avoid if you can’t see it," Gabriel Margolis explains of the problem the new system aims to solve. "Although there are some existing methods for incorporating vision into legged locomotion, most of them aren’t really suitable for use with emerging agile robotic systems."

MIT's Mini Cheetah is now more agile than ever, thanks to a modular control system incorporating computer vision. (📹: Margolis et al)

The new control system, developed by MIT in partnership with Arizona State University and the University of Massachusetts Amherst, is dubbed Depth-based Impulse Control (DIC) — and uses visual guidance from an on-board camera system to modify how the robot gets around, allowing it to jump over gaps with ease.

The system is designed in a modular format, using existing "blind" control systems as a low-level controller and combining them with the new depth-sensing camera input in a neural network that operates as a high-level controller. "The hierarchy, including the use of this low-level controller, enables us to constrain the robot’s behavior so it is more well-behaved," Margolis explains. "With this low-level controller, we are using well-specified models that we can impose constraints on, which isn’t usually possible in a learning-based network."
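The hierarchy described above can be sketched in a few lines of Python. This is a simplified illustration, not the team's implementation: the `high_level_policy` stands in for the learned vision network, the `low_level_controller` for the model-based "blind" controller, and all thresholds and gains are invented for the example.

```python
import numpy as np

def high_level_policy(depth_image, body_state):
    """Stand-in for the learned high-level controller: maps a depth
    image and robot state to a speed command and a contact schedule.
    The real system uses a trained neural network."""
    # Hypothetical heuristic: if far depth readings appear in the
    # centre row of the image (a gap ahead), speed up and spend a
    # larger share of each stride airborne.
    centre_row = depth_image[depth_image.shape[0] // 2]
    gap_ahead = np.max(centre_row) > 1.5   # metres; illustrative threshold
    forward_speed = 2.0 if gap_ahead else 1.0    # m/s
    flight_fraction = 0.6 if gap_ahead else 0.3  # share of stride airborne
    return forward_speed, flight_fraction

def low_level_controller(command, body_state):
    """Stand-in for the model-based low-level controller: turns the
    high-level command into a torque while enforcing an explicit
    constraint (here, a torque limit) — the kind of well-specified
    model the hierarchy makes it easy to impose constraints on."""
    forward_speed, flight_fraction = command
    torque = np.clip(forward_speed * 5.0, -17.0, 17.0)  # Nm, clipped
    return torque

# One control step of the hierarchy: vision in, constrained torque out.
depth = np.full((64, 64), 0.8)   # simulated flat ground, 0.8 m away
depth[32, 20:40] = 3.0           # simulated gap directly ahead
state = np.zeros(12)             # placeholder robot state
cmd = high_level_policy(depth, state)
tau = low_level_controller(cmd, state)
```

The point of the split is visible even in this toy: learned components propose aggressive behavior from vision, while the model-based layer guarantees the output stays within hard limits.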

"One novelty of our system is that it does adjust the robot's gait. If a human were trying to leap across a really wide gap, they might start by running really fast to build up speed and then they might put both feet together to have a really powerful leap across the gap. In the same way, our robot can adjust the timings and duration of its foot contacts to better traverse the terrain."

More details on the project are available on the official website, along with a link to the paper; the source code, meanwhile, is set to be published in the near future — but was not available at the time of writing.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.