Flex Your Robotics Muscles
A low-cost, easy-to-use, wearable EMG sensor employs your muscle movements to control devices, like a robotic arm that mirrors your actions.
Electromyography (EMG) is a procedure that evaluates and records the electrical activity generated by skeletal muscles. It offers valuable insight into the health and function of muscles and the nerves that control them. Using electrodes placed on the skin or inserted into the muscle, EMG captures and quantifies the electrical signals produced as muscles contract and relax.
In robotics, EMG signals can be used to control robotic systems and prosthetic devices. By detecting and interpreting the electrical activity of muscles, EMG can enable individuals to control the movements of robotic arms, exoskeletons, or robotic prosthetics simply by contracting their own muscles. This approach, known as myoelectric control, allows for natural and intuitive control of robotic systems, enhancing their functionality and usability. Similar techniques have also proved useful in developing other novel human-computer interfaces.
Getting started with EMG may sound complicated and expensive, and it certainly can be, but it does not need to be. The engineers over at Ultimate Robotics have developed an open source, wireless, wearable EMG monitor that starts at under $50. And based on the demonstrations highlighted by the team, this platform can make some very interesting projects, like a robotic arm that mirrors your own movements, a reality.
Called uMyo, the device is a single-channel EMG sensor on an adjustable band that can be worn like a bracelet. But it is not limited to capturing arm and hand movements; it can also be used on the legs, torso, or face. The sensors communicate wirelessly with a receiver via either nRF24 or Bluetooth Low Energy, and several of them can transmit data to the same receiver if a project requires it. The device is compatible with either dry or gel electrodes.
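To give a sense of what the receiving side might look like, here is a minimal Arduino sketch built on the widely used RF24 library. The packet layout is purely hypothetical; the actual uMyo firmware and its companion libraries define their own format, so treat this as a generic illustration of a multi-sensor nRF24 receiver rather than the project's real protocol.

```cpp
#include <SPI.h>
#include <RF24.h>

// Hypothetical packet layout; the real uMyo firmware defines its own format.
struct EmgPacket {
  uint8_t sensorId;   // which band sent this reading
  uint16_t level;     // muscle activity level
};

RF24 radio(9, 10);    // CE and CSN pins; adjust for your wiring

void setup() {
  Serial.begin(115200);
  radio.begin();
  radio.openReadingPipe(1, 0xF0F0F0F0E1LL);  // address shared by all sensors
  radio.startListening();
}

void loop() {
  if (radio.available()) {
    EmgPacket pkt;
    radio.read(&pkt, sizeof(pkt));
    Serial.print("Sensor ");
    Serial.print(pkt.sensorId);
    Serial.print(": ");
    Serial.println(pkt.level);
  }
}
```

Tagging each packet with a sensor ID, as sketched here, is one simple way to let several bands share a single receiver.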
Ultimate Robotics says that their most recent version of the device works pretty much right out of the box. Each user only needs to tweak the thresholds a bit to get good results. To show what can be done with the uMyo after strapping it on and adjusting the thresholds, they demonstrated it controlling a robotic arm.
A pair of sensors was mounted on the arm, near the elbow, to capture signals corresponding to finger movements, while a third sensor on the wrist detected motions of the thumb. Some experimentation was needed; the team tried a number of other locations, like the triceps, before landing on the wrist. The EMG signals are all sent to an Arduino that controls the servos actuating the fingers of the robotic arm. As the wearer of the uMyo devices moves their fingers, the robot arm mirrors those movements.
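The team's own code is not shown in the write-up, but the core idea can be sketched in a few lines. The snippet below assumes a muscle-activity sample has already arrived from a sensor (for example, via a receive sketch like the one above) and maps it to a single finger servo with simple smoothing and a threshold. The pin number, smoothing factor, and threshold are all hypothetical placeholders that would need tuning per wearer.

```cpp
#include <Servo.h>

Servo finger;                           // one finger channel of the robotic hand
const float SMOOTHING = 0.1f;           // exponential smoothing factor
const uint16_t CLOSE_THRESHOLD = 300;   // hypothetical value; tune per wearer
float envelope = 0;                     // smoothed muscle activity

void setup() {
  finger.attach(6);                     // hypothetical servo pin
}

// Call with each new activity sample from a sensor, e.g. pkt.level
// from a receive sketch like the one shown earlier.
void updateFinger(uint16_t level) {
  envelope = (1.0f - SMOOTHING) * envelope + SMOOTHING * level;  // low-pass filter
  finger.write(envelope > CLOSE_THRESHOLD ? 150 : 30);           // close or open
}

void loop() {
  // In a real build, receive a sensor packet here and feed its
  // level to updateFinger().
}
```

Scaling this up to the full demo would mean one such channel per servo, each fed by the appropriate sensor.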
The results were quite good, though not exactly perfect. Ultimate Robotics intentionally kept the signal processing extremely simple (just a few lines of code) to show how easy it can be to get started controlling external devices. But with a boost from machine learning, they believe they can smooth out those wrinkles, and they intend to show off that result in future work.
Another demonstration showed how uMyo can be leveraged for novel human-computer interactions. They created a Tetris-like game that was playable on an LED matrix, then used their EMG sensor to capture movements and rotations of the arm. Moving the arm left or right moves the falling block around the display, while rotations of the arm spin the block. The controls look very smooth in the demonstration, showing that the device can perform with little latency and a good degree of precision.
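The write-up does not detail how the moves and rotations were decoded, but a heavily simplified mapping might look like the fragment below, which compares per-direction activity levels and a rotation estimate (however it is obtained) against thresholds. Every name, value, and the decoding scheme itself is a hypothetical illustration, not the team's actual logic.

```cpp
// Hypothetical decoding of arm signals into game inputs; drop into a sketch.
enum Move { NONE, LEFT, RIGHT, ROTATE };

const uint16_t MOVE_THRESHOLD = 250;  // minimum activity to count as a move
const int ROTATE_THRESHOLD = 40;      // rotation angle, in degrees

Move decodeMove(uint16_t leftLevel, uint16_t rightLevel, int rollDegrees) {
  if (abs(rollDegrees) > ROTATE_THRESHOLD) return ROTATE;  // spin the block
  if (leftLevel > MOVE_THRESHOLD && leftLevel > rightLevel) return LEFT;
  if (rightLevel > MOVE_THRESHOLD && rightLevel > leftLevel) return RIGHT;
  return NONE;
}
```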
The design of the PCB has been released, and more details about the project are available on the project website. If you would like to pick up a finished product, check out the Tindie store page.