Important update
Since this project was developed, our team has created a much cheaper and easier-to-use EMG sensor - uMyo - which can get the same results in a simpler way. Check it out!
So, getting back to the project:
Our team has a long history with robotic hands. For a while we tried to build a reliable prosthetic hand; for this project I'm using a good example of an existing open-source hand: InMoov.
I won't go into the details of hand assembly - it's well described on the project's site and is quite involved. I'll focus on control here, since that part is completely new :)
Also, check out how this technology evolved over time in the next project: https://www.hackster.io/the_3d6/seeing-muscles-at-work-8-channel-emg-with-leds-039d69
Control is based on EMG - the electrical activity of muscles. The EMG signal is obtained by three uECG devices (yes, it is supposed to be an ECG monitor, but since it is based on a generic ADC, it can measure any biosignal, including EMG). For EMG processing, uECG has a special mode in which it sends out 32-bin spectrum data and a "muscle window" average (the average spectral intensity between 75 and 440 Hz). Spectrum images look like this:
Here frequency is on the vertical axis (on each of the 3 plots, low frequency at the bottom, high at the top - from 0 to 488 Hz in ~15 Hz steps) and time is on the horizontal axis (older data on the left; the whole screen covers about 10 seconds). Intensity is encoded with color: blue - low, green - medium, yellow - high, red - even higher.
For reliable gesture recognition, proper PC processing of these images would be required. But for simple activation of the robotic hand's fingers, it's enough to use the averaged value on the 3 channels - uECG conveniently provides it at fixed packet bytes, so the Arduino sketch can parse it. These values look much simpler:
The red, green, and blue charts are raw values from uECG devices on different muscle groups while I'm squeezing the thumb, ring, and middle fingers respectively. To our eye these cases are clearly different, but we need to turn those values into a "finger score" somehow, so that a program can output values to the hand servos. The problem is that the signals from the muscle groups are "mixed": in the 1st and 3rd cases the blue signal intensity is about the same - but red and green are different. In the 2nd and 3rd cases the green signals are the same - but blue and red are different. To "unmix" them, I've used a relatively simple formula:
S0 = V0^2 / ((V1*a0 + b0) * (V2*c0 + d0))
where S0 is the score for channel 0; V0, V1, V2 are the raw values for channels 0, 1, 2; and a0, b0, c0, d0 are coefficients that I adjusted manually (a and c ranged from 0.3 to 2.0, b and d were 15 and 20; you would need to change them to fit your particular sensor placement anyway). The same score was calculated for channels 1 and 2. After this, the charts became almost perfectly separated:
For the same gestures (this time ring finger, middle finger, then thumb) the signals are clean and can easily be translated into servo movements simply by comparing them against a threshold.
2. Schematics
The schematic is quite simple: you only need an nRF24 module, a PCA9685 or similar I2C PWM controller, and a high-current 5 V power supply strong enough to move all the servos at once (a rating of at least 5 A is required for stable operation).
List of connections:
nRF24 pin 1 (GND) - Arduino's GND
nRF24 pin 2 (Vcc) - Arduino's 3.3v
nRF24 pin 3 (Chip Enable) - Arduino's D9
nRF24 pin 4 (SPI:CS) - Arduino's D8
nRF24 pin 5 (SPI:SCK) - Arduino's D13
nRF24 pin 6 (SPI:MOSI) - Arduino's D11
nRF24 pin 7 (SPI:MISO) - Arduino's D12
PCA9685 SDA - Arduino's A4
PCA9685 SCL - Arduino's A5
PCA9685 Vcc - Arduino's 5v
PCA9685 GND - Arduino's GND
PCA9685 V+ - high amp 5V
PCA9685 GND - high amp GND
Finger servos: PCA channels 0-4; in my notation thumb - channel 0, index finger - channel 1, etc.
3. Sensor placement
In order to get reasonable readings, it's important to place the uECG devices that record muscle activity in the right spots. Many different options are possible here, but each requires a different signal-processing approach - so I'm sharing what I've used:
It may be counter-intuitive, but the thumb muscle signal is better visible on the opposite side of the forearm, so one of the sensors is placed there; all of them are placed close to the elbow (the muscle bellies are mostly in that area, but you'll want to check where exactly yours are - there is quite a lot of individual variation).
4. Code
Before running the main program, you will need to find out the unit IDs of your particular uECG devices (uncomment the Serial.println(id) line in the packet handler and turn the devices on one by one) and fill them into the unit_ids array.
#include <SPI.h>
#include <RF24.h>
#include <RF24_config.h>
#include <nRF24L01.h>
#include <Adafruit_PWMServoDriver.h>
#include <Wire.h>
#define SERVOMIN 150 // this is the 'minimum' pulse length count (out of 4096)
#define SERVOMAX 600 // this is the 'maximum' pulse length count (out of 4096)
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();
int rf_cen = 9; //nRF24 chip enable pin
int rf_cs = 8; //nRF24 CS pin
RF24 rf(rf_cen, rf_cs);
//pipe address - hardcoded on uECG side
uint8_t pipe_rx[8] = {0x0E, 0xE6, 0x0D, 0xA7, 0, 0, 0, 0};
uint8_t swapbits(uint8_t a){ //uECG pipe address uses swapped bits order
// reverse the bit order in a single byte
uint8_t v = 0;
if(a & 0x80) v |= 0x01;
if(a & 0x40) v |= 0x02;
if(a & 0x20) v |= 0x04;
if(a & 0x10) v |= 0x08;
if(a & 0x08) v |= 0x10;
if(a & 0x04) v |= 0x20;
if(a & 0x02) v |= 0x40;
if(a & 0x01) v |= 0x80;
return v;
}
unsigned long last_servo_upd = 0; //time when we last updated servo values - don't want to do this too often
byte in_pack[32]; //array for incoming RF packet
unsigned long unit_ids[3] = {4294963881, 4294943100, 28358}; //array of known uECG IDs - need to fill with your own unit IDs
int unit_vals[3] = {0, 0, 0}; //array of uECG values with these IDs
float tgt_angles[5]; //target angles for 5 fingers
float cur_angles[5]; //current angles for 5 fingers
float angle_open = 30; //angle that corresponds to open finger
float angle_closed = 150; //angle that corresponds to closed finger
void setup() {
//nRF24 requires relatively slow SPI, probably would work at 2MHz too
SPI.begin();
SPI.setBitOrder(MSBFIRST);
SPI.beginTransaction(SPISettings(1000000, MSBFIRST, SPI_MODE0));
for(int x = 0; x < 8; x++) //nRF24 and uECG have different bit order for pipe address
pipe_rx[x] = swapbits(pipe_rx[x]);
//configure radio parameters
rf.begin();
rf.setDataRate(RF24_1MBPS);
rf.setAddressWidth(4);
rf.setChannel(22);
rf.setRetries(0, 0);
rf.setAutoAck(0);
rf.disableDynamicPayloads();
rf.setPayloadSize(32);
rf.openReadingPipe(0, pipe_rx);
rf.setCRCLength(RF24_CRC_DISABLED);
rf.disableCRC();
rf.startListening(); //listen for uECG data
//Note that uECG should be switched into raw data mode (via long button press)
//in order to send compatible packets, by default it sends data in BLE mode
//which cannot be received by nRF24
Serial.begin(115200); //serial output - very useful for debugging
pwm.begin(); //start PWM driver
pwm.setPWMFreq(60); // Analog servos run at ~60 Hz updates
for(int i = 0; i < 5; i++) //set initial finger positions
{
tgt_angles[i] = angle_open;
cur_angles[i] = angle_open;
}
}
void setAngle(int n, float angle){ //sends out angle value for given channel
pwm.setPWM(n, 0, SERVOMIN + angle * 0.005556 * (SERVOMAX - SERVOMIN));
}
float angle_speed = 15; //how fast fingers would move
float v0 = 0, v1 = 0, v2 = 0; //filtered muscle activity values per 3 channels
void loop()
{
if(rf.available())
{
rf.read(in_pack, 32); //processing packet
uint32_t u1 = in_pack[3];//32-bit unit ID, unique for every uECG device
uint32_t u2 = in_pack[4];
uint32_t u3 = in_pack[5];
uint32_t u4 = in_pack[6];
unsigned long id = (u1<<24) | (u2<<16) | (u3<<8) | u4; //bytes must be widened before shifting: byte<<24 overflows the 16-bit int on AVR
//Serial.println(id); //uncomment this line to make list of your uECG IDs
if(in_pack[7] != 32) id = 0; //wrong pack type: in EMG mode this byte must be 32
int val = in_pack[10]; //muscle activity value
if(val != in_pack[11]) id = 0; //value is duplicated in 2 bytes because RF noise can corrupt packet, and we don't have CRC with nRF24
//find which ID corresponds to current ID and fill value
for(int n = 0; n < 3; n++)
if(id == unit_ids[n])
unit_vals[n] = val;
}
unsigned long ms = millis();
if(ms - last_servo_upd > 20) //don't update servos too often
{
last_servo_upd = ms;
for(int n = 0; n < 5; n++) //go through fingers, if target and current angles don't match - adjust them
{
if(cur_angles[n] < tgt_angles[n] - angle_speed/2) cur_angles[n] += angle_speed;
if(cur_angles[n] > tgt_angles[n] + angle_speed/2) cur_angles[n] -= angle_speed;
}
for(int n = 0; n < 5; n++) //apply angles to fingers
setAngle(n, cur_angles[n]);
//exponential averaging: prevents single peaks from affecting finger state
v0 = v0*0.7 + 0.3*(float)unit_vals[0];
v1 = v1*0.7 + 0.3*(float)unit_vals[1];
v2 = v2*0.7 + 0.3*(float)unit_vals[2];
//calculating scores from filtered values
float scor0 = 4.0*v0*v0/((v1*0.3 + 20)*(v2*1.3 + 15));
float scor1 = 4.0*v1*v1/((v0*2.0 + 20)*(v2*2.0 + 20));
float scor2 = 4.0*v2*v2/((v0*1.2 + 20)*(v1*0.5 + 15));
//print scores for debugging
Serial.print(scor0);
Serial.print(' ');
Serial.print(scor1);
Serial.print(' ');
Serial.println(scor2);
//compare each score with threshold and change finger states correspondingly
if(scor2 < 0.5) //weak signal - open finger
tgt_angles[0] = angle_open;
if(scor2 > 1.0) //strong signal - close finger
tgt_angles[0] = angle_closed;
if(scor1 < 0.5)
{
tgt_angles[1] = angle_open;
tgt_angles[2] = angle_open;
}
if(scor1 > 1.0)
{
tgt_angles[1] = angle_closed;
tgt_angles[2] = angle_closed;
}
if(scor0 < 0.5)
{
tgt_angles[3] = angle_open;
tgt_angles[4] = angle_open;
}
if(scor0 > 1.0)
{
tgt_angles[3] = angle_closed;
tgt_angles[4] = angle_closed;
}
}
}
5. Results
After about 2 hours of experimenting, I was able to get quite reliable operation (the video shows a typical case):
It doesn't behave perfectly, and with this processing it can only recognize open vs. closed fingers (and not even each of the 5 - it detects only 3 muscle groups: thumb; index and middle together; ring and little fingers together). But the "AI" that analyzes the signal takes 3 lines of code here and uses a single value per channel. I believe much more could be done by analyzing the 32-bin spectral images on a PC or smartphone. Also, this version uses only 3 uECG devices (EMG channels); with more channels it should be possible to recognize really complex patterns - but then, that's the point of the project: to provide a starting point for anyone interested :) Hand control is definitely not the only application for such a system.