It's best to start with a video showing it at work:
Quite a while ago I posted another project showing 3 uECG devices working in EMG mode and controlling a robotic hand (https://www.hackster.io/aka3d6/robotic-hand-control-using-emg-349254). While it really did work and it was possible to replicate it, the technology was, to say the least, quite far from perfect, and it wasn't practically usable due to a set of problems (especially in the radio connection). Someone could make it work with a lot of effort, but there are too many half-ready projects out there already. :-)
We wanted to get back to it on a whole new level, with a dedicated EMG device - and after getting quite far but still not quite there for a long while, we finally made it! Check out the uMyo EMG sensor we recently developed!
Right now we are improving the Python PC software to make gesture and muscle activity data easy to use in other projects: the current version handles separate EMG channels well, but doesn't yet support more complex pattern recognition. So if you wanted to tackle something muscle-related - either from a visual perspective, or by using it to control something - now you have a tool for that. In its simplest form, it can be used just to add some visual effects:
But if you look closer at the charts in the top right corner, you will see the spectrum and processed EMG levels. All of this data is fully available (after all, the whole project is open source / open hardware) - it is quite ready for processing with ML methods, and even simple threshold-based detection gives decent results.
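To illustrate the threshold-based approach, here is a minimal sketch in Python. It assumes you already receive per-channel processed EMG levels as a list of floats, one per unit; the channel-to-finger mapping and the 3x scale factor are assumptions you would calibrate for your own electrode placement:

```python
# Minimal threshold-based detection sketch. CHANNEL_NAMES and the scale
# factor are hypothetical - calibrate them for your own setup.

CHANNEL_NAMES = ["index", "middle", "ring", "pinky"]  # hypothetical mapping

def calibrate(rest_samples):
    """Average a few seconds of relaxed-arm samples to get resting levels."""
    n = len(rest_samples)
    return [sum(sample[ch] for sample in rest_samples) / n
            for ch in range(len(CHANNEL_NAMES))]

def detect_active_muscles(levels, rest_levels, scale=3.0):
    """Return names of channels whose level exceeds scale x resting level."""
    return [name for name, level, rest
            in zip(CHANNEL_NAMES, levels, rest_levels)
            if level > rest * scale]

# Usage: rest_levels = calibrate(samples_collected_with_relaxed_arm)
# then, for each incoming sample:
#     print(detect_active_muscles(sample, rest_levels))
```

Nothing fancy, but with decent electrode placement this alone is enough to tell which finger group is flexing.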
System setup
To build an EMG-based project, you will need several uMyo units (with 2 devices you can already distinguish muscle groups responsible for different fingers; 4 units give quite precise information) and one PC base station (the thing with a USB connector). The base station is also used for wireless firmware updates - and there will be a lot of new versions in the coming months, as we catch bugs and add functions.
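If you want to talk to the base station directly instead of going through our PC software, here is a hypothetical sketch assuming it enumerates as a USB serial (CDC) device. The port name, baud rate, and stream format are assumptions - check the project's open source PC software for the actual protocol:

```python
# Hypothetical raw read from the base station over USB serial.
import serial  # pip install pyserial

PORT = "/dev/ttyACM0"  # on Windows, something like "COM3"
BAUD = 921600          # assumption - use whatever the PC software uses

with serial.Serial(PORT, BAUD, timeout=1) as port:
    while True:
        raw = port.readline()  # one chunk per sample, if the stream is line-based
        if raw:
            print(raw)  # replace with real parsing once you know the format
```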
Placing the units
Unit placement is not a simple question. As the image above illustrates, some muscles are right under the skin (indicated in blue), while others lie deeper, beneath other muscle layers (the green ones). Their positions, while generally similar for all humans, aren't located precisely the same on every arm (and their skin projections shift as the arm moves!) - so the more precision you want, the more important and user-dependent placement becomes.
Still, for major finger movements even a rough estimation works. In all my attempts I used the old trusty "attach it where it looks right" approach without any measurements, and across different sessions I placed the units in quite different spots - so for low- and moderate-precision setups it shouldn't be a problem. Just make sure the target muscle is close to the skin in a certain area and place the device somewhere around it. :-)
Processing the data
This part is under active development right now. While the data look very promising, we want to create an API that allows simple processing (I know from my own experience that modifying even properly commented code is hard, and our code is far from well commented) - and it's not ready yet. If you want some particular functionality, your comments would be greatly appreciated. We plan to release the first version within a couple of weeks, and when it's done, we'll discuss in detail what can be done with it in a more technical and less demo-oriented post (as soon as it's ready, a link will appear here). :-)
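To give a feel for what we're aiming at, here is a purely hypothetical sketch of how such an API could look - none of these names exist yet, and your feedback may well change them:

```python
# Purely hypothetical - the umyo_api module and every name below only
# illustrate the intended call pattern, not a released interface.
from umyo_api import Collector

collector = Collector(port="/dev/ttyACM0")  # connect to the base station

while True:
    sample = collector.read()  # one processed sample per connected device
    for device in sample.devices:
        # device.level: smoothed muscle activity; device.spectrum: FFT bins
        if device.level > 3 * device.rest_level:
            print(device.id, "is active")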