I have been working on a real-time gesture recognition system for some time, and I decided to write about it. Many people seem interested in building gesture-based interface systems, so this is my contribution.
Note: I suggest you check my first post on setting up the Ultra96-V2 board, and that you have Vitis AI installed on your PC along with its dependencies. Let me know in the comment section if you want me to write about that setup.
Step One: Training, Validation, Quantization, and Compiling
It is no news that you need data to train a model. I used the MyoArmband Dataset from Ulysse Côté-Allard, which you can download here: https://github.com/UlysseCoteAllard/MyoArmbandDataset.
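The dataset consists of 8-channel sEMG recordings captured with the Myo armband at 200 Hz, and models are usually trained on short fixed-length windows of that signal rather than on whole recordings. The sketch below is a minimal, hypothetical example of that windowing step; the function name, window length, and step size are placeholders of mine, not values taken from the dataset or from my preprocessing script.

```python
import numpy as np

def window_emg(signal, window_size=52, step=5):
    """Slice a (samples, 8) sEMG array into overlapping windows.

    window_size and step are placeholder values (roughly 260 ms and
    25 ms at the Myo's 200 Hz sampling rate); tune them for your setup.
    """
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return np.stack(windows)  # shape: (num_windows, window_size, 8)

# Random data standing in for a real recording, just to show the shapes.
fake_recording = np.random.randn(1000, 8).astype(np.float32)
print(window_emg(fake_recording).shape)
```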
Next, download the starter code from my repo. It includes a Python script for data preprocessing and a Jupyter notebook that uses Vitis AI for training, quantization, and compiling (a rough sketch of that flow follows below). If you need more information, let me know in the comments. The figure below shows training accuracy and loss.
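For context, here is a rough sketch of the standard Vitis AI TensorFlow 2 quantize-and-compile flow, run inside the Vitis AI docker. It is illustrative only: the notebook in my repo may use a different framework or arguments, the toy model and random calibration data below are stand-ins, and the arch.json used for compilation must match the DPU built into your Ultra96-V2 image.

```python
import numpy as np
import tensorflow as tf
from tensorflow_model_optimization.quantization.keras import vitis_quantize

# Toy stand-in for the trained model: a small CNN over (52, 8, 1) EMG windows.
float_model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(52, 8, 1)),
    tf.keras.layers.Conv2D(16, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(7, activation="softmax"),  # 7 = placeholder gesture count
])

# A few hundred representative preprocessed windows are used for calibration;
# random data here only so the sketch runs end to end.
calib_ds = np.random.randn(100, 52, 8, 1).astype(np.float32)

quantizer = vitis_quantize.VitisQuantizer(float_model)
quantized_model = quantizer.quantize_model(calib_dataset=calib_ds)
quantized_model.save("quantized_model.h5")

# Compile for the DPU in your Ultra96-V2 image (shell command, run inside the
# Vitis AI docker; arch.json must describe your board's DPU configuration):
#   vai_c_tensorflow2 -m quantized_model.h5 -a arch.json -o compiled -n gesture_model
```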
Step Two: Inference
I used Pyomyo (https://github.com/PerlinWarp/pyomyo) by Peter Walkington to interface the Myo armband with the Ultra96-V2 via a BLE dongle. Clone the ultra96-inference-engine repo on your Ultra96-V2 and run inference.py for real-time inference; you can also use a Jupyter notebook instead.
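To give an idea of how the pieces fit together, here is a rough sketch that buffers EMG samples from Pyomyo and classifies each full window on the DPU through the Vitis AI Runtime (VART). It is not the code from inference.py: the .xmodel path, window length, EMG mode, and the assumption that the model input is a (1, WINDOW, 8, ...) window are all placeholders of mine.

```python
import collections

import numpy as np
import vart
import xir
from pyomyo import Myo, emg_mode

WINDOW = 52                      # placeholder window length (samples)
XMODEL = "gesture_model.xmodel"  # placeholder path to the compiled model

def get_dpu_subgraph(graph):
    """Return the first DPU subgraph of a compiled xmodel."""
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    return [s for s in subgraphs
            if s.has_attr("device") and s.get_attr("device").upper() == "DPU"][0]

graph = xir.Graph.deserialize(XMODEL)
runner = vart.Runner.create_runner(get_dpu_subgraph(graph), "run")
in_dims = tuple(runner.get_input_tensors()[0].dims)
out_dims = tuple(runner.get_output_tensors()[0].dims)

buffer = collections.deque(maxlen=WINDOW)

def on_emg(emg, movement):
    """Collect 8-channel samples and classify once a full window is buffered."""
    buffer.append(emg)
    if len(buffer) < WINDOW:
        return
    # Assumes the compiled model takes one (WINDOW, 8) window; depending on
    # your Vitis AI version you may need int8 input scaled by the tensor's
    # fix_point attribute instead of float32.
    window = np.asarray(buffer, dtype=np.float32).reshape(in_dims)
    output = [np.zeros(out_dims, dtype=np.float32)]
    job = runner.execute_async([window], output)
    runner.wait(job)
    print("predicted gesture:", int(np.argmax(output[0])))

myo = Myo(mode=emg_mode.PREPROCESSED)  # mode choice is an assumption
myo.add_emg_handler(on_emg)
myo.connect()

while True:
    myo.run()
```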
Note: This is a simple project and still needs a lot of optimization. My aim is a quick guide that helps beginners walk through AI inference on the Ultra96-V2.