In recent years, embedded ML has been one of the things developers are eager to get their hands on. But to develop embedded ML projects, you have to be proficient in both Python and C/C++. Not all embedded developers use Python, and not all ML developers use C++.
"Things are changing though"With the support from tons of industry giants and startups, the TinyML world is changing. Edge Impulse is helping developers to quickly go from development to production of embedded ML prototypes. You can now develop anything you want with super user-friendly TinyML software such as Edge Impulse within minutes.
Although Edge Impulse doesn't require you to have a Ph.D. to develop TinyML models, it still isn't simple enough for middle and high schoolers, especially students who are just getting started with STEAM/STEM.
"Codecraft comes to the rescue"Seeed Studio's new TinkerGen IDE is mind-blowing. And the good thing is that it now has support for developing embedded ML prototypes right inside the web IDE. No need for using different tabs/software or even using the Edge Impulse Studio.
Let me show you a quick overview and walkthrough of the platform.
Head over to ide.tinkergen.com and sign up with your email address as shown below.
Moving on, let us select the hardware that we're going to use today, i.e. the Seeed Studio Wio Terminal.
Next up, we need to install Codecraft's Device Assistant, which handles communication between the Wio Terminal and Codecraft's web IDE.
Click on the UPLOAD option and a prompt will be displayed, as seen here.
Click on Download Device Assistant and wait for the download to complete. Until then, have a quick glance out of your window and enjoy the fresh air!
Alright, now that we're done with the download and installation, let us try out a simple project using Codecraft and our Wio Terminal.
So now connect the Wio Terminal using a USB Type-C cable and click on the upload button in the Codecraft IDE.
Once it has successfully paired with the device, you'll see the following screen.
Roger that! We're now moving on to try out some experiments with the embedded machine learning feature of this IDE.
Click on the Model Creation tab and select the Motion Recognition model for this demonstration.
Now click on it and type a name for your project as shown below.
Upon clicking OK, you'll see a page with Scratch-like block programming, as seen here.
But I have modified it a bit to look COOLER!
So here are some key points to remember: button A is the right-most button (the last one from the left), B is the middle one, and C is the left-most one.
Follow this for a quick understanding:
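As a side note: if you ever move from blocks to a plain Arduino sketch, the Wio Terminal board package exposes these same three buttons as WIO_KEY_A, WIO_KEY_B and WIO_KEY_C. Here's a minimal, rough sketch of reading them (Codecraft handles all of this for you behind the blocks):

```cpp
// Minimal sketch for reading the Wio Terminal's three top buttons in plain
// Arduino C++. The buttons are active LOW, so we use internal pull-ups.
void setup() {
  Serial.begin(115200);
  pinMode(WIO_KEY_A, INPUT_PULLUP);  // right-most button (A)
  pinMode(WIO_KEY_B, INPUT_PULLUP);  // middle button (B)
  pinMode(WIO_KEY_C, INPUT_PULLUP);  // left-most button (C)
}

void loop() {
  if (digitalRead(WIO_KEY_A) == LOW) Serial.println("A pressed");
  if (digitalRead(WIO_KEY_B) == LOW) Serial.println("B pressed");
  if (digitalRead(WIO_KEY_C) == LOW) Serial.println("C pressed");
  delay(200);  // crude debounce
}
```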
And now let us collect some DATA.
Firstly, let us understand the motions we need to perform for the data collection process through the following GIFs.
And perform each of the motions at least 6-7 times to get good data for training the ML model (i.e. around 30-35 seconds per class).
I have collected 6 samples for each type, with 5 seconds in each sample, as shown below.
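Under the hood, Codecraft is sampling the Wio Terminal's built-in LIS3DHTR accelerometer while you hold the button. Just to demystify it, here's a minimal sketch of what that raw data capture looks like with Seeed's LIS3DHTR Arduino library (the 100 Hz rate is my assumption; Codecraft's actual sampling settings may differ):

```cpp
#include <LIS3DHTR.h>   // Seeed's driver for the Wio Terminal's built-in accelerometer

LIS3DHTR<TwoWire> lis;  // the accelerometer sits on the Wire1 I2C bus

void setup() {
  Serial.begin(115200);
  lis.begin(Wire1);                                // initialize the sensor
  lis.setOutputDataRate(LIS3DHTR_DATARATE_100HZ);  // assumed sample rate
  lis.setFullScaleRange(LIS3DHTR_RANGE_2G);        // +/- 2 g range
}

void loop() {
  // One reading per line: x, y, z acceleration in g
  Serial.print(lis.getAccelerationX()); Serial.print(", ");
  Serial.print(lis.getAccelerationY()); Serial.print(", ");
  Serial.println(lis.getAccelerationZ());
  delay(10);  // roughly 100 samples per second
}
```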
Next up: Model Training and Deployment
Click on the Training and Deployment tab and keep the NN (neural network) size at the default, i.e. Small. A larger network increases the training time and can improve accuracy as well.
Let us keep the training cycles (also known as epochs) at 20 for faster model creation.
After all the changes it will look like this:
Just hit "Start training" and come back after a quick refreshment break.
This single click replaces the 3-4 separate steps you would normally go through in the Edge Impulse Studio.
After 2-3 minutes:
You can also check out the Raw Data tab, which shows the collected data as a graph.
Next up click on Model Training Report and VOILA!
So we got an accuracy of 95.8% (pretty good for a first attempt). Plus, we can even see the inferencing time and the peak RAM and flash usage.
Now click on Model Deployment and be sure to check that you're deploying an int8 (quantized) model.
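In case "int8 (quantized)" sounds mysterious: quantization maps the model's 32-bit float weights onto 8-bit integers using a scale and a zero-point, which shrinks the model roughly 4x and speeds up inference on the Wio Terminal's Cortex-M core, usually with only a small drop in accuracy. A toy illustration of the idea (not the actual converter code):

```cpp
#include <cstdint>
#include <cmath>
#include <cstdio>

// Toy illustration of affine int8 quantization: real_value = scale * (q - zero_point).
// This only shows the general idea, not what Edge Impulse/TFLite actually runs.
int8_t quantize(float x, float scale, int zero_point) {
  int q = static_cast<int>(std::lround(x / scale)) + zero_point;
  if (q < -128) q = -128;   // clamp to the int8 range
  if (q >  127) q =  127;
  return static_cast<int8_t>(q);
}

float dequantize(int8_t q, float scale, int zero_point) {
  return scale * (static_cast<int>(q) - zero_point);
}

int main() {
  const float scale = 0.02f;   // example scale, picked arbitrarily
  const int   zero_point = 0;
  float w = 0.37f;             // an example float weight
  int8_t qw = quantize(w, scale, zero_point);
  std::printf("%.3f -> %d -> %.3f\n", w, qw, dequantize(qw, scale, zero_point));
  return 0;
}
```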
Now you will have to click on the Programming tab and simply drag and drop the blocks to match the program I've designed, as shown below.
And just hit upload!
After waiting for 4-5 minutes you will see the following screen. Our device is now running the new program.
Now perform the wave motion and it will print out "wave", and similarly for flip and idle.
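For the curious, the blocks we just dragged in roughly boil down to an Arduino sketch like the one below, built on the Edge Impulse C++ SDK: sample a window of accelerometer data, run the classifier, and print the winning label. The header name (wio_motion_inferencing.h) is hypothetical and the real generated code will differ; Codecraft writes and uploads all of it for you:

```cpp
// Rough sketch of what the generated program does, assuming the Edge Impulse
// Arduino SDK. The include below is a hypothetical name for the model header
// produced by the deployment step.
#include <wio_motion_inferencing.h>
#include <LIS3DHTR.h>

LIS3DHTR<TwoWire> lis;
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

void setup() {
  Serial.begin(115200);
  lis.begin(Wire1);
  lis.setOutputDataRate(LIS3DHTR_DATARATE_100HZ);
}

void loop() {
  // Fill one window of raw accelerometer data (x, y, z interleaved)
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    features[i]     = lis.getAccelerationX();
    features[i + 1] = lis.getAccelerationY();
    features[i + 2] = lis.getAccelerationZ();
    delayMicroseconds((uint32_t)(1000000 / EI_CLASSIFIER_FREQUENCY));
  }

  // Wrap the buffer in a signal and run the impulse
  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // Print the label with the highest confidence: wave, flip or idle
  size_t best = 0;
  for (size_t i = 1; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
    if (result.classification[i].value > result.classification[best].value) best = i;
  }
  Serial.println(result.classification[best].label);
}
```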
So we've actually made a complete ML project in less than an hour, from model selection to data collection and even deployment, all without writing a single line of code.
You can build far more with Codecraft and Edge Impulse than just these examples.
That's it for now. Feel free to comment below if you have any questions.
Signing off,
Arijit