In this tutorial, we will use Edge Impulse Studio to train and deploy a machine learning model intended to run on an embedded target, in our case the RzBoard.
Embedded ML - Embedded machine learning, also known as TinyML, is the application of machine learning to embedded systems such as the RzBoard.
Edge Impulse Studio - We will use Edge Impulse Studio to train our ML model, then connect the RzBoard to the studio, and finally download and deploy the model on the board.
Edge Impulse project setup - Click on the link to Edge Impulse to log in if you have an account, or click sign-up to create one.
- After creating an account, activate it, then click on create your first project and get started.
- After login, you will be taken to your first project page. If you want to create a new project, click on your profile picture, then proceed to create a new project.
- Below are screenshots of the steps.
After creating the project, we will start with the ML-related tasks. Our next steps are Data collection, Impulse design, and Deployment.
Data collection - For more details on computer vision and image classification, visit Shawn Hymel's Computer Vision with Embedded Machine Learning course.
- For this project, we will be using Roboflow's Rock Paper Scissors Dataset.
- Download and unzip it.
Uploading data to Edge Impulse Studio
In your project inside Edge Impulse Studio, click on the "Data acquisition" tab and then on the "Upload data" tab.
Upload the files for the rock class from the folder we unzipped earlier. Press Ctrl+A to select all the files, then click Open.
Next, select Training for the "Upload into category" selection and choose "Enter label" for the "Label" category, typing in the class name (rock). Then click "Begin upload".
Do the same for paper and scissors, and also for the testing data.
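As an alternative to the web uploader, the upload can also be scripted with the edge-impulse-uploader tool that ships with the Edge Impulse CLI. The sketch below is a rough illustration only: the train/ and test/ folder layout with one subfolder per class, and the .jpg extension, are assumptions about how the unzipped dataset is organized, so adjust them to what you actually see after unzipping.

```shell
#!/bin/sh
# Sketch: script the dataset upload with the Edge Impulse CLI uploader.
# EIU can be overridden for a dry run (e.g. EIU=echo).
EIU="${EIU:-edge-impulse-uploader}"

upload_class() {
    # $1 = split (training/testing), $2 = label, $3 = folder of images
    "$EIU" --category "$1" --label "$2" "$3"/*.jpg
}

# Assumed layout: train/<class>/ and test/<class>/ folders per class.
for label in rock paper scissors; do
    if [ -d "train/$label" ]; then upload_class training "$label" "train/$label"; fi
    if [ -d "test/$label" ]; then upload_class testing "$label" "test/$label"; fi
done
```

Running it with EIU=echo prints the commands without uploading, which is a quick way to confirm the folder layout before the real run.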
Impulse Design
The Impulse Design tool of Edge Impulse Studio handles data pre-processing and model training.
Creating an Impulse
- Click on the Impulse design tab on the left; this will take you to the Create impulse page.
- Click on "Add a processing block" and select "Image".
- Next, click "Add a learning block" and select "Transfer Learning (Images)". When done, save the Impulse.
Pre-processing
Click on the Image tab on the left and, without changing anything, save the parameters. Then generate features.
Note: to use the DRP-AI accelerator, we need to keep the color depth set to RGB.
Building a machine learning model
Currently, all Edge Impulse models can run on the RZ/V2L CPU, a dedicated Cortex-A55. In addition, you can bring your own model to Edge Impulse and use it on the device. However, if you would like to benefit from the DRP-AI hardware acceleration support, including higher performance and power efficiency, please use one of the following models:
For object detection:
- YOLOv5 (v5)
- FOMO (Faster Objects, More Objects)
For Image classification:
- MobileNet v1, v2
Training
Click on the Transfer learning tab; this will take you to the training page. Note that on the training page you have to select the target before starting the training, in order to tell the studio that you are training the model for the RZ/V2L. This can be done at the top right of the training page.
Click on target, then set Renesas RZ/V2L (with DRP-AI accelerator) as the target.
Once the target is set click Start training.
After the training is done, you will see your model's performance results.
Validating your model
With the model trained, let's try it out on some test data. When collecting the data, we split it into a training and a testing dataset. The model was trained only on the training data, so we can use the testing dataset to validate how well the model will work in the real world. This helps ensure the model has not overfit the training data, which is a common occurrence.
To validate your model, go to Model testing, select the checkbox next to 'Sample name', and click Classify selected.
Verify that the USB to serial cable is connected properly.
RXD is the RzBoard's receive line, so the adapter's transmit (TX) wire connects there. Likewise, TXD is the RzBoard's transmit line and connects to the other device's receive line.
Do not connect the red (power) wire.
Software setup
In this section, we will go through how to set up the RzBoard's image and install the required packages.
RzBoard Yocto image setup
If you are working from a Windows computer, we provide an eMMC Yocto image to flash onto the RzBoard. If you have a Linux host system, we recommend booting the RzBoard from the network, either by first building your own Yocto RzBoard image or by using the image we provide.
If not already installed, download and install Tera Term serial terminal software: https://osdn.net/projects/ttssh2/downloads/74780/teraterm-4.106.exe/.
Plug the USB-to-Serial debug console cable into the PC. Ensure it is recognized by the Windows Device Manager; it should come up under Ports (COM & LPT).
Note! The COM port number assigned by the PC may differ from the image above.
On a Windows system, .bat files are used to reflash the eMMC.
If your RzBoard does not have the latest U-Boot, the following instructions explain how to flash the bootloader onto the RZBoard's eMMC.
Download the RZBoard-Linux-Yocto-UserManual-v2.1 document from the RZBoard product page for a detailed step-by-step procedure for eMMC flash programming. The key steps, however, are summarized here.
Download RZBoard_EdgeImpulse_eMMC and unzip it.
1) Prior to launching either of the .bat files, make sure of the following:
- RZboard is powered down!
- All Tera Term windows are closed!
- USB serial cable from PC has TX, RX, GND fly-leads connected to correct J19 header pins and the COM port has enumerated on the PC
- Ethernet cable from PC is connected to RZBoard and the Laptop/PC's Ethernet adapter IPv4 properties are set to a static IP address of 192.168.1.88
- The config.ini file has been edited to list:
  - the correct COM port for the attached USB-Serial cable
  - the correct name for the .wic Linux image file (if an updated version is used)
2) To reflash the bootloader images, set the boot mode to SCIF serial mode: this is done by setting BOOT2=1 by strapping J19-pin1 to +5V (i.e. connect a fly-wire from J19-pin1 to J1-pin2 on the 40-pin header).
With the RZBoard powered OFF, from Windows File Explorer run flash_bootloader.bat (this launches the applicable Tera Term macro using the edited config.ini settings). Choose eMMC as the flash to program; the macro then waits for power-up. Press and hold S1 for 2 seconds to power on the RZBoard, and the macro will proceed. Wait for this to complete (<5 min). Make sure to shut down the Tera Term window after this has completed.
3) To reflash the Linux system image, set the boot mode to eMMC mode: i.e. set BOOT2=0 by removing the fly-wire previously used in the bootloader reflash.
With RZBoard powered OFF, run flash_system_image.bat (this launches a different Tera Term macro, using same saved config.ini settings)
Power on the RZBoard. A blue window should open and an Ethernet connection should be established within 30 seconds.
Wait for the macro to complete (you will see multiple blocks of data being sent), with the sequence finishing in 5 to 10 minutes. After finishing, press any key to exit the BAT script. You will also then need to manually exit the Tera Term console.
RZBoard boot up
Open Tera Term and select the serial port associated with the USB-Serial adaptor then click OK.
In the Tera Term console select Setup from the top menu and then Serial Port. Change the speed to 115200 then click New Setting.
On the RZBoard, press and hold the S1 button to power on the RZ/V2L. Verify that the U-Boot/Linux boot messages display via the serial cable.
.
.
.
Poky (Yocto Project Reference Distro) 3.1.14 rzboard ttySC0
rzboard login: root
Password: avnet
After booting, connect the RzBoard's Ethernet cable to your router for an internet connection. You will need internet access to set up Edge Impulse on the RzBoard.
In the Tera Term console, ping your PC to ensure that the RzBoard is properly talking to your PC.
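That connectivity check can be scripted from the board's console as well. A minimal sketch, assuming 192.168.1.88 is the static address set on the PC (substitute your own) and that the image's ping supports these common flags:

```shell
#!/bin/sh
# Sketch: verify the board can reach the host PC.
check_link() {
    # One ping with a short timeout; prints ok/unreachable.
    if ping -c 1 -W 2 "$1" >/dev/null 2>&1; then
        echo "$1: ok"
    else
        echo "$1: unreachable"
    fi
}
check_link 192.168.1.88   # assumed static IP of the host PC
```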
Linux Host Setup
Follow the instructions in the Hackster project for building a Yocto image for the RzBoard and network-booting it.
You can use the pre-built images for network boot (RZBoard_EdgeImpulse_Netboot) without building the images yourself.
Otherwise, follow the whole guide on how to build your own images; the next instructions explain how to build an image that supports Edge Impulse.
Setting Up RzBoard build system to support Edge Impulse
Once you finish the build instructions, we need to add the Edge Impulse CLI packages to the Yocto build. The Edge Impulse CLI requires the nodejs and npm packages to be installed, in addition to upgrading the glibc version from 2.28 to 2.31. To do this, add the following configuration at the end of the local.conf file located inside the build directory at build/conf/local.conf.
Below are the instructions to configure the build system so Edge Impulse can run on the RzBoard.
Add the nodejs and npm packages at the end of the build/conf/local.conf file:
############################
# Select CIP Core packages
CIP_CORE = "0"
IMAGE_INSTALL_append = " \
nodejs \
nodejs-npm \
"
BBMASK = "meta-renesas/recipes-common/recipes-debian"
#################################
Note: don't forget to comment out CIP_CORE = "1" in the local config file.
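These local.conf edits can also be applied from the shell. The sketch below is one way to do it, wrapped in a function so it can be pointed at any local.conf; it assumes the file already contains a CIP_CORE = "1" line to comment out.

```shell
#!/bin/sh
# Sketch: apply the Edge Impulse local.conf changes in one step.
patch_local_conf() {
    # $1 = path to local.conf
    # Comment out the existing CIP core packages line if present...
    sed -i 's/^CIP_CORE = "1"/#CIP_CORE = "1"/' "$1"
    # ...then append the nodejs/npm additions.
    cat >> "$1" <<'EOF'
# Select CIP Core packages
CIP_CORE = "0"
IMAGE_INSTALL_append = " \
    nodejs \
    nodejs-npm \
"
BBMASK = "meta-renesas/recipes-common/recipes-debian"
EOF
}
# Usage (from the Yocto top directory):
# patch_local_conf build/conf/local.conf
```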
Remove the Qt packages by commenting out core-image-qt5.inc:
$ cat meta-rzboard/recipes-core/images/avnet-core-image.bb
require recipes-core/images/core-image.inc
###require recipes-core/images/core-image-qt5.inc
Build the image:
$ source poky/oe-init-build-env build/
$ bitbake avnet-core-image
Build error: If you encounter the following error:
ERROR: Task (/home/rz/yocto_ren/yocto_rzboard/build-impulse/../poky/meta/recipes-support/liburcu/liburcu_0.11.1.bb:do_compile) failed with exit code '1'
Edit the Makefile.am to remove the unit directory so that it is not compiled.
$ vim tmp/work/aarch64-poky-linux/liburcu/0.11.1-r0/userspace-rcu-0.11.1/tests/Makefile.am
##SUBDIRS = utils common unit benchmark regression
SUBDIRS = utils common benchmark regression
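If you prefer not to edit the file by hand, the same one-line change can be scripted. A sketch; the path comes from the error message above and may differ in your build:

```shell
#!/bin/sh
# Sketch: drop the "unit" test directory from liburcu's SUBDIRS line.
drop_unit_tests() {
    # $1 = path to the liburcu tests/Makefile.am
    sed -i 's/^SUBDIRS = utils common unit /SUBDIRS = utils common /' "$1"
}
# Usage (path taken from the build error; yours may differ):
# drop_unit_tests tmp/work/aarch64-poky-linux/liburcu/0.11.1-r0/userspace-rcu-0.11.1/tests/Makefile.am
```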
Then continue to build:
$ bitbake avnet-core-image
Once the build is done and Linux is set up for network booting, connect from your Linux host via the serial connection:
- In the first console run
bn@nc:~$ sudo chmod 666 /dev/ttyUSB0
bn@nc:~$ cu -s 115200 -l /dev/ttyUSB0 --parity none --nostop
Connected.
- Open another console on the Ubuntu PC and change the RTS/CTS flow-control option:
bn@nc:~$ stty -F /dev/ttyUSB0 -crtscts
On the RZBoard, press and hold the S1 button to power on the RZ/V2L. Verify that the U-Boot/Linux boot messages display via the serial cable.
- Return to the first console then log in:
bn@nc:~$ cu -s 115200 -l /dev/ttyUSB0 --parity none --nostop
Connected.
.
.
.
Poky (Yocto Project Reference Distro) 3.1.14 rzboard ttySC0
rzboard login: root
Password: avnet
Installing Edge Impulse Linux CLI
After the RzBoard boots and is connected to the internet on the same network as your host (Windows or Linux), type the following to install the Edge Impulse Linux CLI onto your RzBoard.
root@rzboard:~# npm config set user root && npm install edge-impulse-linux -g --unsafe-perm
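Before (or after) the install, you can sanity-check that the tools the CLI depends on actually landed in the image. A minimal sketch:

```shell
#!/bin/sh
# Sketch: report whether a required tool is on the PATH.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: ok"
    else
        echo "$1: missing"
    fi
}
check_tool node
check_tool npm
```

If either line reports missing, revisit the local.conf changes from the build section.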
Connecting to Edge Impulse
With all software set up, connect your USB camera to your RzBoard and run:
root@rzboard:~# edge-impulse-linux
Edge Impulse Linux client v1.4.2
? Select a microphone (or run this command with --disable-microphone to skip selection) USB-Audio - HD Pro Webcam C920
[SER] Using microphone hw:1,0
[SER] Using camera Video Capture 4 starting...
[SER] Connected to camera
[WS ] Connecting to wss://remote-mgmt.edgeimpulse.com
[WS ] Connected to wss://remote-mgmt.edgeimpulse.com
? What name do you want to give this device? RzBoard
[WS ] Device "RzBoard" is now connected to project "bngaboav-project-1"
[WS ] Go to https://studio.edgeimpulse.com/studio/185802/acquisition/training to build your machine learning model!
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
To run your impulse locally, just connect to your Renesas RzBoard and run:
root@rzboard:~# edge-impulse-linux-runner --clean
Edge Impulse Linux runner v1.4.2
? What is your user name or e-mail address (edgeimpulse.com)?
This will automatically compile your model with full hardware acceleration and download the model to your RzBoard, and then start classifying.
Or you can download an .eim model locally with the following command:
root@rzboard:~# edge-impulse-linux-runner --clean --download rock-paper-scissors-rzb.eim
Edge Impulse Linux runner v1.4.2
? What is your user name or e-mail address (edgeimpulse.com)? bernard.ngabonziza@avnet.com
? What is your password? [hidden]
[RUN] Downloading model...
[BLD] Created build job with ID 6317518
.
.
.
[BLD] Building binary OK
[RUN] Downloading model OK
[RUN] Stored model in /home/root/rock-paper-scissors-rzb.eim
root@rzboard:~#
Then use it with the above runner as follows:
root@rzboard:~# edge-impulse-linux-runner --model-file rock-paper-scissors-rzb.eim
[RUN] Starting the image classifier for Ngabonziza / bngaboav-project-1 (v1)
[RUN] Parameters image size 96x96 px (3 channels) classes [ 'paper', 'rock', 'scissors' ]
[RUN] Using camera Video Capture 4 starting...
[RUN] Connected to camera
Want to see a feed of the camera and live classification in your browser? Go to http://10.42.0.92:4912
classifyRes 4ms. { paper: '0.3870', rock: '0.6113', scissors: '0.0018' }
classifyRes 1ms. { paper: '0.4084', rock: '0.5894', scissors: '0.0022' }
classifyRes 1ms. { paper: '0.3691', rock: '0.6289', scissors: '0.0019' }
classifyRes 1ms. { paper: '0.3760', rock: '0.6221', scissors: '0.0021' }
You will see the model's inference results in the terminal; the runner also streams the results over the local network, which allows you to see the model's output in real time in your web browser.
Open the URL shown when you start the runner.
On the Linux host, start a browser and enter:
http://10.42.0.92:4912
# This is specific to your RzBoard's IP address.
And you will see both the camera feed and the classification results.
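If you want to post-process the runner's terminal output instead of watching the browser feed, the classifyRes lines shown earlier are easy to parse. A sketch that picks the highest-scoring class from one such line (the line format is taken from the runner output above):

```shell
#!/bin/sh
# Sketch: extract the top class from a classifyRes line on stdin.
top_class() {
    # Split on { } and , then read each "label: 'score'" pair,
    # keeping the label with the highest score.
    awk -F'[{},]' '{
        best = ""; bestv = -1
        for (i = 1; i <= NF; i++) {
            if (split($i, kv, ":") == 2) {
                gsub(/[ '\''"]/, "", kv[1])
                gsub(/[ '\''"]/, "", kv[2])
                if (kv[2] + 0 > bestv) { bestv = kv[2] + 0; best = kv[1] }
            }
        }
        print best
    }'
}
echo "classifyRes 4ms. { paper: '0.3870', rock: '0.6113', scissors: '0.0018' }" | top_class
# prints: rock
```

Piping the runner's output through a filter like this is one way to drive a script or GPIO action from the live classification.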