Two of the biggest tools for machine learning and AI computer vision applications are Edge Impulse and the Kria KV260 Vision AI SoM development platform. Xilinx does provide ML models for the Kria in the form of accelerated applications available in their new Embedded App Store, but Edge Impulse is a great platform that lets users develop their own ML models for deployment on the hardware of their choice. While both are great tools in their own right, I've been itching to see what they can do together.
The Kria KV260 SoM development platform isn't currently on the list of officially supported boards for Edge Impulse, but with the help of their handy porting guide, I was able to get it connected. In this project post, I'll walk through how I connected the Kria KV260 to Edge Impulse and demonstrate some basic data acquisition for creating and training an ML model in Edge Impulse Studio.
If you haven't already, be sure you have an account created with Edge Impulse here. You'll need your login credentials on the Kria to get it connected to the Edge Impulse server.
Prep Root Filesystem

Before installing Edge Impulse on the Kria KV260, the required dependencies need to be installed in the root filesystem of the Linux image running on the Kria. The easiest way to do this is to configure it from the PetaLinux project. I'm starting with the PetaLinux project I created for the Kria KV260 in my last project post here.
Edge Impulse requires gcc, g++, make, build-essential, nodejs, sox, and gstreamer1.0 to run on Linux. Luckily, these dependencies can be easily built into an embedded Linux image by enabling a few select package groups in a PetaLinux project.
Launch the root filesystem configuration editor from the PetaLinux project directory:
~$ petalinux-config -c rootfs
For gcc, g++, make, and build-essential, navigate to the Filesystem Packages menu option, and under misc enable packagegroup-core-buildessential:
Then under the Petalinux Package Groups menu enable the following package groups:
- packagegroup-petalinux-audio
- packagegroup-petalinux-mraa
- packagegroup-petalinux-multimedia
The audio package group satisfies the sox dependency, the mraa package group installs nodejs, and the multimedia package group covers everything for gstreamer1.0 plus adds the desktop environment to the Kria's Linux image.
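Once the image is built and booted (covered in the next steps), a quick way to confirm the package groups above actually pulled in everything Edge Impulse needs is to check each tool on the Kria's command line. This is just a sanity-check sketch; the tool names are the dependencies listed earlier:

```shell
# Check each Edge Impulse dependency on the booted image.
# Prints one "OK" or "MISSING" line per tool.
for tool in gcc g++ make node sox gst-launch-1.0; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: OK"
    else
        echo "$tool: MISSING"
    fi
done
```

If any line reports MISSING, revisit the rootfs configuration and rebuild before moving on.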
Exit and save the filesystem configuration and rebuild the PetaLinux project:
~$ petalinux-build
Then create the wic SD card image:
~$ petalinux-package --wic --bootfiles "ramdisk.cpio.gz.u-boot boot.scr Image system.dtb"
Flash the microSD card with a program like balenaEtcher:
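If you prefer the command line over balenaEtcher, the wic image can also be written with dd from the PetaLinux project directory. The image path below is PetaLinux's default output location, and /dev/sdX is a placeholder for your card's device node (check with lsblk first, since dd will overwrite whatever device you point it at):

```shell
# Write the SD card image directly; replace /dev/sdX with your microSD device.
sudo dd if=images/linux/petalinux-sdimage.wic of=/dev/sdX bs=4M status=progress conv=fsync
sync
```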
Install the microSD card into the SD card slot of the KV260 baseboard and boot the Kria by plugging in the 12V power supply. Connect an Ethernet cable from your router to the KV260 baseboard to put the Kria on your network, then verify the connection:
xilinx-k26-starterkit-2021_1:~$ ping google.com
Install Edge Impulse for Linux

Edge Impulse for Linux is a Node.js project that uses the npm command line utility for installation. EI for Linux includes tools for collecting data from any camera or microphone connected to the hardware. It can be used with any of Edge Impulse's SDKs (Node.js, Python, Go, and C++) to collect data from sensors and run ML models with hardware acceleration.
Install Edge Impulse for Linux as root using npm:
xilinx-k26-starterkit-2021_1:~$ npm config set user root
xilinx-k26-starterkit-2021_1:~$ sudo npm install edge-impulse-linux -g --unsafe-perm
There are a few warnings that appear since the version of Node.js built in by PetaLinux is a bit on the older side (v6.14), but so far I haven't found this to be an issue.
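To confirm the global install actually succeeded despite those warnings, npm can list the package, and node can report which version is in play (both are standard npm/node commands, not Edge Impulse-specific):

```shell
# npm list prints the installed package version if the install succeeded;
# node --version shows the Node.js version PetaLinux built into the image.
npm list -g edge-impulse-linux
node --version
```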
With Edge Impulse successfully installed, connect a camera to the KV260. The accessory pack that you can opt to purchase with the Kria KV260 contains an AR1335 camera module. I personally really like this Logitech BRIO HD webcam, as its higher video quality and built-in microphone make the heftier $170 price tag worth it for AI vision and ML applications.
Run Edge Impulse for Linux to connect to the Edge Impulse server and, by extension, Edge Impulse Studio:
xilinx-k26-starterkit-2021_1:~$ edge-impulse-linux
The first thing you'll be asked for is your login credentials for EI Studio. You'll then be prompted to select a microphone to capture audio data from and a camera to capture video/image data from, each from the list of devices detected in the system.
After the video and audio source devices are selected, you'll be prompted for the device name you want the Kria to appear under in the Devices tab of EI Studio.
From your PC (connected to the same network as the Kria), verify the connection in EI Studio by logging into your account from a browser and navigating to the Devices tab. There, you'll see the Kria KV260 appear with its MAC address and the device name you provided.
To create an ML model, you'll need to capture initial data to train it with. Switch over to the Data acquisition tab in EI studio where you can now select the Kria from your device list to start capturing training and test data.
Capture some image data from the Logitech camera's video feed:
Then capture audio data from the Logitech microphone:
You'll also see the command line of the Kria outputting the status of captures:
Use Ctrl+C to stop the Edge Impulse application and return to the command line on the Kria. This will also disconnect the Kria from EI Studio, but it will remain in your device list. Simply rerun the Edge Impulse for Linux application to reconnect.
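When reconnecting, the CLI reuses the saved login and device selections. If you instead want to go back through the prompts, for example to attach the Kria to a different Edge Impulse project or pick a different camera, the edge-impulse-linux CLI accepts a --clean flag that discards the saved configuration:

```shell
# Reconnect, discarding the saved credentials/project/device selections
# so the setup prompts run again.
edge-impulse-linux --clean
```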
And that's it for this post. I'll post more when I figure out exactly what my first custom ML model is going to look like for the Kria in Edge Impulse!