What is the MaaXBoard OSM93?
The MaaXBoard OSM93 is a Raspberry Pi-shaped board built around NXP's i.MX 93 processor. The i.MX 93 has an integrated AI accelerator (Arm Ethos-U65 NPU), an EdgeLock secure enclave, dual Arm Cortex-A55 cores, and an Arm Cortex-M33.
The processor is packaged in the MSC OSM-SF-IMX93 solder-down module, so if you ever decide to take your project beyond the MaaXBoard, you can develop your own baseboard without having to do a chip-down design.
The 0.5 TOPS NPU is capable of running applications like multi-face recognition, speech recognition, and pose detection.
To get started with the MaaXBoard OSM93, you'll need:
- A 27W (5V) power supply
- An FTDI USB to TTL adapter (I'm using the NXP MCU-Link)
Optionally, you may also want:
- A display
- A mouse and keyboard
- An Ethernet cable
- A USB camera
NXP provides a number of machine learning tools and resources to make it easy to get started with machine learning on the i.MX 93:
- NXP eIQ Toolkit/Portal: software for Windows and Ubuntu that enables GUI-driven model training, optimization, deployment, and inference
- Vela compiler: converts a model to a format that runs using the Ethos-U delegate on the i.MX 93
- NXP Model Zoo: a collection of optimized machine learning models
- NNStreamer: GStreamer plugin for ML
- Yocto image & BSP: includes TFLite, the Inference API, and the Ethos-U delegate. Tria provides our own image and BSP based on these.
To set up the MaaXBoard, you'll need the following software installed on a host PC running either Windows or Ubuntu:
- A terminal tool to connect to the board via serial, e.g. TeraTerm or PuTTY
- A terminal to connect to the board via SSH (I use VS Code's terminal)
- The Vela compiler
In order to work with machine learning on the MaaXBoard OSM93, you'll need the Vela compiler. This tool takes a quantized TensorFlow Lite model and exports it to an optimized version that can run on the Ethos-U NPU (known as a Vela model for short).
There are a few ways to install and use Vela:
- Install it manually using the command line
- It's included with NXP's eIQ Toolkit Software and can be used from the GUI
- It's included in the Yocto image running on the MaaXBoard OSM93
To install it from the command line:
I first created a virtual environment using Conda (any virtual environment tool can be used) with Python 3.7, which Vela requires:
conda create -n vela python=3.7
Once created, I switched to my new virtual environment:
conda activate vela
Install Vela from the command line using the following commands:
$ git clone https://github.com/nxp-imx/ethos-u-vela.git
$ cd ethos-u-vela
$ git checkout lf-5.15.71_2.2.0
$ pip3 install .
To install it as part of the eIQ Toolkit:
NXP's eIQ Toolkit Software packages inference engines, delegates, and developer tools to simplify porting machine learning models to run on the MaaXBoard OSM93's NPU.
On the eIQ Toolkit page, select "Software Details" and scroll down to the Downloads section. Here you'll find both Ubuntu and Windows installers.
BOARD SETUP
Yocto Image
The board comes preloaded with the Yocto image.
If you would like to flash a different image, you can follow the instructions here to build and flash to the eMMC: https://github.com/Avnet/maaxboard-build-tools
The latest image is here: EW2024 Demo Image & Application
Connect to the board
The MaaXBoard OSM93 provides two debug console interfaces, one for the A55 cores and the other for the M33. The majority of development tasks will use the A55 cores.
Using your FTDI USB to TTL adapter, plug the USB end into a host PC, and connect the three female connector pins (GND, RX, TX) to the bottom three pins of the debug console header, labeled A55.
Open TeraTerm or another terminal program and configure the connection to 115200 baud, 8 data bits, and 1 stop bit to initiate a connection to the debug interface.
Power the MaaXBoard OSM93 with the power supply, plugging it into the Power/USB_A port as shown on the silkscreen of the board.
Once the board has booted, connect to it using the login "root":
The MaaXBoard OSM93 provides two Ethernet ports, A and B, which can be used to make a direct connection to a PC or to a router.
To connect to the board over the network, connect an Ethernet cable to either of the MaaXBoard OSM93's Ethernet ports. In the debug console, run ifconfig to find the board's IP address, or use your router's tools to identify it:
Using VS Code or a Linux terminal, SSH into the board with the following command:
ssh root@<IP_addr>
Plug the USB webcam into one of the USB ports on the device. List the available camera device nodes:
ls /dev/video*
Take a photo
To take a photo and save it to a specific location, run the following command:
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=1 ! jpegenc ! filesink location=sample.jpg
You can view the photo by copying it to your PC. On your PC, run the following command to copy it to your current folder:
scp root@[OSM93 IP ADDRESS]:sample.jpg .
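If you'd rather grab frames from Python (handy for feeding them straight into a model later), here's a minimal sketch. It assumes OpenCV and its Python bindings are present in your image (check with python3 -c "import cv2"), and that the camera is the node you found with ls /dev/video*:

import cv2

# Open the camera node reported by ls /dev/video* (adjust if yours differs)
cap = cv2.VideoCapture("/dev/video0")
ok, frame = cap.read()          # grab a single frame
cap.release()

if ok:
    cv2.imwrite("sample_cv.jpg", frame)   # save next to the GStreamer capture
    print("Saved sample_cv.jpg with shape", frame.shape)
else:
    print("Could not read from the camera")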
RUN A MACHINE LEARNING MODEL
Now that you've set up your board and camera, it's time to test out a machine learning model.
Run the eIQ TensorFlow Lite examples
On the MaaXBoard OSM93, navigate to the TensorFlow Lite examples folder:
cd /usr/bin/tensorflow-lite-2.10.0/examples
Test out the Python label image example:
python3 label_image.py
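To see the NPU itself do the work from Python, you can hand the TFLite interpreter the Ethos-U delegate. The sketch below is a rough outline rather than NXP's exact example: it assumes the delegate library is at /usr/lib/libethosu_delegate.so, that the tflite_runtime bindings and OpenCV are in your image, and that the model and image names are placeholders for a Vela-compiled model and the photo you captured earlier:

import numpy as np
import cv2
import tflite_runtime.interpreter as tflite

MODEL = "mobilenet_v1_1.0_224_quant_vela.tflite"   # Vela-compiled model (placeholder name)
DELEGATE = "/usr/lib/libethosu_delegate.so"        # assumed delegate path; check your image
IMAGE = "sample.jpg"                               # e.g. the photo captured earlier

# Load the Ethos-U delegate so the Vela custom operator runs on the NPU
ethosu = tflite.load_delegate(DELEGATE)
interpreter = tflite.Interpreter(model_path=MODEL, experimental_delegates=[ethosu])
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Resize the photo to the model's input shape and convert BGR (OpenCV) to RGB
h, w = inp["shape"][1], inp["shape"][2]
img = cv2.cvtColor(cv2.resize(cv2.imread(IMAGE), (w, h)), cv2.COLOR_BGR2RGB)
interpreter.set_tensor(inp["index"], np.expand_dims(img.astype(np.uint8), axis=0))
interpreter.invoke()

scores = interpreter.get_tensor(out["index"])[0]
print("Top class index:", int(np.argmax(scores)))

Note that the first inference typically includes delegate setup, so time a second run if you're benchmarking.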
Convert a model to Vela
Notes about Vela conversion:
- Conversion is one way; model is compiled and can’t be reverted to a TFLite model after the fact.
- Vela takes supported TFLite operators and converts them to custom operators that can run on the NPU.
- In addition, Vela condenses the model, reducing model size (by up to about 70%) and SRAM usage (by up to about 90%); see the quick size-check sketch after this list.
- You can find a list of Vela's technical constraints (e.g. supported operators) here.
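If you want a quick sanity check of that size reduction on your own model, compare the file sizes before and after compilation. This is a minimal sketch for the host PC; the filenames are placeholders, and it assumes Vela's default behavior of writing the compiled model to an output folder with a _vela.tflite suffix (adjust the paths if yours differ):

import os

original = "mobilenet_v1_1.0_224_quant.tflite"               # placeholder input model
compiled = "output/mobilenet_v1_1.0_224_quant_vela.tflite"   # assumed default output location

orig_kb = os.path.getsize(original) / 1024
vela_kb = os.path.getsize(compiled) / 1024
print(f"original: {orig_kb:.0f} KiB, vela: {vela_kb:.0f} KiB "
      f"({100 * (1 - vela_kb / orig_kb):.0f}% smaller)")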
There are three ways to convert a model to Vela:
- Use eIQ Toolkit
- Use the command line tool
- Compile on the board
To convert using the command line tool, you'll use the Vela converter that you installed during the host PC setup.
- Download a model like mobilenet_v1_1.0_224_quant.tflite
- You can convert it simply by typing vela followed by your model name:
$ vela mobilenet_v1_1.0_224_quant.tflite
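Note that the MobileNet model above is already int8-quantized, which is why Vela accepts it directly. If you're starting from your own float model, quantize it first; the sketch below shows one way to do that with TensorFlow's post-training quantization on the host PC (the Keras model and the random calibration data are stand-ins for your own model and dataset):

import numpy as np
import tensorflow as tf

# Stand-in model; replace with your own Keras model
model = tf.keras.applications.MobileNet(weights="imagenet")

def representative_data():
    # Yield ~100 samples that resemble your real input data (random here for illustration)
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
# Force full int8 so every operator is a candidate for the Ethos-U NPU
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("my_model_quant.tflite", "wb") as f:
    f.write(converter.convert())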
To convert a model using Vela from the eIQ Toolkit GUI, first install the software from here (free account required).
- Open the eIQ Portal.
- Select "Model Tool."
- Click File > Convert > Tensorflow Lite Vela/iMX93 (.tflite)
- Select the vela.ini file saved in the “resources” folder
- Click “Convert” and pick a location to save the model
- The new model will be saved to the location you picked
Test out the demos
The board comes preinstalled with a demo application that includes a driver monitoring system and a pose detection application.
You can find the source code on github: https://github.com/Avnet/Maaxboard-OSM93-Demos
NXP also provides several GoPoint demos that are enabled on our hardware.
Repo for the GoPoint demos: https://github.com/nxp-imx-support/nxp-demo-experience
GoPoint User Guide: https://www.nxp.com/doc/GPNTUG
Let us know what you end up building with the MaaXBoard OSM93!