Previously, I've shown how to install and use the Tensorflow Lite Python Interpreter on the MaaXBoard. This tutorial will walk you through cross-compiling the Tensorflow Lite C++ API for aarch64/arm64 dev boards like the MaaXBoard. I'll then show you how to install and run an example on your MaaXBoard.
I found the Tensorflow documentation on building and using Tensorflow Lite C++ interpreter to be a bit spotty, but here are the links if you would like to check them out:
It will probably be helpful if you go through the basics of setting up MaaXBoard Headlessly with Debian and installing Tensorflow on MaaXBoard before you try out this tutorial.
Set up your Build Machine

To run the Tensorflow Lite C++ interpreter, we have to cross-compile on a separate build machine. This can be macOS (Catalina), Windows 10, or Ubuntu 16.04+. I'll be showing the commands for running the build on Ubuntu 18.04. You'll need at least 50 GB of free space on your build machine, because the Tensorflow source files alone take up over 40 GB!
Once we're done cross-compiling, we'll end up with a file called tensorflow-lite.a that we'll copy to MaaXBoard.
Don't want to mess with all the cross-compilation? The tensorflow-lite.a file is included in the code section of this project, and you can find benchmarks that are pre-built for aarch64 here.
For the cross-compiling step, you'll need a Linux machine. Configure your build machine with the following steps:
Upgrade and then install openssl (required by cmake):
sudo apt update && sudo apt upgrade -y
sudo apt-get install libssl-dev
Install cmake:
sudo apt install cmake
sudo apt upgrade cmake
These commands install gcc-arm-8.3-2019.03-x86_64-aarch64-linux-gnu toolchain under ${HOME}/toolchains:
curl -LO https://storage.googleapis.com/mirror.tensorflow.org/developer.arm.com/media/Files/downloads/gnu-a/8.3-2019.03/binrel/gcc-arm-8.3-2019.03-x86_64-aarch64-linux-gnu.tar.xz
mkdir -p ${HOME}/toolchains
tar xvf gcc-arm-8.3-2019.03-x86_64-aarch64-linux-gnu.tar.xz -C ${HOME}/toolchains
export PATH="$PATH:${HOME}/toolchains/gcc-arm-8.3-2019.03-x86_64-aarch64-linux-gnu/bin/"
Download Tensorflow Source:
git clone https://github.com/tensorflow/tensorflow.git tensorflow_src
Make a new build directory for your build files:
mkdir tflite_build
cd tflite_build
Run CMake inside the build directory with the following command (note the options that select aarch64 Linux):
ARMCC_PREFIX=${HOME}/toolchains/gcc-arm-8.3-2019.03-x86_64-aarch64-linux-gnu/bin/aarch64-linux-gnu-
ARMCC_FLAGS="-funsafe-math-optimizations"
cmake -DCMAKE_C_COMPILER=${ARMCC_PREFIX}gcc \
-DCMAKE_CXX_COMPILER=${ARMCC_PREFIX}g++ \
-DCMAKE_C_FLAGS="${ARMCC_FLAGS}" \
-DCMAKE_CXX_FLAGS="${ARMCC_FLAGS}" \
-DCMAKE_VERBOSE_MAKEFILE:BOOL=ON \
-DCMAKE_SYSTEM_NAME=Linux \
-DCMAKE_SYSTEM_PROCESSOR=aarch64 \
-DTFLITE_ENABLE_GPU=ON \
../tensorflow_src/tensorflow/lite/
The Build TensorFlow Lite with CMake page has several other options for building Tensorflow Lite (scroll down). Unfortunately, most of these aren't relevant when building for MaaXBoard Debian.
Still inside the build directory, build Tensorflow Lite:
cmake --build . -j
Now if you look at the files in tflite_build, you should see tensorflow-lite.a. Copy this over to your MaaXBoard (replace ebv with your username and [MAAXBOARD IP] with your board's IP address):
scp tensorflow-lite.a ebv@[MAAXBOARD IP]:
Build the example file on your build machine

Still in the tflite_build directory, we have one more step. Build the benchmark:
cmake --build . -j -t benchmark_model
and then build the example file:
cmake --build . -j -t label_image
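Before copying anything over, it's worth a quick sanity check that the build actually produced all three artifacts and that the binaries target aarch64. A hedged sketch (paths assume you are still inside tflite_build; artifact locations can shift between Tensorflow versions):

```shell
# Count the expected build artifacts and report the type of each one found.
# "file" should report the two binaries as ARM aarch64 executables.
found=0
for f in tensorflow-lite.a tools/benchmark/benchmark_model examples/label_image/label_image; do
  if [ -e "$f" ]; then
    file "$f"
    found=$((found + 1))
  else
    echo "missing: $f"
  fi
done
echo "found $found of 3 artifacts"
```

If anything is reported missing, re-run the corresponding cmake --build target before moving on.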
Copy these over to your MaaXBoard as well:
scp tools/benchmark/benchmark_model ebv@[MAAXBOARD IP]:
scp examples/label_image/label_image ebv@[MAAXBOARD IP]:
Build your own example

To make the commands shorter, store the path to your Tensorflow source folder in an environment variable:
export TENSORFLOW_ROOT_DIR=~/tensorflow_src
In this example, we'll be building a benchmarking example called benchmark_performance_options
that you can find under tensorflow_src/tensorflow/lite/tools/benchmark/.
Note: Hackster mangles dashes, so the following command may fail if you copy and paste it. If so, delete the dashes and retype them by hand:
aarch64-linux-gnu-g++ \
    -I${TENSORFLOW_ROOT_DIR}/tensorflow/lite/tools/make/downloads/flatbuffers/include \
    -I${TENSORFLOW_ROOT_DIR} \
    -pthread -Wall -Wextra -pedantic \
    -o benchmark_performance_options \
    ${TENSORFLOW_ROOT_DIR}/tensorflow/lite/tools/benchmark/benchmark_performance_options.cc \
    -L${TENSORFLOW_ROOT_DIR}/tensorflow/lite/tools/make/gen/linux_aarch64/lib \
    -ltensorflow-lite
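If you plan to build several of the single-file tools this way, the long compile command can be wrapped in a small helper. This is a hypothetical sketch (the function name and defaults are mine; the include and library paths simply mirror the command above). It prints the assembled command so you can inspect it before running it:

```shell
#!/bin/sh
# Hypothetical helper: assemble the cross-compile command for a single-file
# TFLite tool. Paths mirror the command above and are assumptions; adjust
# TENSORFLOW_ROOT_DIR for your checkout.
TENSORFLOW_ROOT_DIR="${TENSORFLOW_ROOT_DIR:-$HOME/tensorflow_src}"
CXX="${CXX:-aarch64-linux-gnu-g++}"

# Print the compile command for source file $1, output binary $2.
tflite_build_cmd() {
    src="$1"
    out="$2"
    printf '%s\n' "$CXX \
-I$TENSORFLOW_ROOT_DIR/tensorflow/lite/tools/make/downloads/flatbuffers/include \
-I$TENSORFLOW_ROOT_DIR -pthread -Wall -Wextra -pedantic \
-o $out $src \
-L$TENSORFLOW_ROOT_DIR/tensorflow/lite/tools/make/gen/linux_aarch64/lib \
-ltensorflow-lite"
}

# Inspect the command first; pipe it to sh to actually run the build.
tflite_build_cmd \
    "$TENSORFLOW_ROOT_DIR/tensorflow/lite/tools/benchmark/benchmark_performance_options.cc" \
    benchmark_performance_options
```

Once you're happy with the printed command, appending `| sh` to the last call executes the build.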
RUN TENSORFLOW LITE ON MAAXBOARD

Configure your MaaXBoard to run Tensorflow Lite C++ examples

If you haven't yet, go through setting up MaaXBoard Headlessly with Debian. Once that is done, SSH into your MaaXBoard:
ssh ebv@[MAAXBOARD IP]
You'll now move the library that you copied over in the previous step into your library directory (renaming it with the lib prefix so the linker's -ltensorflow-lite flag can find it), and add that directory to your library path:

sudo mv tensorflow-lite.a /usr/local/lib/libtensorflow-lite.a
export LD_LIBRARY_PATH=/usr/local/lib
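A quick sanity check (the path matches the destination used above) to confirm the library landed where the -ltensorflow-lite linker flag will look for it:

```shell
# Confirm the static library is in place; if this prints "missing", recheck
# the scp and mv steps above.
LIB=/usr/local/lib/libtensorflow-lite.a
if [ -f "$LIB" ]; then
  echo "ok: $LIB"
else
  echo "missing: $LIB"
fi
```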
Run the Examples

To run the benchmark, download any .tflite model to your MaaXBoard. You can find lots of Tensorflow Lite models in the Tensorflow model zoo. My previous tutorial on how to run Tensorflow Lite on MaaXBoard has instructions on how to convert .pb models to .tflite models using Bazel.
If you just want to try a pre-converted model, you can download mobilenet:
wget https://download.tensorflow.org/models/mobilenet_v1_2018_08_02/mobilenet_v1_1.0_224_quant.tgz
tar -xzvf mobilenet_v1_1.0_224_quant.tgz
Benchmark the model. Include your model name after --graph= like this:
./benchmark_model --graph=mobilenet_v1_1.0_224_quant.tflite
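If you want to pull a single number out of the benchmark output for logging, the summary line can be parsed with standard tools. A sketch, assuming the summary format shown below (the sample line and its numbers are invented for illustration, and the exact wording can vary between Tensorflow versions):

```shell
# Illustrative benchmark_model summary line (numbers are invented).
line='Inference timings in us: Init: 48432, First inference: 182103, Warmup (avg): 179911, Inference (avg): 178021'

# Strip everything up to the average inference figure, then convert us -> ms.
avg_us=$(printf '%s\n' "$line" | sed 's/.*Inference (avg): *//' | tr -d ' ')
avg_ms=$(awk -v us="$avg_us" 'BEGIN { printf "%.1f", us / 1000 }')
echo "average inference: ${avg_ms} ms"   # prints: average inference: 178.0 ms
```

In a real run you would capture the line with something like `./benchmark_model --graph=... | grep 'Inference timings'` instead of hard-coding it.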
Now, run the example file. To do this, we'll use the same mobilenet model that you used previously, but you'll also need an image and the set of labels that go with mobilenet. Any image could work, but we'll use this standard image of Grace Hopper:
# get photo
curl https://raw.githubusercontent.com/tensorflow/tensorflow/master/tensorflow/lite/examples/label_image/testdata/grace_hopper.bmp > grace_hopper.bmp
# get labels
curl https://storage.googleapis.com/download.tensorflow.org/models/mobilenet_v1_1.0_224_frozen.tgz | tar xzv mobilenet_v1_1.0_224/labels.txt
Now run label image:
./label_image -m mobilenet_v1_1.0_224_quant.tflite -t 1 -i grace_hopper.bmp -l mobilenet_v1_1.0_224/labels.txt
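label_image prints one line per top result, roughly in the form confidence: class_index class_name. If you only want the best guess in a script, it can be extracted like this (the sample output below is illustrative, not a real run):

```shell
# Illustrative label_image output (scores are invented); one result per line.
results='0.860174: 653 military uniform
0.0481021: 907 Windsor tie
0.00786706: 466 bulletproof vest'

# Sort by the confidence column and keep the label of the best line.
top=$(printf '%s\n' "$results" | sort -t: -k1,1 -rn | head -n 1)
label=$(printf '%s\n' "$top" | cut -d' ' -f3-)
echo "top-1: $label"   # prints: top-1: military uniform
```

In practice you would pipe the real tool's output in, e.g. `./label_image ... | sort -t: -k1,1 -rn | head -n 1`.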
Woo hoo! That's it! Note that while these steps are for MaaXBoard, they should also work on any aarch64 Linux board, like the Raspberry Pi 3 or 4. Good luck!