Hello,
A long time coming... I have been waiting to make this work for over two years, and I am fairly satisfied so far with what is possible. The only things I would still like to advance are offloading the tensor computations to the EVEs or to the DSPs on the BBAI.
Outside of that, the source works, the build works, and the AI is fast at producing an educated-guess answer from the list of classes in a pretrained TensorFlow MobileNet model.
...
So, what we plan on doing here is this...
1. We are going to take a build, of sorts, and produce some info. in record time on the AI from a photo of a bird. We shall see if the trained MobileNet tensors know how to classify its pixels accordingly.
...
So, take this photo, for instance:
and then...
We build on our AI, and this can be a long, drawn-out process, but there are specific installs that can reduce our time. If you are familiar w/ TensorFlow and TensorFlow Lite for MCUs and SiPs/SoCs, then this should be a breeze.
If you are new, like me, to using tensors and commands to produce results, good. This should be valuable and interesting.
Okay. Jumping in here:
2. We need an OS image for our BBAI. Here is a list of images: https://rcn-ee.net/rootfs/debian-armhf/
uname -r: 5.10.59-ti-r22 # this is my current image kernel
cat /etc/dogtag: BeagleBoard.org Debian Bullseye IoT Image 2021-10-02
If you are new to BBB.io and beagleboard.org, please visit the homepage or the forums, which can be found here:
https://forum.beagleboard.org/
and...
On the beagleboard.org site, there is a page that describes exactly how to flash an image with Balena Etcher and install it on the BBAI or any board in the family. You can then follow the directions below on how to install, use, and see the results of your tensors via Python 3.9 files.
...
3. I built tensorflow-lite manually, which is the C/C++ build shown first below, but there is also a prebuilt Python runtime package on Debian Bullseye that handles the Python-language scripting outside of C/C++.
git clone https://github.com/tensorflow/tensorflow.git
sudo apt install cmake
mkdir tflite_build && cd tflite_build
cmake ../tensorflow/lite # configure the C++ library
cmake --build . # build it (this takes a while on the BBAI)
# the C API wants its own build directory, since cmake will refuse to
# reconfigure an existing build tree against a different source directory
cd .. && mkdir tflite_c_build && cd tflite_c_build
cmake ../tensorflow/lite/c
cmake --build .
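One hedged tip: cmake can run parallel build jobs, which helps a lot on long builds like this one (adjust the job count to your board):
cmake --build . -j$(nproc)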
...
Now to install our tflite for Python via Bullseye.
echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update
sudo apt-get install python3-tflite-runtime
The runtime works.
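If you want to double-check that Python can actually see the interpreter, a quick one-liner (the printed message is just my own):
python3 -c "from tflite_runtime.interpreter import Interpreter; print('tflite-runtime OK')"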
If you want more concrete examples and ideas, please see this TensorFlow page online: https://www.tensorflow.org/lite/guide/python.
Okay. So, now we need the MobileNet model and labels to handle the classification.
wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/mobilenet_v1_1.0_224_quant_and_labels.zip
I grabbed that link and info. from here: https://classroom.udacity.com/courses/ud190/.
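To unpack the archive on the board, something like this should do (install unzip first if it is missing):
sudo apt install unzip
unzip mobilenet_v1_1.0_224_quant_and_labels.zip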
...
4. Now, we need to run our example from this source with particular commands once the MobileNet archive is unzipped onto our BBAI. Unzipping the file from the link above should get you two files for handling the tensors:
a. labels_mobilenet_quant_v1_224.txt
the label map for the trained model's classes...
and...
b. mobilenet_v1_1.0_224_quant.tflite
the quantized model binary, which you could also build yourself by following the model-building guides on tensorflow.org
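If you want a quick peek at the label map with standard tools, it is just one class name per line:
wc -l labels_mobilenet_quant_v1_224.txt
head labels_mobilenet_quant_v1_224.txt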
Just for reference, all of this can be done directly on your BBAI without having to cross-compile or transfer files.
...
So...here is some source:
#!/usr/bin/python3
# Copyright 2019 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from tflite_runtime.interpreter import Interpreter
import numpy as np
import argparse
from PIL import Image
parser = argparse.ArgumentParser(description='Image Classification')
parser.add_argument('--filename', type=str, help='Specify the filename', required=True)
parser.add_argument('--model_path', type=str, help='Specify the model path', required=True)
parser.add_argument('--label_path', type=str, help='Specify the label map', required=True)
parser.add_argument('--top_k', type=int, help='How many top results', default=3)
args = parser.parse_args()
filename = args.filename
model_path = args.model_path
label_path = args.label_path
top_k_results = args.top_k
with open(label_path, 'r') as f:
    labels = list(map(str.strip, f.readlines()))
# Load TFLite model and allocate tensors
interpreter = Interpreter(model_path=model_path)
interpreter.allocate_tensors()
# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
# Read image
img = Image.open(filename).convert('RGB')
# Get input size
input_shape = input_details[0]['shape']
size = tuple(input_shape[:2] if len(input_shape) == 3 else input_shape[1:3])  # 224x224 here, so width/height order is moot
# Preprocess image
img = img.resize(size)
img = np.array(img)
# Add a batch dimension
input_data = np.expand_dims(img, axis=0)
# Point the data to be used for testing and run the interpreter
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
# Obtain results and map them to the classes
predictions = interpreter.get_tensor(output_details[0]['index'])[0]
# Get indices of the top k results
top_k_indices = np.argsort(predictions)[::-1][:top_k_results]
# Show the top k labels; the scores are uint8, so divide by 255 for a 0-1 confidence
for i in range(top_k_results):
    print(labels[top_k_indices[i]], predictions[top_k_indices[i]] / 255.0)
As you can tell, we will need python3-pil and python3-numpy to handle this specific example.
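On Bullseye, both are a quick apt install away:
sudo apt install python3-pil python3-numpy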
Now, make the script executable (chmod +x classify.py) and run your command:
./classify.py --filename Your_Photo.jpg --model_path mobilenet_v1_1.0_224_quant.tflite --label_path labels_mobilenet_quant_v1_224.txt
So, my results are:
bustard 0.796078431372549
hornbill 0.047058823529411764
kite 0.027450980392156862
...
So, I am 79% sure from my MobileNet data and .tflite binary that this type of bird is indeed a Bustard. So, if I wanted, I could go online or to my book about birds to review and research what exactly a Bustard looks like.
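A quick note on those long decimals: the model is uint8-quantized, so each raw score is an integer out of 255, and the script divides by 255.0. The Bustard score, for instance, is 203/255 = 0.796078431372549, which is where the 79% comes from.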
...
This photo of a Bustard is from Australia and not my backyard. So, I know it is a bit off.
...
Now, I am going to check the Hornbill to see if the bird is indeed that type instead, which is probably way off since it only scored about 4% from the MobileNet-trained tensors.
Okay, so...
The Hornbill is way off too. Anyway, Bustard is close, but I live in a rich, diverse sanctuary of odd passerby birds of all families and genera.
So, it could be from anywhere and any family.
Seth
P.S. If you get bored and are having trouble w/ this build, please let me know. I can try to see what is happening.
I just tried a photo from Wikipedia. Here is an albatross photo and the results.
albatross 0.9921568627450981
goose 0.00784313725490196
toilet tissue 0.0
You can see that it is 99% confident. Nice! Inference and education from machines!