The effects of climate change favor the spread of increasingly destructive pests and threaten the survival of the most economically important plants and crops, a situation that poses a growing threat to food security and the environment (source: UN).
Issue: Climate change modifies temperatures, humidity, and the gas composition of the atmosphere, especially through the accumulation of greenhouse gases (GHG). These changes can favor the growth of fungi and insects, altering the interactions of the disease triangle (host, pathogen, environment) and therefore reducing crop production.
Various studies have shown fluctuations in the incidence of pests in both temperate and tropical zones, associated with dry periods and with combinations of drought and high relative humidity.
The Food and Agriculture Organization of the United Nations (FAO) estimates that pests destroy up to 40% of global crop production each year, while plant diseases cost the world economy more than $220 billion annually and invasive insects at least $70 billion (source: UN).
"The main conclusions of this evaluation should alert us all to how climate change can affect the degree of contagion, spread and severity of pests around the world, " said the Director General of the Organization at the presentation of the study.
Solution: Design and build an agricultural monitoring system with AI capable of identifying pests and diseases in crops, so that the system can autonomously and immediately select the right herbicide, and the right quantity of it, to eradicate them. The system should also record and analyze agroclimatic variables to support decision-making about crops, so that losses due to climate change can be prevented.
FIRST STEPS: Configure PetaLinux
1. Setting up the SD Card Image (PetaLinux)
We must be registered in the Xilinx developer program to download the Petalinux 2021.1 image.
Once the image is loaded in Balena Etcher, we proceed to flash it.
2. Connecting Everything:
We make sure that all the cables are well connected (as the picture indicates).
3. Booting:
We configure the serial connection to the board through the COM port. We use PuTTY and make sure that the following parameters are configured:
- Baud rate = 115200
- Data bits = 8
- Stop bit = 1
- Flow control = None
- Parity = None
And we will have our board configured.
We can run some quick tests on the board to verify that everything boots correctly.
VITIS AI Configuration
It is important to note that if you do not have a Linux operating system, you can perform the following installation in a virtual machine.
1. Install and configure Docker on the machine.
2. Clone the Vitis-AI repository to obtain the examples, reference code, and scripts.
git clone --recurse-submodules https://github.com/Xilinx/Vitis-AI
cd Vitis-AI
3. Download the latest Vitis AI Docker image with the following command. This container runs on the CPU.
docker pull xilinx/vitis-ai-cpu:latest
4. To run the Docker container, use the command:
./docker_run.sh xilinx/vitis-ai-cpu:latest
Train the model
To train the model we use TensorFlow to create the neural network. Note that Colab's Pro plan is required to use a GPU runtime with about 30 GB of RAM.
1. We import the libraries and download the dataset:
import tensorflow as tf
import tensorflow_datasets as tfds
datos, metadatos = tfds.load('plant_village', as_supervised = True, with_info = True)
metadatos.features
print(metadatos.features["label"].names)
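As an optional check, we can confirm the size of the PlantVillage dataset before preprocessing; the counts in the comments below are approximate and may vary with the tensorflow_datasets version:
# Optional sanity check of the dataset size (approximate values)
print(metadatos.splits['train'].num_examples)   # ~54,000 labeled leaf images
print(len(metadatos.features['label'].names))   # 38 crop/disease classes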
2. Resize the images: since the images all have different dimensions, which can cause problems in TensorFlow, we resize them to a common size:
import matplotlib.pyplot as plt
import cv2
plt.figure(figsize=(20,20))
tamaño = 50
for i, (imagen, etiqueta) in enumerate(datos['train'].take(25)):
    imagen = cv2.resize(imagen.numpy(), (tamaño, tamaño))
    plt.subplot(5, 5, i+1)
    plt.imshow(imagen)
3. Prepare the training data: resize every image and store it together with its label:
train_data = []
for i, (imagen, etiqueta) in enumerate(datos['train']):
    imagen = cv2.resize(imagen.numpy(), (tamaño, tamaño))
    imagen = imagen.reshape(tamaño, tamaño, 3)
    train_data.append([imagen, etiqueta])
# Prepare the variables X (inputs) and y (labels) separately
X_data = []  # input images (pixels)
y_data = []  # labels
for imagen, etiqueta in train_data:
    X_data.append(imagen)
    y_data.append(etiqueta)
4. Convert X_data and y_data to NumPy arrays and normalize the pixel values to the 0–1 range:
import numpy as np
X_data = np.array(X_data).astype(float) / 255
y_data = np.array(y_data)
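A quick shape check helps confirm that the preprocessing worked as expected (the shapes noted in the comment assume all images were loaded):
# Expected: X_data with shape (num_images, 50, 50, 3) and y_data with shape (num_images,)
print(X_data.shape, y_data.shape)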
5. Build the neural network: the input layer, the hidden layers, and the output layer, each with its activation function:
modeloCNN = tf.keras.models.Sequential([
tf.keras.layers.Conv2D(32, (3,3), activation='relu', input_shape=(50, 50, 3)),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Conv2D(64, (3,3), activation='relu'),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Conv2D(128, (3,3), activation='relu'),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Flatten(),
tf.keras.layers.Dense(100, activation='relu'),
tf.keras.layers.Dense(38, activation='softmax')
])
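Optionally, we can print a summary of the architecture to verify the output shape of each layer and the number of trainable parameters:
modeloCNN.summary()   # lists each layer's output shape and parameter count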
6. We compile the model:
modeloCNN.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
7. We set up data augmentation to retrain the model:
from tensorflow.keras.callbacks import TensorBoard
from tensorflow.keras.preprocessing.image import ImageDataGenerator
datagen = ImageDataGenerator(
rotation_range=30,
width_shift_range=0.2,
height_shift_range=0.2,
shear_range=15,
zoom_range=[0.7, 1.4],
horizontal_flip=True,
vertical_flip=True
)
datagen.fit(X_data)
plt.figure(figsize=(20,8))
for imagen, etiqueta in datagen.flow(X_data, y_data, batch_size=10, shuffle=False):
    for i in range(10):
        plt.subplot(2, 5, i+1)
        plt.xticks([])
        plt.yticks([])
        plt.imshow(imagen[i].reshape(50, 50, 3))
    break
from keras.models import Sequential
from keras.layers import Conv2D, Convolution2D, MaxPooling2D, GlobalAveragePooling2D
from keras.layers import Activation, Dropout, Flatten, Dense
8. We create the model that will be trained with the augmented data:
modeloCNN_AD = tf.keras.models.Sequential([
tf.keras.layers.Conv2D(32, (3,3), activation='relu', input_shape=(50, 50, 3)),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Conv2D(64, (3,3), activation='relu'),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Conv2D(128, (3,3), activation='relu'),
tf.keras.layers.MaxPooling2D(2, 2),
tf.keras.layers.Flatten(),
tf.keras.layers.Dense(100, activation='relu'),
tf.keras.layers.Dense(38, activation='softmax')
])
X_train = X_data[:40000]
X_valid = X_data[40000:]
y_train = y_data[:40000]
y_valid = y_data[40000:]
modeloCNN_AD.compile(optimizer='adam',
loss='sparse_categorical_crossentropy',
metrics=['accuracy'])
data_gen_train = datagen.flow(X_train, y_train, batch_size=32)
tensorboardCNN_AD = TensorBoard(log_dir='logs/cnn_AD')
modeloCNN_AD.fit(
    data_gen_train,
    epochs=100,
    validation_data=(X_valid, y_valid),
    steps_per_epoch=int(np.ceil(len(X_train) / float(32))),
    validation_steps=int(np.ceil(len(X_valid) / float(32))),
    callbacks=[tensorboardCNN_AD]
)
9. Training results:
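If the fit() call in step 8 is assigned to a variable (for example, historial = modeloCNN_AD.fit(...)), the accuracy curves can be plotted to inspect the training results; this is only a sketch and the variable name historial is our own:
import matplotlib.pyplot as plt

# Hypothetical: assumes the fit() call above was assigned to `historial`
plt.plot(historial.history['accuracy'], label='train')
plt.plot(historial.history['val_accuracy'], label='validation')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()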
10. We save the model so it can be downloaded:
modeloCNN_AD.save('my_model-cnn-ad.h5')
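As a quick sanity check (optional), the saved model can be loaded back and evaluated on the held-out data from step 8; the variable names below are the ones defined earlier in this notebook:
import tensorflow as tf

# Reload the saved model and check its accuracy on the validation split
modelo_cargado = tf.keras.models.load_model('my_model-cnn-ad.h5')
loss, acc = modelo_cargado.evaluate(X_valid, y_valid, verbose=0)
print(f"Validation accuracy of the reloaded model: {acc:.3f}")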
Model Quantization
1. Import the model that resulted from the training, together with the dataset, into the quantization code created in Vitis AI.
2. We define our quantization function, which runs the Vitis AI post-training quantizer over a calibration dataset (the import below is the one used by the Vitis AI TensorFlow 2 flow):
from tensorflow_model_optimization.quantization.keras import vitis_quantize

def quantize(train_generator, model):
    # run quantization
    quantizer = vitis_quantize.VitisQuantizer(model)
    # quantizer = tfmot.quantization.keras.quantize_model(model)
    quantized_model = quantizer.quantize_model(calib_dataset=train_generator, calib_batch_size=10)
3. Still inside the function, we save the quantized model and return it:
    # save quantized model
    quantized_model.save('quantized_model.h5')
    return quantized_model
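Below is a minimal sketch of how the helper above could be called, assuming the trained .h5 model and a set of calibration images (for example, a slice of X_train exported from the training notebook) are available inside the Vitis AI container; the variable names are our own:
import tensorflow as tf

# Hypothetical driver code for the quantize() helper defined above
float_model = tf.keras.models.load_model('my_model-cnn-ad.h5')
calib_images = X_train[:1000]          # a small subset of images is enough for calibration
quantized = quantize(calib_images, float_model)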
Compilation
1. Create a file called arch.json, where we set the DPU configuration:
{
"fingerprint":"0x1000020F6014406"
}
2. We run the command source compile.sh, in which the location of the arch.json file is specified (the $ARCH and $TARGET variables used below are expected to be set inside that script):
compile() {
  vai_c_tensorflow2 \
    --model quantized_model.h5 \
    --arch $ARCH \
    --output_dir build/compiled_$TARGET \
    --net_name customcnn
}
compile 2>&1 | tee build/logs/compile_$TARGET.log
3. The .xmodel files generated by the compiler are uploaded to the board via an SFTP connection:
sftp petalinux@<ip address>
4. We execute the following commands in the SFTP session:
lcd ..
put -r compiled_$TARGET
5. Next, PetaLinux is started on the board and we proceed to create the following files:
- aiinference.json
- drawresult.json
- preprocess.json
6. These three files must be copied to the following directory:
sudo cp yolov2tiny/aiinference.json /opt/xilinx/share/ivas/smartcam/ssd/aiinference.json
sudo cp yolov2tiny/preprocess.json /opt/xilinx/share/ivas/smartcam/ssd/preprocess.json
sudo cp yolov2tiny/drawresult.json /opt/xilinx/share/ivas/smartcam/ssd/drawresult.json
7. We test that the camera works and that our machine learning model runs correctly:
sudo xmutil unloadapp
sudo xmutil loadapp kv260-smartcam
sudo smartcam --usb 0 -W 1920 -H 1080 --target rtsp --aitask ssd
Configuring the sensors with Pynq
For this project we need to enable and use the Pmod port and connect the sensors with the Pynq Grove Adapter.
We will initially analyze:
- Temperature
- Humidity
- Water level
1. Create a directory to contain all the files we need:
project-spec/meta-user/recipes-apps
2. Create a .bb file:
> vim python3-pynq-temp&hum.bb
3. We install the package for the sensor directly from the root directory.
SRC_URI = "https://pynq.readthedocs.io/en/v2.0/_modules/pynq/lib/pmod/pmod_tmp2.html#:~:text=lib.pmod.pmod_tmp2-,Edit%20on%20GitHub,-Note"
SRC_URI[md5sum] = "ac1bfe94a18301b26ae5110ea26ca596"
SRC_URI[sha256sum] = "f522c54c9418d1b1fdb6098cd7139439d47b041900000812c51200482d423460"
SRCREV = "0e10a7ee06c3e7d873f4468e06e523e2d58d07f8"S = "${WORKDIR}/git"
inherit xilinx-pynq setuptools3
4. The xilinx-pynq class will create a PYNQ_NOTEBOOK_DIR variable that will be packaged in the notebooks subpackage, but we still need to make sure the environment is correct for the recipe to run correctly. In this case we need to set the PYNQ_JUPYTER_NOTEBOOKS and BOARD environment variables. setup.py also expects the notebook directory to exist, so we need to create it. For this, we can prepend instructions to the different steps of the compilation:
do_compile_prepend() {
    export BOARD=KV260
    export PYNQ_JUPYTER_NOTEBOOKS=${D}${PYNQ_NOTEBOOK_DIR}
}

do_install_prepend() {
    export BOARD=KV260
    export PYNQ_JUPYTER_NOTEBOOKS=${D}${PYNQ_NOTEBOOK_DIR}
    install -d ${PYNQ_JUPYTER_NOTEBOOKS}
}

do_configure_prepend() {
    export BOARD=KV260
    export PYNQ_JUPYTER_NOTEBOOKS=${D}${PYNQ_NOTEBOOK_DIR}
    install -d ${PYNQ_JUPYTER_NOTEBOOKS}
}
5. We need to define the dependencies:
RDEPENDS_${PN} += "\
python3-pynq \
python3-pillow \
pynq-overlay \
libstdc++ \ "
RDEPENDS_${PN}-notebooks += "\
python3-jupyter \ "
6. RUN:
> petalinux-build -c python3-pynq-temp&hum
7. We may get a compatibility error that we need to fix by creating a patch that replaces the format string with a simple addition.
8. We must place the patch in a subfolder for it to be identified.
> mkdir python3-pynq-temp&hum
> cp $patch_file python3-pynq-temp&hum/build-fixes.patch
We followed an external guide page as a reference for this step.
9. Now we run the following commands so that PetaLinux picks up the newly installed packages:
> cd ../../
> vim conf/user-rootfsconfig
CONFIG_python3-pynq-temp&hum
CONFIG_python3-pynq-temp&hum-notebooks
> petalinux-config -c rootfs
10. Now we can create the full image:
> petalinux-build
> petalinux-package --boot --u-boot --atf --pmufw
11. We can now boot the board; the terminal will show the address to connect directly to Jupyter.
12. We run the code for our sensors, which can be found at the end of this project.
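As a reference, below is a minimal sketch of how a temperature reading could be taken with PYNQ using the Pmod_TMP2 driver referenced in the recipe above; the overlay name and the Pmod port attribute are assumptions that depend on the PYNQ image built for the board:
# Minimal sketch -- overlay name and PMODA attribute are assumptions for your PYNQ image
from pynq.overlays.base import BaseOverlay
from pynq.lib.pmod import Pmod_TMP2
import time

overlay = BaseOverlay("base.bit")     # load the overlay that exposes the Pmod port
sensor = Pmod_TMP2(overlay.PMODA)     # temperature sensor on the Pmod connector

for _ in range(10):
    print(f"Temperature: {sensor.read():.2f} C")   # read() returns degrees Celsius
    time.sleep(1)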
Google Cloud IoT Configuration
To visualize our crops in real time and obtain graphs of the sensor records, we configured Google Cloud with Jupyter Notebook.
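As an illustration of how sensor readings could be pushed to the cloud for these dashboards, the sketch below publishes one reading to a Google Cloud IoT Core MQTT topic using the paho-mqtt and PyJWT libraries; every project, registry, device, and file name here is a placeholder, not a value from this project:
import datetime, json, ssl
import jwt                      # PyJWT, used to sign the device token
import paho.mqtt.client as mqtt

# Placeholder identifiers -- replace with your own Google Cloud IoT Core configuration
project_id, region = 'my-project', 'us-central1'
registry_id, device_id = 'crops-registry', 'kv260-sensor-node'

# Device credential: a short-lived JWT signed with the device's private key
token = jwt.encode(
    {'iat': datetime.datetime.utcnow(),
     'exp': datetime.datetime.utcnow() + datetime.timedelta(minutes=60),
     'aud': project_id},
    open('rsa_private.pem').read(), algorithm='RS256')

client = mqtt.Client(client_id=f'projects/{project_id}/locations/{region}/'
                               f'registries/{registry_id}/devices/{device_id}')
client.username_pw_set(username='unused', password=token)
client.tls_set(ca_certs='roots.pem', tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect('mqtt.googleapis.com', 8883)
client.loop_start()

# Publish one sensor reading as a JSON telemetry event
payload = json.dumps({'temperature': 24.5, 'humidity': 61.0, 'water_level': 0.8})
client.publish(f'/devices/{device_id}/events', payload, qos=1)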
The system uses more than 5,000 reference images of more than 20 different types of crops; you can get the dataset here.
Advances after the contest: We hope to integrate the electric pump systems for spraying herbicides and for crop irrigation water, and to develop a mobile app integrated with Google Cloud so that farms can be automated.
Likewise, together with Xilinx, we want to use Pynq to add many more sensors with greater resistance to extreme conditions, for real-time and constant monitoring of new variables in our agroclimatic monitoring system, such as phosphorus and potassium levels, the presence of gases, and infrared imagery.
Many thanks to the entire Xilinx team for the opportunity to develop projects with state-of-the-art products and for supporting us at all times so that this project could be completed.