This project shows, in a practical way, how to perform emotion classification using the M5StickV as the development device. To perform this classification, a database is needed in which the images are divided into classes representing the different emotions. The KDEF database (http://kdef.se/) was used for training; it contains a total of 4,900 images of human facial expressions, covering 70 individuals each showing 7 different emotional expressions.
Before starting the training phase, a pre-processing stage was needed in which the images were resized. This is necessary because the MobileNet network architecture expects an input size of 224x224, so each image has to be resized to those dimensions.
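As a minimal sketch of this resizing step (the folder names here are placeholders, not necessarily the ones used in the project), it can be done with Pillow:

import os
from PIL import Image

SRC_DIR = "KDEF_raw"   # assumed location of the original KDEF images
DST_DIR = "KDEF_224"   # assumed output folder for the resized copies
SIZE = (224, 224)      # MobileNet input size

os.makedirs(DST_DIR, exist_ok=True)
for name in os.listdir(SRC_DIR):
    if not name.lower().endswith((".jpg", ".jpeg", ".png")):
        continue
    img = Image.open(os.path.join(SRC_DIR, name)).convert("RGB")
    img = img.resize(SIZE)
    img.save(os.path.join(DST_DIR, name))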
Windows 10 was used as the operating system on which the training phase was executed; this phase can take between 3 and 4 hours (depending on the characteristics of the computer). Once training is done, the system saves the model as an *.h5 file. Once you have this model, it is necessary to convert it to the TensorFlow Lite format (*.tflite). This is done using the command:
tflite_convert --keras_model_file <keras_model_path> --output_file <tflite_path>
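The command above was originally pasted as a Python string concatenation, which suggests it was launched from a script; a small sketch of how that invocation could look (the file names are assumptions) is:

import subprocess

keras_model_path = "emotions.h5"   # assumed name of the trained Keras model
tflite_path = "emotions.tflite"    # assumed name for the converted model

# Build and run the tflite_convert command shown above
subprocess.run(
    ["tflite_convert",
     "--keras_model_file", keras_model_path,
     "--output_file", tflite_path],
    check=True,
)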
Now that we have our *.tflite model, the next step is to transform it to *.kmodel. This is necessary because the KPU of the M5StickV only accepts models with this extension. This is done using ncc (https://github.com/kendryte/nncase/releases/tag/v0.2.0-alpha2):
ncc.exe -i tflite -o k210model --dataset <path_all_dataset> <tflite_path> <kmodel_path>
In this step it is very important that all the images of the dataset are located in a single folder called dataset, since this is the folder passed to ncc through the --dataset option.
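A small sketch of how that single flat folder could be assembled from the resized images (the source path and layout are assumptions) might be:

import os
import shutil

SRC_DIR = "KDEF_224"   # assumed folder with the resized images, possibly in class subfolders
DST_DIR = "dataset"    # single flat folder expected by ncc's --dataset option

os.makedirs(DST_DIR, exist_ok=True)
for root, _, files in os.walk(SRC_DIR):
    for name in files:
        if name.lower().endswith((".jpg", ".jpeg", ".png")):
            shutil.copy(os.path.join(root, name), os.path.join(DST_DIR, name))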
At this point we should have something like this:
Once you have the model in *.kmodel format, you can access it in two ways. The first is from an SD card (in my case my M5StickV no longer reads it :( ); this can be done in the following way:
import KPU as kpu

task = kpu.load("/sd/model.kmodel")   # load the model from the SD card
Or we can use the Kflash_gui application to download the model to the address 0x300000, as shown in the following figure.
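If the model is flashed at that address, loading it from MicroPython looks like this (a short sketch based on the MaixPy KPU API):

import KPU as kpu

task = kpu.load(0x300000)   # load the model directly from the flash address used in Kflash_gui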
Now we only need the MicroPython program; this was written using the MaixPy IDE 2.3 development environment. The next step is to turn our emotion classification code into boot.py, as shown in the following video.
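The project's actual boot.py is shown in the video; as a rough sketch of what such a classification loop looks like on the M5StickV, following the standard MaixPy MobileNet classification example (the label list, its order, and the flash address are assumptions here):

import sensor, lcd
import KPU as kpu

# Assumed emotion labels; the real order must match the training classes
labels = ["afraid", "angry", "disgusted", "happy", "neutral", "sad", "surprised"]

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.set_windowing((224, 224))      # crop to the MobileNet input size
sensor.run(1)

task = kpu.load(0x300000)             # model flashed with Kflash_gui

while True:
    img = sensor.snapshot()
    fmap = kpu.forward(task, img)     # run the network on the current frame
    plist = fmap[:]                   # class probabilities
    pmax = max(plist)
    idx = plist.index(pmax)
    lcd.display(img)
    lcd.draw_string(0, 0, "%s %.2f" % (labels[idx], pmax))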
Finally, the following video shows the classification of emotions using images found through a Google search.