Traditionally, gait parameter analysis requires wearable equipment to gather data from patients, which means each person has to wear expensive wearable devices. This project is about constructing a reliable computer-vision-based pre-processing system for analysing human gait parameters.
Demo videos:
- Walking on treadmill
- Running on treadmill
- Dancing (multiple people)
- Dancing, one person (baby)
- Vitis-AI 1.4.1
- Vivado 2021.1
- PetaLinux 2021.1
- VVAS 1.0
- ZCU104 Xilinx Ultrascale+ MPSoC
- HDMI monitor and cable
- HD camera (Optional)
- Follow the official instructions to set up the ZCU104, and download the SD-card image to boot the system.
- If you would like to start from the beginning, you can use the Xilinx aibox-reid example (v1.0). Otherwise, you can use my GitHub code directly.
- Copy the required libraries, model JSON files and scripts to the board.
- To run the demo, run the scripts.
To generate the model file for the KV260 to run, you need to use the Vitis-AI Docker environment. Please follow the official instructions to set it up.
Once you have set up the environment correctly, you should be able to start it with the following commands:
cd <github repo folder, where you download it>
./docker_run.sh xilinx/vitis-ai-gpu:latest
You will be asked to type in "Y" to agree with the terms. If everything goes correctly, you will see the welcome logo of Vitis-AI.
In this project, we use SPnet from the Xilinx Model Zoo to detect human poses. This model was trained with Caffe, so we have to use the Caffe environment to compile it.
Use this command to start the Caffe environment in the Vitis-AI Docker container:
conda activate vitis-ai-caffe
cd into the Model Zoo folder:
cd models/AI-Model-Zoo
Download the SPnet (GPU version) with the Python script and unzip it into your folder.
Note: we use the GPU version because we need to compile the model ourselves; the pre-built kv260 version offered here is not compatible with the firmware we use.
(base) catmouse@new-3090-server:~/Vitis-AI/models/AI-Model-Zoo$ python3 downloader.py
Tip:
you need to input framework and model name, use space divide such as tf vgg16
tf:tensorflow1.x tf2:tensorflow2.x cf:caffe dk:darknet pt:pytorch
input:cf spnet
['cf', 'spnet']
1: ['type', 'float & quantized']['board', 'GPU']
2: ['type', 'xmodel']['board', 'zcu102 & zcu104 & kv260']
3: ['type', 'xmodel']['board', 'vck190']
4: ['type', 'xmodel']['board', 'vck5000']
5: ['type', 'xmodel']['board', 'u50-DPUCAHX8H & u50lv-DPUCAHX8H & u280-DPUCAHX8H']
6: ['type', 'xmodel']['board', 'u50-DPUCAHX8L & u50lv-DPUCAHX8L & u280-DPUCAHX8L']
7: ['type', 'xmodel']['board', 'u200-DPUCADF8H & u250-DPUCADF8H']
input num:1
0.00%
81.65%
done
(base) catmouse@new-3090-server:~/Vitis-AI/models/AI-Model-Zoo$ unzip cf_SPnet_aichallenger_224_128_0.54G_1.4.zip
Now, you should be able to see the SPnet folder.
Then, go into the quantized folder to run the compile command:
cd models/AI-Model-Zoo/cf_SPnet_aichallenger_224_128_0.54G_1.4/quantized/
vai_c_caffe -p ./deploy.prototxt -c ./deploy.caffemodel -a ../../arch.json -o ./ -n sp_final
Note: replace arch.json with your own; the arch.json file can be found on the KV260 example page.
If compilation succeeds, the compiler prints a summary and writes the output xmodel file into the folder. That is the file you need for this project.
The post-processing Python program for gait analysis extracts joint-angle time series for the knee and hip joints from the landmark time series. A Kalman filter and a discrete Fourier transform are applied to remedy the prediction error of the deep learning model and to eliminate unexpected noise in the signal. Fundamental gait parameters (e.g. flexion angle, maximum and minimum angle, phase difference) extracted from the joint-angle signal are stored in a CSV file together with each sample's basic information (age, sex, weight, height). A PCA model (eigenvalues and eigenvectors) trained on those data can predict a patient's extent of recovery using the Hotelling distance.
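Below is a minimal sketch of that post-processing idea, assuming the pose model produces per-frame (x, y) landmarks for the hip, knee and ankle; the function names, filter gains and cut-off frequency are illustrative assumptions, not the project's actual parameters.

# Minimal sketch: joint-angle extraction, Kalman smoothing and DFT denoising.
# All constants below (q, r, cutoff_hz, fps) are placeholder assumptions.
import numpy as np

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c,
    e.g. the knee angle from hip (a), knee (b) and ankle (c) landmarks."""
    v1, v2 = a - b, c - b
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def kalman_smooth(z, q=1e-3, r=1e-1):
    """Simple 1-D Kalman filter to damp per-frame jitter in the predicted
    angle series z. q and r are illustrative process/measurement noise."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for i, meas in enumerate(z):
        p += q                    # predict step
        k = p / (p + r)           # Kalman gain
        x += k * (meas - x)       # update with the measured angle
        p *= (1.0 - k)
        out[i] = x
    return out

def fft_lowpass(signal, fps, cutoff_hz=6.0):
    """Discrete Fourier transform low-pass: zero frequency components above
    cutoff_hz (gait harmonics sit well below this) and invert."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spec[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spec, n=len(signal))

# Example: knee angle over a walking clip sampled at 30 fps.
fps = 30
hip, knee, ankle = (np.random.rand(300, 2) for _ in range(3))  # placeholder landmarks
raw = np.array([joint_angle(h, k, a) for h, k, a in zip(hip, knee, ankle)])
clean = fft_lowpass(kalman_smooth(raw), fps)

# Fundamental gait parameters to store in the CSV row alongside the
# sample's basic information (age, sex, weight, height).
row = {"max_angle": float(clean.max()),
       "min_angle": float(clean.min()),
       "flexion_range": float(clean.max() - clean.min())}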
Through real-time video analysis, we obtain the main key points from human pose detection, and then apply the post-processing scripts to compute the detailed joint-angle parameters used for clinical analysis. This application can serve many sport-science and rehabilitation monitoring and feedback tasks and, together with cloud and IoT frameworks, can be used in future smart healthcare systems.
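As a complement, here is a sketch of how the PCA model and Hotelling distance mentioned above could score a new sample; the feature layout and the reference data are assumptions for illustration, not the project's trained model.

# Sketch: PCA fit on reference gait-parameter rows and Hotelling's T^2 score.
import numpy as np

def fit_pca(X, n_components=3):
    """Fit PCA on reference feature rows X (samples x features); return the
    mean, eigenvalues and eigenvectors of the retained components."""
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    return mean, eigvals[order], eigvecs[:, order]

def hotelling_t2(x, mean, eigvals, eigvecs):
    """Hotelling's T^2 of one feature vector in the PCA subspace:
    squared component scores weighted by the inverse eigenvalues."""
    scores = (x - mean) @ eigvecs
    return float(np.sum(scores ** 2 / eigvals))

# Usage with placeholder data: rows would be the gait parameters from the CSV
# (e.g. flexion range, max/min angle, phase difference). If the reference rows
# come from normal gait, a small T^2 suggests the patient's gait is close to
# normal, while a large T^2 indicates a gait far from the reference population.
X_ref = np.random.rand(50, 6)                 # placeholder reference samples
mean, eigvals, eigvecs = fit_pca(X_ref)
t2 = hotelling_t2(np.random.rand(6), mean, eigvals, eigvecs)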