In this project we will implement an assistant for visually impaired people, helping them orient themselves better in their environment. The system will warn the user about obstacles in their path: it will detect surrounding objects and provide feedback through buzzers when the user gets too close to them.
Implementation plans:
In this project we will use the Zybo Z7 development board. Based on readings from several ultrasonic range finders (sensors that measure the distance to objects), the system will gather information about the surrounding environment and provide physical feedback through buzzers.
The innovative part of our project is that we will use the ultrasonic range finders to build a phased-array sonar: rather than rotating a sensor to change its look direction, steering is achieved by combining the signals from an array of fixed sensors, each driven with a different phase.
The FPGA part of the Zybo Z7 will be responsible for the ultrasonic range finders (control and measurement) and will communicate with the ARM part, which will be responsible for controlling the buzzers and managing the information received from the FPGA.
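One way the ARM side could turn a distance reading into buzzer feedback is to map distance to a beep period, so closer obstacles produce faster beeping. The thresholds and the linear mapping below are our own assumptions for illustration, not a specification of the final firmware:

```c
/* Hypothetical sketch of ARM-side feedback logic: map a distance
 * reading (cm) received from the FPGA to a pause between beeps.
 * All thresholds below are assumed values. */
#define MIN_CM   20   /* closer than this: continuous tone */
#define MAX_CM  200   /* farther than this: buzzer silent  */

/* Returns the pause between beeps in ms; 0 = continuous, -1 = off. */
int beep_period_ms(int distance_cm)
{
    if (distance_cm >= MAX_CM) return -1;   /* nothing nearby   */
    if (distance_cm <= MIN_CM) return 0;    /* imminent obstacle */
    /* linear map: 20 cm -> 0 ms pause, 200 cm -> 900 ms pause */
    return (distance_cm - MIN_CM) * 5;
}
```

A real implementation would drive a PWM or GPIO pin from this period inside a timer loop; the sketch only captures the distance-to-urgency mapping.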
Both the sensors and the camera will be mounted on a band worn on the user's head, so that the system can detect objects in the user's path.
We implemented one AXI IP in VHDL to act as a custom driver for the Pmod MaxSonar, because we want to be able to use three distance sensors within the same IP.
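Since the IP serves three sensors, the ARM side would typically read one memory-mapped register per sensor. The base address and register layout below are hypothetical, shown only to illustrate how software might access such a custom AXI IP:

```c
/* Illustrative sketch of reading the three distance values exposed
 * by the custom AXI IP. SONAR_BASE_ADDR and the one-register-per-
 * sensor layout are assumptions, not the project's actual memory map. */
#include <stdint.h>

#define SONAR_BASE_ADDR  0x43C00000u           /* hypothetical AXI base */
#define SONAR_REG(n)     (SONAR_BASE_ADDR + 4u * (n))

/* Read the latest distance measured by sensor n (0..2), assuming the
 * IP publishes each result in its own 32-bit read-only register. */
static inline uint32_t sonar_read_cm(unsigned n)
{
    volatile uint32_t *reg = (volatile uint32_t *)(uintptr_t)SONAR_REG(n);
    return *reg;
}
```

Packing all three sensors behind one AXI-Lite interface keeps the address decoding in a single place and avoids instantiating three separate IP cores.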
VHDL Hardware design
This is a short video showing the current status of our project: