PROBLEM: Visually impaired people often depend heavily on others for help. This becomes a serious problem when no one is around to help, and in some cases people feel annoyed and simply ignore them. I have personally seen visually impaired people asking strangers for help in public settings, and most of those strangers ignored them. The underlying causes of this issue are blindness and over-dependence on other people.
SOLUTION: The issue can be addressed now that advanced language models can assist people in many different ways. Why not create a wearable AI device where a cloud-hosted AI model processes data from different sensors and provides audio feedback to the visually impaired user? Each person would have their own AI assistant trained on their own data. This would help reduce their dependence on other people.
FEEDBACK FROM CONTEST MASTERS:
The problem as you have stated it is extremely generalized and speaks of ableist behaviors. You have identified the problem as blindness, which is very general, and over-dependence on people, with the result of annoying sighted people when asking for help. But a lack of empathy or understanding from individuals seems to be the bigger problem, and it may have as much to do with being annoyed by an over-dependent visually impaired person asking for assistance as with the visual impairment itself.
The problem as you have stated it is very general. Visually impaired people's dependence on sighted people for assistance is the beginning of a conversation about how visually impaired people can become more independent in many smaller ways. A lack of assistance when no one is around could apply to many different tasks or functions. A sighted person becoming annoyed when asked for aid could be attributed to a lack of knowledge of, or empathy towards, blind or disabled people, or even just strangers in need, and could be a problem in many communities regardless of the technologies available. Many great people aren't annoyed when help is needed and are willing to do whatever they can to improve the lives of others, including by taking part in this Build2gether 2.0 contest.
In my personal experience as a visually impaired person, I have been helped often when I’ve needed it. How can technology help visually impaired people become more independent?
So, based on the feedback I received, I decided to build an AI assistive technology that would help visually impaired people travel independently within the city.
Navigating public transportation and unfamiliar surroundings can be challenging for visually impaired individuals. This project aims to make travel safer and more accessible by combining AI technology with a compact, user-friendly device. Using an ESP32-S3 microcontroller and the Grove Vision AI Module V2, we’ve created a smart assistant that helps users identify the right bus to board and recognize objects around them.
The project consists of two main features: First, a bus detection system that tells the user which bus to board based on their destination, ensuring they never miss their ride. Second, an object recognition system that uses Google’s Vision API to identify everyday objects captured by the camera, providing valuable information about the user’s environment. Together, these features empower visually impaired individuals to travel more independently and confidently.
From the feedback I understood that I had to build something that might improve the lives of visually impaired people, even by just a little, so I thought of building something that would help them navigate the city easily.
A visually impaired person can tell the Unihiker where they want to go. Its built-in microphone recognizes the speech, the database is searched for the correct route to the destination, and whenever a bus arrives, the Grove Vision module recognizes the bus number and sends it to the Unihiker. If the bus number matches the one in the database, a message to board the bus is sent to the user via Bluetooth. And whenever there is a fall due to a small accident, the Blues kit alerts the user's relatives about the fall.
In addition to this, I added object detection using the XIAO ESP32S3 and the Google Vision API.
1. Bus Route Detection: Collecting Data
- The first thing we need to do is collect the bus numbers, routes, and the source and destination of each bus in the local area.
- I collected route data for all buses in Bengaluru and saved it in an Excel file.
- I stored the data in this format.
- Now we need as many pictures of different kinds of buses as possible to train our AI model using TinyML.
- I tried to collect as many pictures as I could, but unfortunately taking photos was prohibited, so I was not able to get much data. I was caught twice and got scolded while taking photos, so I am still thinking of an alternative way to implement bus number detection. For now, we can only detect buses, not the number on the bus, due to insufficient data to train the model 😔😓. I tried my best till the end.
- Connect the Unihiker to a computer and go to 10.1.2.3 while the Unihiker is connected.
- Go to file upload.
- Select the destination folder and upload the Excel sheet with all the bus data.
- The Excel sheet is now saved in the Unihiker's memory and can be used via Jupyter Notebook.
- Copy the BusRouteFinder.ipynb code from the repo and paste it into a Jupyter notebook.
- Install all the necessary libraries.
- Change the path to wherever you saved the CSV file.
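The route lookup in the notebook boils down to filtering the sheet by the spoken destination. Here is a minimal sketch of that step, assuming hypothetical column names (`bus_number`, `source`, `destination`) and in-memory sample data in place of the real Bengaluru sheet:

```python
import pandas as pd

# Hypothetical column names -- adjust these to match your own sheet.
ROUTE_COLUMNS = ["bus_number", "source", "destination"]

def find_buses(routes: pd.DataFrame, destination: str) -> list[str]:
    """Return the bus numbers whose route ends at the spoken destination."""
    matches = routes[routes["destination"].str.lower() == destination.lower()]
    return matches["bus_number"].astype(str).tolist()

# Example rows standing in for the real Excel data:
routes = pd.DataFrame(
    [["335E", "Majestic", "Whitefield"],
     ["500D", "Silk Board", "Hebbal"]],
    columns=ROUTE_COLUMNS,
)
print(find_buses(routes, "whitefield"))  # ['335E']
```

In the real notebook the DataFrame would come from `pd.read_excel()` (or `pd.read_csv()`) on the uploaded file, and the destination string would come from the speech recognizer.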
- Now connect a Bluetooth earbud to the Unihiker (we can also use a Bluetooth speaker or an I2S converter).
- Go to the Jupyter Notebook page > New > Terminal and enter the following commands.
Enable Bluetooth.
sudo systemctl enable bluetooth
sudo systemctl start bluetooth
Start Bluetooth control
bluetoothctl
Turn on Bluetooth and Set to Discoverable
power on
agent on
scan on
Once the Bluetooth earbud appears in the list of discovered devices, note its MAC address:
pair 12:34:56:78:9A:BC
trust 12:34:56:78:9A:BC
connect 12:34:56:78:9A:BC
After this, Bluetooth will be configured with the Unihiker.
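The terminal steps above can also be scripted, so the earbud reconnects without typing into `bluetoothctl` each time. A minimal sketch, assuming the same BlueZ `bluetoothctl` workflow; the helper names are my own:

```python
import subprocess

def bluetoothctl_sequence(mac: str) -> list[str]:
    """The bluetoothctl commands from above, in order, for a given MAC address."""
    return [
        "power on",
        "agent on",
        "scan on",
        f"pair {mac}",
        f"trust {mac}",
        f"connect {mac}",
    ]

def pair_earbud(mac: str) -> None:
    """Pipe the command sequence into bluetoothctl (needs a running BlueZ stack)."""
    script = "\n".join(bluetoothctl_sequence(mac)) + "\nquit\n"
    subprocess.run(["bluetoothctl"], input=script, text=True, check=True)

print(bluetoothctl_sequence("12:34:56:78:9A:BC")[3])  # pair 12:34:56:78:9A:BC
```

In practice, scripted pairing may still need a short wait after `scan on` before the device shows up, so interactive pairing once and then relying on `trust`/`connect` is the more robust path.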
Train the model:
- We can add any existing model to the Grove module via SenseCraft.
- SenseCraft is a platform with various AI models that are ready to use without any code.
- But we will be using Edge Impulse, as we need to train the model on custom data.
- First, label the bus data.
- Create an impulse
- Pre-processing: besides resizing the images, we can convert them to grayscale or keep the actual RGB color depth. Let's start by selecting [RGB] in the Image section. Doing that, each data sample will have a dimension of 27,648 features.
- To export the model, go to the dashboard, select the Transfer learning model - TensorFlow Lite (int8 quantized), and download it.
- After downloading the model, change its extension to .tflite.
- Go to Google Colab now and run these commands:
!pip install ethos-u-vela
!vela --version
- Upload the .tflite file and run the command:
!vela ei-build2gether-int8.tflite --accelerator-config ethos-u55-64
You will get a new file, ei-build2gether-int8_vela.tflite.
- Now go to Home - SenseCraft AI (seeed.cc) and select "Deploy your own custom model".
- After adding the model, you can connect the Grove module and deploy the model to it.
- First we need to download the Arduino SSMA library as a .zip file from its GitHub page and install it in the Arduino IDE (Sketch > Include Library > Add .ZIP Library).
- Now attach the XIAO ESP32S3 to the Grove module.
- Copy the ESP32S3detect.ino code, paste it into the Arduino IDE, and run it while the ESP32S3 is connected.
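On the Unihiker side, the check against the database boils down to a small matching step: the number the vision module reports either matches one of the buses serving the chosen route or it doesn't. A sketch in Python, with hypothetical function and variable names:

```python
def should_board(detected_number: str, expected_numbers: set[str]) -> bool:
    """True when the bus number read by the vision module matches one of the
    buses serving the user's route, ignoring case and surrounding whitespace."""
    return detected_number.strip().upper() in {n.upper() for n in expected_numbers}

# Buses that reach the requested destination (would come from the route lookup):
expected = {"335E", "500D"}
print(should_board(" 335e ", expected))  # True
print(should_board("410", expected))    # False
```

Only when this returns True would the "board the bus" message be sent to the earbuds over Bluetooth.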
- For fall detection, we assume the Unihiker will always be with the user.
- We will be using I2C communication for this.
- After connecting the Notecard to the ESP8266, we will have to install the Blues Notecard library.
Notecard Quickstart - Blues Developers
You can follow this documentation to set up the Blues Notecard.
I will be using the WiFi Notecard, as the cellular Notecard does not work in India; you can use the cellular module instead, as it provides GPS functionality.
- Copy the blues.ino code and change the WiFi SSID and password.
Change SSID and Password here
- Now go to Notehub and create a new project.
Product UID
- Copy the product UID, paste it into the code, and flash the code to the ESP8266.
- Wait for it to run; the first time, check Notehub to see whether it has connected.
- Now it's time to add the routes.
- First, create a Twilio account and get an API key.
Click on create Route
- Select Twilio, and copy the Account SID, Auth Token, and From number from your Twilio account.
- Fill in the details in the Twilio route and add the desired message.
- Go to Filters > Notefiles, select notefiles, and choose emergency.qo for the SOS signal.
- Repeat the same for fall detection, except this time use FD.qo.
- The Unihiker is attached to the wheelchair, so whenever the Unihiker falls, a fall-detected signal is sent.
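The actual fall check lives in the firmware, but the idea can be sketched as a simple accelerometer heuristic: a brief near-zero-g window (free fall) followed by a hard spike (impact). The thresholds below are illustrative assumptions, not tuned values:

```python
import math

FREE_FALL_G = 0.4   # magnitude well below 1 g suggests free fall (assumed threshold)
IMPACT_G = 2.5      # a spike above this shortly after suggests impact (assumed threshold)

def is_fall(samples: list[tuple[float, float, float]]) -> bool:
    """Very simple heuristic: a near-zero-g reading followed later by a hard impact."""
    saw_free_fall = False
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude < FREE_FALL_G:
            saw_free_fall = True
        elif saw_free_fall and magnitude > IMPACT_G:
            return True
    return False

# Resting (~1 g), free fall (~0 g), then impact (~3 g):
print(is_fall([(0, 0, 1.0), (0, 0, 0.1), (0, 0, 3.0)]))  # True
```

When such a pattern is seen, the device would write a note to FD.qo so the Twilio route above texts the relatives.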
- This part is much simpler and more straightforward. First, copy the ObjDect.ino code and replace the API key with your Google Vision API key.
- Connect one side of the button to GPIO 4 and the other to GND.
- Point the ESP32S3 at the object you want to detect and press the button.
- Change the IP address to your Unihiker's IP address; you can use any custom port.
- For the Unihiker, upload the code I provided.
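For reference, the request the sketch sends corresponds to Google Vision's `images:annotate` REST endpoint with a LABEL_DETECTION feature. A Python sketch of the same payload (the fake JPEG bytes are just a placeholder):

```python
import base64
import json

def vision_request(jpeg_bytes: bytes, max_results: int = 5) -> dict:
    """Build the JSON body for Google Vision's images:annotate endpoint."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(jpeg_bytes).decode("ascii")},
            "features": [{"type": "LABEL_DETECTION", "maxResults": max_results}],
        }]
    }

body = vision_request(b"\xff\xd8placeholder-jpeg-bytes")
# POST this as JSON to:
#   https://vision.googleapis.com/v1/images:annotate?key=<YOUR_API_KEY>
print(json.dumps(body)[:72])
```

The response contains a `labelAnnotations` list; reading out the top label descriptions over the earbuds gives the spoken object description.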
- The output can also
Working on this project has been a great experience for me, and I've learned a lot of new things along the way. It feels good to build something that could actually help people, especially those who are visually impaired. Right now, the project can detect buses, but it’s not yet able to recognize bus numbers clearly. This is something I’m still working on, and I’m going to keep looking for better ways to solve this.
I know there’s more to learn, and I’m excited to keep improving this project. I believe that with some more effort and new ideas, I can make it even better and more useful. My goal is to make sure it can help people navigate easily and safely. I’m looking forward to what comes next and how this project can continue to grow and make a real difference.