Hello, and thank you so much for checking out this project! I know it's only been about 15 seconds since you've clicked on this submission, but I want you to do me a quick favor and close your eyes for about 5 seconds. Seriously.
Now, imagine making your way to a car parked outside your garage in that complete darkness. Difficult, right? Think about the number of walls and railings you would have bumped into, and the number of flower vases you would have knocked over clumsily making your way to the garage. You could have even tripped and gotten seriously injured!
Well, for the nearly 285 million people across the globe who live with serious visual impairment, an estimated 40 million of whom are completely blind, challenges like this are a part of everyday life. For this project, I wanted to build a low-power, low-cost, and practical indoor navigation aid for people with severe visual impairment and blindness.
But honestly, I wasn't sure where to start. So I looked at a tiny little creature that relies heavily on a sense other than vision to navigate: the bat. When navigating cave environments, bats use echolocation. In this process, they emit sound waves and wait for them to bounce off obstacles in their environment, giving them information about their proximity to various surroundings.
This really got me thinking. What if we could give humans the ability to use echolocation? This would present an effective solution to the aforementioned problem of safely navigating indoor environments without vision. Well, thanks to the folks over at Neosensory, this seemingly surreal task is quite possible.
My solution, echoSense, is a low-cost, low-power glove embedded with an HC-SR04 ultrasonic sensor, an Adafruit nRF52840 Bluefruit microcontroller board, and various other electronic components. Over BLE, it communicates MCU-processed ultrasonic sensor data to the user via the Neosensory Buzz haptic feedback interface, effectively empowering the user with the ability to use echolocation.
Pretty cool, huh?
So How Does it Work?
The concept behind how our device works is quite simple. The HC-SR04 ultrasonic sensor has two pins connected to GPIO pins on the MCU: the "trigger" pin and the "echo" pin.
When the trigger pin is activated, the ultrasonic sensor sends out a sound wave, which bounces off objects and obstacles in its path and returns to the sensor. When the wave returns, the sensor signals the "echo" pin, and the connected microcontroller reads the timing data.
By measuring the time between when the sound wave was emitted and when it returned, multiplying this round-trip time by the speed of sound in air, and halving the result (since the wave travels to the object and back), the MCU can accurately gauge the distance to an object (up to roughly 13 feet).
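To make that timing concrete, here's a minimal Arduino-style sketch of the measurement itself. It's a simplified illustration rather than the exact code from our repository, and the pin choices are placeholders for whichever GPIO pads you wire up:

```
// Minimal HC-SR04 distance measurement sketch (Arduino).
// TRIG_PIN and ECHO_PIN are placeholder pad numbers -- use whatever
// GPIO pads you actually wired on your board.
const int TRIG_PIN = A1;
const int ECHO_PIN = A2;

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

float readDistanceCm() {
  // Send a 10 microsecond trigger pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // pulseIn() returns the round-trip echo time in microseconds
  // (0 if no echo arrives before the timeout).
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000);
  if (duration == 0) return -1.0;  // no object detected in range

  // distance = (round-trip time * speed of sound) / 2
  // Speed of sound in air is roughly 0.0343 cm per microsecond.
  return (duration * 0.0343) / 2.0;
}

void loop() {
  Serial.println(readDistanceCm());
  delay(100);
}
```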
Now, we can take this processed distance data and present it to the user in a way that allows them to make sense of it by utilizing the Neosensory Buzz. And suddenly, humans gain the sensory superpower of echolocation!
Hardware Design
echoSense has a pretty simple design, but it's important to get it right to avoid damage to your MCU. Let's first take a look at the main components of the device.
At the core we have an Adafruit Circuit Playground Bluefruit. Now, I know what you're thinking: isn't that a microcontroller meant for younger audiences? While that's true, this board comes with a vast array of onboard sensors, is fully compatible with the Arduino SDK since it has an nRF52840 chip, and has a design that allows the board to be easily sewn onto cloth (which is perfect for a wearable glove design).
The ultrasonic sensor is what gives our apparatus its echolocation capabilities. However, it's important to consider that most HC-SR04 sensors you'll come across operate on 5V logic, while Adafruit's microcontrollers operate on 3.3V logic. To avoid damaging your GPIO pins, it's important to add a pair of equal-value resistors as a voltage divider on the echo line! With two equal 2 kΩ resistors, the 5V echo signal is halved to about 2.5V, which is safe for a 3.3V input. The circuit diagram below shows the correct wiring for our project using two 2 kΩ resistors (assuming you're using an MCU that operates on 3.3V logic).
The entire apparatus is powered by a 3.7V, 500 mAh LiPo battery from Adafruit. A portable phone charger could also power the device through micro USB if you don't have access to this battery or don't feel like purchasing one. We tested it out, and it works!
To actually build the glove, the microcontroller was first sewn onto it. Next, all the components were soldered together (without the breadboard) as shown in the circuit diagram (Figure 3) above. Finally, another glove was put over it to protect the components inside and give the glove a more aesthetically pleasing appearance.
Now that we've discussed all the major hardware components of our device and have it built, let's have a brief software overview.
Software Overview
All of the code was developed in the Arduino IDE using both the Neosensory Arduino SDK and the Adafruit Circuit Playground library. We also made a modified HC-SR04 library, based on an existing one, so that the distance calculation happens automatically inside the library code.
This section is also where we discuss how the MCU converts the raw distance data into something the user can understand through the Neosensory Buzz!
Based on the distance to an object or wall (as detected by the HC-SR04 sensor), the MCU triggers the Buzz motors to vibrate at different intensities at several key distance thresholds. For example, when an object is detected within 1 foot of the glove, the MCU signals the motors on the Buzz to vibrate at maximum intensity, since the user is in close proximity to a wall and could bump into it if they aren't careful. For the next distance interval (from 1 foot out to a few feet), the vibration intensity drops to 30% of the maximum, indicating that the user is close to an object or wall, but not in a position where they could bump into it or knock something over. Past around 4 feet, the Buzz does not vibrate at all, to keep the apparatus from buzzing needlessly.
Of course, these threshold distances can easily be adjusted in the code to be more sensitive as the user becomes accustomed to wearing the device. For example, every 1 foot change in distance could cause the intensity to drop by 20%, out to 5 feet. However, for initial usage and for demonstration purposes, I used a less sensitive mapping of distance to vibration, since I'm not used to wearing the device yet.
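To make the mapping concrete, here's a simplified sketch of both the coarse mapping described above and the finer 20%-per-foot variant. The function names and exact thresholds are illustrative rather than the exact code from our repository; the returned intensity (0.0 to 1.0) is what would be handed to the Neosensory SDK call that drives the Buzz motors.

```
// Map a measured distance (in feet) to a Buzz vibration intensity (0.0 - 1.0).
// Thresholds mirror the coarse mapping described above and are easy to tweak.
float coarseIntensity(float distanceFt) {
  if (distanceFt < 0) return 0.0;      // no echo: stay quiet
  if (distanceFt < 1.0) return 1.0;    // very close: full intensity
  if (distanceFt < 4.0) return 0.3;    // nearby: gentle warning
  return 0.0;                          // far away: no vibration
}

// Finer mapping: intensity falls by 20% for every foot of distance, up to 5 feet.
float fineIntensity(float distanceFt) {
  if (distanceFt < 0 || distanceFt >= 5.0) return 0.0;
  return 1.0 - 0.2 * distanceFt;       // 1 ft -> 0.8, 2 ft -> 0.6, ...
}
```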
We also added the ability to sense light intensity via an onboard light sensor on the Adafruit Circuit Playground. This may also be quite useful for the blind, since light detection normally happens through the eyes. Light-sensing mode can be toggled with the built-in slide switch on the Circuit Playground.
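Here's a rough sketch of how that toggle might look using the Adafruit Circuit Playground library's slideSwitch() and lightSensor() helpers. The scaling of the light reading to a vibration intensity is just an illustrative choice, not the exact code from our repository.

```
#include <Adafruit_CircuitPlayground.h>

void setup() {
  CircuitPlayground.begin();
}

void loop() {
  if (CircuitPlayground.slideSwitch()) {
    // Light-sensing mode: read the onboard light sensor (roughly 0-1023)
    // and scale it into a 0.0-1.0 vibration intensity.
    uint16_t light = CircuitPlayground.lightSensor();
    float intensity = light / 1023.0;
    // ... send `intensity` to the Buzz via the Neosensory SDK ...
  } else {
    // Echolocation mode: measure distance with the HC-SR04 and map it
    // to an intensity as shown earlier.
  }
  delay(100);
}
```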
All associated code can be found in the GitHub repository at the end of this project.
Demonstration and Real-World Testing
Now it's time for a video demonstration! Here is a quick 3-minute video that recaps what was discussed in this project, along with a short demo.
I also tested the device while completely blindfolded, to emulate how a blind person would use it for indoor navigation and make sense of the echolocation data via haptic feedback from the Buzz.
Future Directions and Extensions
While echoSense is fully functional and works really well, there are opportunities for improvement to take the device to the next level. As I see it, there are three key areas that could be expanded upon to improve this device and potentially bring it to market.
1) A more sensitive mapping system
As discussed earlier, a more sensitive mapping system would allow the user to more precisely estimate the distance to various objects. This could be integrated over time as the user gets better at understanding the vibrations from the Buzz. Perhaps there could even be an app with a training game that has vibrations corresponding to fixed distances, so that the user can learn what each detected distance feels like. The same thing can be done for the light intensity feature.
2) A more secure and professional design
While the device is fully functional, its design is still a prototype. The overall aesthetics of the glove could definitely be improved. While the glove does allow for free finger movement (since the ultrasonic sensor is at the base of the palm), perhaps a more compact bracelet-type device that sits on the wrist could be a more practical design. Additionally, a 3V ultrasonic sensor could be used to eliminate the need for the resistors we added.
3) Integration with AI-based computer vision
Advancements in machine learning and AI have allowed state-of-the-art computer vision models (like YOLO) to run on small computers like the NVIDIA Jetson Nano and even on some tiny microcontrollers like the OpenMV Cam H7 Plus. Perhaps by integrating data from these machine learning models, this project could be taken to the next level. We'll discuss this further in the next section.
Roadmap: Taking it to the Next Level with Edge Impulse
While echoSense is a fully functional device, this section will demonstrate some experimental features that we could use to improve the project in the future. This section is not part of our main functional product; rather, it explains how Edge Impulse can be used to take our project to the next level. It also features a fully functional machine learning model built in Edge Impulse.
As discussed in the last section, one improvement that could be added to echoSense in the future is integration with AI-based computer vision. While these computer vision models normally require a pretty powerful graphics card, Edge Impulse and the OpenMV Cam H7 Plus make it possible to deploy powerful computer vision models to a microcontroller about as wide as two of your fingers and half as long. Furthermore, this whole setup consumes very little power (on the order of 1 W).
So how can we improve echoSense by integrating it with computer vision? Perhaps we could start by allowing the blind to recognize whether what is in front of them is a human or a non-human structure using Edge Impulse image classification. We can then set up a unique vibration pattern that triggers when a human is detected, alongside the haptic feedback from echoSense's ultrasonic sensor. In this section, we'll deploy a model in Edge Impulse that will allow us to accomplish this task.
The first step in training a model is creating a data set. Below, we've compiled a short data collection script in MicroPython for collecting data from the OpenMV Cam H7 Plus through the OpenMV IDE.
```
import sensor, image, time

# Initialize the camera sensor.
sensor.reset()
sensor.set_pixformat(sensor.RGB565)  # color images
sensor.set_framesize(sensor.QVGA)    # 320x240 resolution
sensor.skip_frames(time=2000)        # give the sensor time to settle

clock = time.clock()                 # track the frame rate

while True:
    clock.tick()
    img = sensor.snapshot()          # capture a frame (viewable/savable in the OpenMV IDE)
    print(clock.fps())               # report frames per second
```
After organizing the raw image data into classes, we can upload it to Edge Impulse for training of our image classification neural network.
After uploading all of the images, and splitting the data between training and testing categories, we can go ahead and develop our impulse.
In our example, we used a Keras-based neural network for deep learning, but transfer learning from MobileNet is also a great option for image classification on the OpenMV Cam H7 Plus.
Overall, our model achieved 97.9% accuracy on the testing data. Not bad! Since Edge Impulse fully supports the OpenMV Cam H7 Plus, we can deploy our impulse directly as an OpenMV library. Edge Impulse gives us a TFLite file that we can copy straight onto the board. We can also easily run this model with some MicroPython code (which can be found in our GitHub repository).
This model works quite accurately on the OpenMV Cam and runs at about 5 FPS.
The next step in our proposed setup is to allow for communication between the OpenMV Cam H7 Plus and our Adafruit microcontroller. This can be done by connecting the appropriate pins on the two boards, as illustrated below.
The OpenMV forum has an example script for sending data between the OpenMV Cam and the Adafruit device over serial. We can then run our Neosensory Arduino SDK code on the Adafruit microcontroller and use input from both the ultrasonic sensor and the OpenMV Cam H7's computer vision impulse to empower the echoSense user with even greater sensory abilities. In this manner, Edge Impulse can be leveraged to improve our device. However, these microcontrollers are still not powerful enough to run more taxing object-detection computer vision algorithms. Until they are, it makes more practical sense not to add computer vision to echoSense (to minimize production cost), which is why we didn't include it in our base project.
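As a rough illustration of the receiving side, here's what the Adafruit firmware might look like if the OpenMV Cam sent a simple text label (say, "person") over UART. The use of Serial1, the label string, and the triggerPersonPattern() helper are assumptions for illustration, not code from the forum script or our repository.

```
// Hypothetical receiver sketch: read a classification label sent by the
// OpenMV Cam over UART and react to it. Serial1, the label name, and
// triggerPersonPattern() are placeholders for illustration only.
void triggerPersonPattern() {
  // Here we would fire a distinct vibration pattern on the Buzz
  // via the Neosensory Arduino SDK.
}

void setup() {
  Serial1.begin(115200);  // assumes the board's Arduino core exposes a UART as Serial1
}

void loop() {
  if (Serial1.available()) {
    String label = Serial1.readStringUntil('\n');
    label.trim();
    if (label == "person") {
      triggerPersonPattern();  // unique pattern when a human is detected
    }
    // Otherwise, fall back to the regular ultrasonic distance feedback.
  }
}
```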
Concluding Remarks
We had a lot of fun working on echoSense! Thanks for checking it out, and feel free to message us if you have any specific questions.
Also a big thank you to Neosensory and Edge Impulse for organizing this contest.