Whoever said that diamonds are a girl's best friend never had a dog. My best friend was Korsan, a brown Cocker Spaniel. Sadly, when I was eight, he passed away, and I was heartbroken. Then, while on holiday in France, I saw a guide dog with its visually impaired owner. My parents explained how a guide dog helps visually impaired people in their daily lives. I loved the idea that dogs could help people; however, it made me think of Korsan and how sad I was when he died. I thought that if I had been so upset, how would a blind person feel when they lost their dog? Not only would they lose their best friend, but they would also lose their eyes again.

# IC4U

So, I decided to build IC4U, a robot guide dog. Before I started to build IC4U, I decided which features I would like to add, so I contacted the guide dog associations in the United States and the United Kingdom. They were amazing and answered all my questions. I learned how guide dogs are trained, how they behave, and what their relationship with visually impaired people is like. A guide dog needs to obey the blind person's commands, learn to navigate, and show intelligent disobedience. Specifically, the guide dog needs to:
- Adjust its speed and direction, changing course when there is an obstacle in its path,
- Navigate from A to B (on familiar routes),
- In dangerous circumstances, refuse to obey the blind person's command (intelligent disobedience) and act autonomously instead.
I immediately started to integrate these features into IC4U. I built the first version in 2018 with an Arduino Uno.
A visually impaired person can give voice commands to control IC4U via a mobile app. As I had limited knowledge at the time, I implemented the intelligent disobedience ability with the help of sensors. To detect obstacles, I used an ultrasonic sensor in IC4U's eyes; when an obstacle is detected, IC4U comes to a stop. IC4U also includes gas and rain sensors. When IC4U detects fire, gas, or rain, it notifies the visually impaired person via phone notifications and a buzzer.
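Here is a minimal sketch of that sensor-driven decision loop, written in Python for illustration (the first version actually ran as an Arduino C++ sketch on the Uno). The sensor helpers and the 20 cm stopping threshold are assumptions for the sketch, not values from the original build.

```python
import random
import time

STOP_DISTANCE_CM = 20  # assumed threshold, not a documented value

def read_ultrasonic_cm():
    # Stand-in for the HC-SR04-style distance reading from IC4U's eyes.
    return random.uniform(5, 200)

def gas_detected():
    return False  # stand-in for the gas sensor's digital output

def rain_detected():
    return False  # stand-in for the rain sensor's digital output

def stop_motors():
    print("Stopping: obstacle ahead")

def alert(message):
    # On IC4U this rings the buzzer and pushes a phone notification.
    print("ALERT:", message)

for _ in range(50):  # main control loop (bounded here for the sketch)
    if read_ultrasonic_cm() < STOP_DISTANCE_CM:
        stop_motors()  # intelligent disobedience: ignore "go" while blocked
    if gas_detected():
        alert("fire or gas detected")
    if rain_detected():
        alert("rain detected")
    time.sleep(0.1)
```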
I later updated IC4U with the Adafruit GPS/GPRS Shield. Using the MQTT protocol, I linked the shield to the Adafruit IO service. This made it possible to track the visually impaired person's location on a map, along with IC4U's battery status and warnings.
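On the robot this MQTT traffic came from the shield itself; the sketch below shows an equivalent publish flow in Python with the paho-mqtt library. The username, key, and feed names are placeholders. Adafruit IO expects topics of the form `username/feeds/feed-name`, and its map block reads a CSV feed of `value,latitude,longitude,elevation`.

```python
import paho.mqtt.client as mqtt

AIO_USERNAME = "your_adafruit_username"  # placeholder
AIO_KEY = "your_adafruit_io_key"         # placeholder (Adafruit IO key)

client = mqtt.Client()                   # paho-mqtt 1.x style constructor
client.username_pw_set(AIO_USERNAME, AIO_KEY)
client.connect("io.adafruit.com", 1883)  # Adafruit IO's MQTT broker

# Publish a GPS fix so Adafruit IO can plot it on a map.
client.publish(f"{AIO_USERNAME}/feeds/ic4u-location/csv", "0,48.8584,2.2945,0")

# Battery status and warnings go to their own feeds.
client.publish(f"{AIO_USERNAME}/feeds/ic4u-battery", "87")
client.publish(f"{AIO_USERNAME}/feeds/ic4u-warnings", "obstacle ahead")
client.disconnect()
```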
# IC4U2

As I enjoyed working on the robot, I decided to use IC4U as a development platform for myself and began work on the second version: IC4U2. As I wanted to add AI and ML applications such as object detection and voice feedback, I needed a more powerful board, so I chose a Raspberry Pi 3B+.
I installed a Google AIY Voice Kit on IC4U2. The voice kit gave the visually impaired person the ability to give voice commands directly to the robot instead of via an app. I also added kneecaps to this version's legs and installed servo motors so that IC4U2 can sit down, stand up, and lie down.
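Below is a minimal sketch of how spoken commands can map to leg poses; the joint names, servo angles, and hard-coded command are illustrative assumptions rather than IC4U2's actual values. On the robot, the recognized text would come from the AIY kit's speech recognizer.

```python
# Illustrative mapping from spoken commands to leg poses.
POSES = {
    "sit":   {"hips": 40, "knees": 120},
    "stand": {"hips": 90, "knees": 90},
    "lie":   {"hips": 20, "knees": 160},
}

def set_servo(joint, angle):
    # Stand-in for the PWM call that drives one group of servos.
    print(f"moving {joint} servos to {angle} degrees")

def handle_command(text):
    for command, pose in POSES.items():
        if command in text.lower():
            for joint, angle in pose.items():
                set_servo(joint, angle)
            return True
    return False

handle_command("IC4U, sit")  # recognized speech would arrive here on the robot
```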
In this version, I used artificial intelligence and machine learning to detect objects rather than an ultrasonic sensor. IC4U2 processes images using a Raspberry Pi Camera, TensorFlow (a machine learning platform), OpenCV (an image processing library), and the "MS COCO" dataset. For example, IC4U2 stops when it sees a stop sign and tells the visually impaired person why it is stopping. Using the Dialogflow platform, IC4U2 can also hold conversations with the visually impaired person and others. I also integrated Google Maps so that the visually impaired person could get directions from IC4U2.
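Here is a minimal sketch of that detection loop, assuming a COCO-trained SSD model exported to TensorFlow Lite. `detect.tflite` and `coco_labels.txt` are placeholder file names, and the output-tensor order is the one used by the standard TFLite detection postprocess (boxes, classes, scores, count).

```python
import cv2
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="detect.tflite")  # placeholder path
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
labels = open("coco_labels.txt").read().splitlines()  # one class name per line

def stop_and_announce():
    # On IC4U2 this halts the motors and speaks the reason aloud.
    print("Stopping: I see a stop sign.")

cap = cv2.VideoCapture(0)  # Raspberry Pi Camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize to the model's input size and add a batch dimension
    # (assuming a quantized model with uint8 input).
    h, w = input_details[0]["shape"][1:3]
    blob = cv2.resize(frame, (w, h))[np.newaxis, ...].astype(np.uint8)
    interpreter.set_tensor(input_details[0]["index"], blob)
    interpreter.invoke()
    classes = interpreter.get_tensor(output_details[1]["index"])[0]
    scores = interpreter.get_tensor(output_details[2]["index"])[0]
    for cls, score in zip(classes, scores):
        if score > 0.5 and labels[int(cls)] == "stop sign":
            stop_and_announce()
```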
# IC4U3

I decided that I wanted to develop IC4U's AI abilities even further by adding more naturalistic features, so I set out to build IC4U3, the third version of my robot guide dog. I wanted to add a lidar to IC4U3 so that it would be able to 3D-map its surroundings. In November 2021, I was invited to the World Summit AI as a panelist, where I met some students from TU Delft. They invited IC4U2 and me to visit their robotics labs and meet their professor, Dr. Chris Verhoeven. I told him all about my robot and what I planned to do. He helped me understand that I didn't need to add a lidar to IC4U3, as it would only cause unnecessary data overload: instead of only recognizing objects coming toward it, the robot would be detecting every object in its environment. He suggested it would be better to make IC4U mimic a real dog.
Following his advice, I first added sound sensors as its ears, so that IC4U3 first hears a sound and turns its head to look in that direction before processing the object. However, simply hearing the sound is not enough, so I also included a ZED 2i Wide-Angle 3D AI Camera so that IC4U3 could detect the object from a wider angle and judge its size and speed more accurately.
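A minimal sketch of that hear-then-look behavior follows, assuming one sound sensor per ear and a single neck servo (the actual sensor count and geometry aren't detailed here).

```python
import random
import time

def read_sound_level(side):
    # Stand-in for the analog level from the sound sensor on one ear.
    return random.random()

def turn_head(angle_deg):
    # Stand-in for the neck servo: 0 = left, 90 = forward, 180 = right.
    print(f"turning head to {angle_deg} degrees")

for _ in range(20):  # bounded here for the sketch
    left, right = read_sound_level("left"), read_sound_level("right")
    if abs(left - right) > 0.2:  # a sound clearly off to one side
        turn_head(0 if left > right else 180)
        # With the head facing the sound, the ZED 2i camera takes over to
        # identify the object and estimate its size, distance, and speed.
    else:
        turn_head(90)  # nothing notable, face forward
    time.sleep(0.5)
```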
To run the ZED 2i camera and handle the heavier image processing, I used an NVIDIA Jetson Nano. I was so impressed with the ZED 2i camera's performance that I didn't want to limit its use to a simple object recognition task, so I started to think of additional ways IC4U could help a visually impaired person.
One of these features was an upgraded version of IC4U's city object detection. IC4U can not only detect traffic lights, stop signs, and so on, but can now also recognize which color a traffic light is showing.
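Once a detector has produced a traffic-light bounding box, one straightforward way to classify the light is HSV thresholding with OpenCV; the ranges below are common starting values, not IC4U's tuned ones.

```python
import cv2

def classify_traffic_light(roi_bgr):
    """Return 'red', 'yellow', or 'green' for a cropped traffic-light image."""
    hsv = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2HSV)
    # Rough HSV ranges; red wraps around the hue axis, so it needs two masks.
    masks = {
        "red": cv2.inRange(hsv, (0, 100, 100), (10, 255, 255))
               | cv2.inRange(hsv, (160, 100, 100), (180, 255, 255)),
        "yellow": cv2.inRange(hsv, (20, 100, 100), (35, 255, 255)),
        "green": cv2.inRange(hsv, (40, 100, 100), (90, 255, 255)),
    }
    # The lit lamp contributes the most pixels to its color mask.
    return max(masks, key=lambda color: cv2.countNonZero(masks[color]))
```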
IC4U can now also help with shopping. A visually impaired person can show IC4U an item they would like to buy, and IC4U can tell them where to buy it, how much it costs, and whether it is in stock by collecting data from various online retailers. I used web scraping for this; however, it would be much more effective if IC4U could connect directly to online retail store databases. Another shopping feature is that IC4U3 can detect banknotes and recognize their values.
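Here is a minimal sketch of the scraping step with requests and BeautifulSoup. The retailer URL and CSS selectors are made up, and every real store needs its own, which is exactly why a direct database or API connection would be more robust.

```python
import requests
from bs4 import BeautifulSoup

def lookup_product(query):
    # Hypothetical retailer search page; URL and selectors are placeholders.
    page = requests.get("https://example-retailer.com/search",
                        params={"q": query}, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    first = soup.select_one(".product-card")
    if first is None:
        return None
    return {
        "name": first.select_one(".product-title").get_text(strip=True),
        "price": first.select_one(".product-price").get_text(strip=True),
        "in_stock": first.select_one(".out-of-stock") is None,
    }

print(lookup_product("dog harness") or "No match found")
```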
I also made it possible for IC4U3 to respond to “Okay IC4U” instead of “Okay Google.”
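The post doesn't say how the wake word was swapped; one common way to get a custom wake word on a small board is a lightweight hotword engine such as Picovoice Porcupine, sketched below with a placeholder access key and a hypothetical keyword file trained for "Okay IC4U".

```python
import pvporcupine
from pvrecorder import PvRecorder

porcupine = pvporcupine.create(
    access_key="YOUR_PICOVOICE_KEY",   # placeholder
    keyword_paths=["okay-ic4u.ppn"],   # hypothetical custom keyword file
)
recorder = PvRecorder(frame_length=porcupine.frame_length)
recorder.start()
while True:
    pcm = recorder.read()              # one frame of 16-bit audio samples
    if porcupine.process(pcm) >= 0:    # returns the keyword index, or -1
        print("Wake word detected: listening for a command...")
```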
To see more of IC4U's videos, please check out my YouTube channel.