Swimming is a beneficial physical activity for people of all abilities, but it can be challenging and risky for visually impaired people, who may have difficulty navigating the pool, avoiding obstacles, and staying within their lane. Existing solutions, such as tethering or swimming with a guide, can limit the swimmer's freedom and independence. There is therefore a need for a device that provides real-time feedback and guidance to visually impaired swimmers without compromising their mobility and enjoyment.
What are you going to build to solve this problem? How is it different from existing solutions? Why is it useful?

We are building SwimSense, a wearable device that uses haptic actuators to create tactile cues on the swimmer's forehead and wrists. Unlike tethering or guide-based approaches, SwimSense requires no external assistance or modification of the pool environment. A camera, an ultrasonic sensor, and a compass capture visual, distance, and heading information about the surroundings, and a Google Coral Dev Board processes the data with machine learning algorithms to detect lane lines, pool edges, obstacles, and other swimmers. Based on the detection results, SwimSense generates haptic signals that guide the swimmer to stay within the lane, avoid collisions, and keep a safe distance from the pool edge. It also reports the swimmer's speed, distance, and direction through distinct vibration patterns. A companion smartphone app lets the swimmer adjust the sensitivity, intensity, and frequency of the haptic signals to their preferences and needs.
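To make the feedback behaviour concrete, here is a minimal sketch of how detection results could be mapped to per-actuator vibration commands. The actuator names, thresholds, and 0-255 intensity scale are illustrative assumptions, not the final design:

```python
from dataclasses import dataclass

@dataclass
class HapticConfig:
    """User-adjustable parameters, set from the companion app (assumed fields)."""
    sensitivity_m: float = 0.5   # lane drift (metres) that triggers a cue
    intensity: int = 128         # base PWM duty cycle, 0-255
    pulse_hz: float = 2.0        # pulses per second for warning patterns

def haptic_cues(lane_offset_m: float, obstacle_dist_m: float,
                cfg: HapticConfig) -> dict:
    """Map detection results to per-actuator PWM intensities (0-255).

    Positive lane_offset_m means the swimmer has drifted right, so the
    left wrist is buzzed to cue a correction toward the left, and vice
    versa. The forehead actuator warns of obstacles ahead, ramping up
    as the ultrasonic distance shrinks below 3 m.
    """
    cues = {"left_wrist": 0, "right_wrist": 0, "forehead": 0}
    if lane_offset_m > cfg.sensitivity_m:
        cues["left_wrist"] = cfg.intensity
    elif lane_offset_m < -cfg.sensitivity_m:
        cues["right_wrist"] = cfg.intensity
    if obstacle_dist_m < 3.0:
        # Scale intensity with proximity, capped at full duty cycle.
        cues["forehead"] = min(255, int(cfg.intensity * (3.0 - obstacle_dist_m) / 3.0 * 2))
    return cues
```

The app would simply rewrite the `HapticConfig` fields over Bluetooth; the mapping itself stays on the device.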
How does your solution work? What are the main features? Please specify how you will use the Inclusive Innovation Challenge Hardware in your solution.

SwimSense consists of two wearable parts: a headband and a wristband. The headband houses a Google Coral Dev Board Micro, a microcontroller board with an Edge TPU co-processor for fast on-device machine learning inference. The board's onboard camera captures visual information about the surroundings, and an attached ultrasonic sensor measures the distance to nearby objects. The wristband contains a 3-axis accelerometer and a 3-axis gyroscope that track the swimmer's motion and orientation. Haptic actuators create tactile cues on the forehead and wrists. Inference is accelerated on the Edge TPU using TensorFlow Lite for Microcontrollers (TFLM), a framework for deploying machine learning models on microcontroller-class edge devices. The software applies computer vision and machine learning to the sensor inputs to detect lane lines, pool edges, obstacles, and other swimmers in the pool. Based on the detection results, the device generates haptic signals that guide the swimmer to stay within the lane, avoid collisions, and keep a safe distance from the pool edge, and it reports speed, distance, and direction through distinct vibration patterns. A smartphone app lets the swimmer adjust the sensitivity, intensity, and frequency of the haptic signals to their preferences and needs.
List the hardware and software you will use to build this.

Hardware:
1. Google Coral Dev Board Micro (with onboard camera)
2. Ultrasonic sensor
3. 3-axis accelerometer and 3-axis gyroscope
4. Compass
5. Linear resonant actuators (LRAs)
6. Eccentric rotating mass (ERM) vibration motors
7. Headband
8. Goggles
9. Wristband
10. Battery
11. Wires and connectors

Software:
1. TensorFlow Lite for Microcontrollers
2. Python with OpenCV
3. Arduino IDE
4. Android companion app
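The accelerometer listed above could drive the speed and distance feedback. One hedged way to sketch this is a simple stroke counter with an assumed fixed distance per stroke; the 1.5 g threshold and 1.8 m stroke length are illustrative placeholders, not measured values:

```python
def count_strokes(accel_z, threshold=1.5):
    """Count stroke peaks in a vertical-acceleration trace (g units).

    A stroke is registered on each upward crossing of `threshold`,
    so one sustained peak counts only once.
    """
    strokes = 0
    above = False
    for a in accel_z:
        if a > threshold and not above:
            strokes += 1
            above = True
        elif a <= threshold:
            above = False
    return strokes

def estimate_distance_m(strokes, stroke_length_m=1.8):
    """Rough distance covered, assuming a fixed distance per stroke."""
    return strokes * stroke_length_m
```

In the real firmware this would run on windowed IMU samples, and the per-stroke distance could be calibrated per swimmer from the companion app.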
Do you plan to present a novel creation that you develop specifically for this contest, or do you view this opportunity as a chance to enhance a project you have previously worked on?

SwimSense is a novel creation designed specifically for the "Build2gether Inclusive Innovation Challenge". It aims to improve the swimming experience and safety of visually impaired people.