This project leverages artificial intelligence to bridge accessibility and technology. The system interprets American Sign Language (ASL) commands captured by a computer camera through a trained AI model and uses them to interact with sensors and actuators connected to an Arduino board. This allows users to perform tasks such as retrieving environmental data and controlling connected devices through simple ASL gestures.
1. Why is monitoring temperature and light vital for our well-being and efficiency?

The quality of our indoor environment has a profound impact on how we feel and how well we function. With so many of us spending most of our time inside, whether working, studying, or caring for loved ones, it's crucial to keep tabs on important factors like temperature, air quality, and lighting. This project focuses on making environmental monitoring not only smart but also accessible, using technology in a way that can benefit everyone, including those who communicate through American Sign Language (ASL).
Temperature: we all know how hard it is to concentrate when we’re too hot or too cold. Studies show that keeping the indoor temperature in a balanced range, around 20-23°C (68-73°F), can keep our minds sharp and focused. Too warm, and we might feel sluggish or sleepy; too cold, and our bodies struggle to stay comfortable, affecting our productivity and mood. Beyond focus, temperature extremes can also pose long-term health risks, especially for those with respiratory issues or sensitivities.
Luminosity: light isn’t just about seeing clearly; it deeply influences our body’s natural rhythms and our overall well-being. Bright, blue-toned light is energizing and great for mornings, helping us feel awake and focused by boosting serotonin and reducing sleepiness. But that same blue light can be harmful at night, disrupting our sleep cycle by tricking our bodies into staying alert when we should be winding down.
2. Key features of the project
First, I worked on ASL gesture recognition. The AI-based model was trained to recognize specific ASL commands, including:
- "Temperature": Retrieves and displays the current temperature level from a connected sensor.
- "Light": Measures and provides the current luminosity level.
- "Light On": Turns on the connected LED lights.
- "Light Off": Turns off the LED lights.
I used Google Teachable Machine to create and train the AI model.
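Once the model is trained, the recognition step essentially boils down to mapping each predicted class label to a command for the Arduino. Here is a minimal sketch of that mapping; the four labels match the gestures listed above, but the one-byte command codes and the 0.8 confidence threshold are my own illustrative assumptions, not taken from the original project:

```python
# Map a Teachable Machine class label to a one-byte serial command.
# Command bytes and the confidence threshold are assumptions for
# illustration; only the four labels come from the project itself.
COMMANDS = {
    "Temperature": b"T",   # ask the Arduino for the temperature reading
    "Light":       b"L",   # ask the Arduino for the luminosity reading
    "Light On":    b"1",   # switch the LED on
    "Light Off":   b"0",   # switch the LED off
}

def label_to_command(label: str, confidence: float, threshold: float = 0.8):
    """Return the serial command for a prediction, or None when the
    model is not confident enough or the gesture is unknown."""
    if confidence < threshold:
        return None
    return COMMANDS.get(label)
```

In the full pipeline, the returned byte would then be written to the Arduino's serial port (for example with a library such as pyserial); that part is omitted here.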
Next, I connected the sensors to the Arduino and wrote code linking the hardware to the AI model, so that the system communicates with the Arduino board, collecting sensor data in real time and activating actuators immediately. For this, I used an Arduino board, temperature and light sensors, and LEDs. To write the code, I used the Adacraft Vittascience coding interface. Here is my code:
I created a video explaining exactly how to do these steps, with a test of the project at the end. Here is a link to the video:
This project highlights the power of AI in accessibility and IoT, offering a practical solution for ASL users to interact with technology. Future iterations could include a broader range of ASL commands, additional sensor integrations, and enhanced AI accuracy for even more seamless interaction.