People with mobility impairments face numerous challenges in their daily lives, ranging from basic tasks to managing their living spaces. Simple actions, such as turning on or off a light switch, can become significant obstacles, often requiring assistance from caregivers or family members. This dependence on others for routine tasks diminishes the autonomy and quality of life for individuals with mobility impairments.
They face the following challenges while travelling:
- Temporary lodgings, such as hotels or vacation rentals, may not be equipped with smart home infrastructure.
- Remote controls for electronic devices in accommodations often feature complex interfaces that may be challenging for individuals with mobility impairments to navigate.
- Individuals with mobility impairments often experience difficulty in reaching and manipulating physical switches or outlets.
- Basic tasks, like turning on lights, become reliant on external assistance.
- Navigating through spaces to reach light switches poses safety risks, especially in low-light conditions.
While travel inherently poses challenges for individuals with mobility impairments, the integration of smart plug technology can alleviate some specific issues related to managing their living environment while away from home. The ability to control electronic devices remotely becomes crucial for ensuring comfort, safety, and independence during travel.
Solution
The proposed Voice-Activated Portable Smart Plug, vPlug, is an innovative solution for individuals with mobility impairments. This compact and lightweight device allows seamless control of electronic devices through intuitive voice commands, eliminating the need for physical interaction with switches. Its portability ensures easy travel, transforming any traditional AC outlet into a smart, remotely accessible one. With universal compatibility and straightforward installation, users can enjoy enhanced independence and convenience in managing their living environments.
How does it work
A smart plug resembles a regular electrical plug in its physical design, as shown in the following image.
The vPlug has a tiny computer inside that uses machine learning to understand and follow the user's voice commands. This lets it control how electricity flows to the electrical devices attached to the vPlug. The vPlug can also connect to the internet over Wi-Fi, allowing users to control it from far away.
To use it, just plug the vPlug into a regular electrical outlet, and then plug any electrical device, such as a lamp or a fan, into the smart plug. The smart plug listens for the user's voice commands and does what the user says.
Behind the scenes, the vPlug runs a TinyML model to recognize voice commands for home automation. This model is lightweight, meaning it doesn't need a lot of resources, which makes it a good fit for the vPlug. The goal is to make home automation easier for people with mobility impairments by letting them control their devices with just their voice, making everything simpler and more convenient, especially while travelling. The following image shows the working of vPlug.
How it's Made
This section describes how the TinyML model for vPlug is built and how firmware is developed to run the TinyML model and control the devices.
This project is built using Edge Impulse. The public repository can be found here. I have collected 10 seconds of data for each class.
I used three classes for labelling data:
- light_on - to Turn On Light
- light_off - to Turn Off Light
- nocmd - when no command should be sent
After the data is collected, you need to split it. You will then get 1-second samples for each voice command.
You can find a step-by-step tutorial on how to collect and split data here. Here's what the data for each class/label looks like:
Remember that the class nocmd is important, as the neural network will predict this class most of the time, whenever no command is being spoken. This class should contain audio with no command spoken, including samples with little to no background noise.
Now create an impulse as follows
and generate features
then train your model. I achieved 100% training accuracy
and 92% testing accuracy.
Finally, export the model as an Arduino library and build the device using the attached Arduino code (firmware), following the attached circuit diagram.
The following two variables need to be declared at the beginning:
int RelayPin = 6;
int on_off_flag = 2; //no command (nocmd)
And the following code in the loop() function controls the light bulb:
// Default to nocmd; a confident match below overrides it. (Resetting the
// flag in an else branch would clobber a match made on an earlier iteration.)
on_off_flag = 2;
for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    ei_printf("    %s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    if ((strcmp(result.classification[ix].label, "light_on") == 0) && result.classification[ix].value > 0.6) {
        on_off_flag = 1; // light_on
    } else if ((strcmp(result.classification[ix].label, "light_off") == 0) && result.classification[ix].value > 0.6) {
        on_off_flag = 0; // light_off
    }
}
if (on_off_flag == 1) {
    digitalWrite(RelayPin, HIGH);
    Serial.println("***** ON *****");
} else if (on_off_flag == 0) {
    digitalWrite(RelayPin, LOW);
    Serial.println("***** OFF *****");
} else {
    Serial.println("***** No Command Detected *****");
}
Following are the results when I run my project. The video below shows how a light bulb can be controlled just by issuing a voice command.