People with upper extremity disabilities face a number of challenges when using a computer due to the loss of mobility and sensation in their limbs. This loss of movement makes it difficult to use a standard keyboard and mouse, so it is common to resort to alternative input devices such as voice controllers or specially adapted keyboards and mice.
Many assistive devices for this type of disability are expensive and, in many cases, poorly adapted to users' needs. On the other hand, input devices such as voice controllers raise privacy concerns, since they are susceptible to unscrupulous individuals eavesdropping on private information such as passwords or bank details. This lack of inclusive devices and software leaves people with physical disabilities unable to interact with computers and deprives them of their independence.
Build2Gether 2.0 Challenge
This project was developed to participate in the Build2Gether 2.0 competition, which encourages the creation of innovative technological solutions aimed at improving the quality of life of people with disabilities.
This year's challenge is organized into two main areas, each divided into two specific tracks to guide the innovations.
VISUAL IMPAIRMENTS 👨‍🦯👩‍🦯
- Track 1: Adaptation for OUTDOOR Activities for People with Visual Impairments
- Track 2: Adaptation for INDOOR Activities for People with Visual Impairments
MOBILITY IMPAIRMENTS 👨‍🦼👩‍🦼
- Track 1: Accessible HOME & TOOLS for People with Mobility Impairments
- Track 2: Accessible SPORTS & HOBBIES for People with Mobility Impairments
To address Track 2 of the Mobility Impairments category, which focuses on accessible sports and hobbies, the following technological solution was presented:
I built an assistive device that reads and classifies finger gestures through a muscle sensor and converts them into keyboard events via Bluetooth. When the device detects a gesture, it sends the programmed key to the computer; this key can be any letter from A to Z, any digit from 0 to 9, or one of several special characters. The system not only allows you to type, it also gives you access to games, since many of them use the W, A, S, D keys for movement; the device likewise provides a mouse function.
In this project, six types of gestures inspired by the Morse alphabet were selected. The gestures are performed by opening and closing the hand, where:
Dit: Represented by a quick closing of the hand.
Dah: Represented by a closing of the hand held for 1 to 2 seconds.
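To make the timing distinction concrete, here is a minimal, purely illustrative Python sketch. In the actual device the gestures are classified by the machine learning model described below, not by a fixed threshold; the 1-second cutoff here simply follows the dit/dah definitions above:

def classify_closure(duration_s: float) -> str:
    """Classify a hand closure as a dit or a dah by its duration (illustrative)."""
    return "dit" if duration_s < 1.0 else "dah"

print(classify_closure(0.3))  # dit: a quick closing of the hand
print(classify_closure(1.5))  # dah: a closing held for 1 to 2 seconds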
The system works as follows:
1. The user performs a gesture.
2. The muscle sensor detects, amplifies, and filters the electrical signals produced by the muscle contractions.
3. The Arduino board collects the data through its analog-to-digital converter (ADC) and employs a machine learning model to analyze and classify the gestures.
4. The machine learning model was developed on the Edge Impulse platform, which facilitated creating and labeling the dataset, assigning processing blocks for feature extraction, training the model, validating and testing it, and finally retraining and deploying it as an Arduino library.
5. The classifications (inferences) are sent over Bluetooth.
6. A Python application links to the device, captures the data received over Bluetooth, executes the corresponding keyboard/mouse events, and lets the user configure the character assigned to each gesture (see the sketch below).
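As a rough illustration of step 6, the following Python sketch shows how a host application could receive gesture labels over Bluetooth LE with bleak and emit key presses with pyautogui. The device address, characteristic UUID, label names, and key mapping are all hypothetical placeholders, not the project's actual values:

import asyncio
import pyautogui
from bleak import BleakClient

DEVICE_ADDRESS = "XX:XX:XX:XX:XX:XX"  # placeholder: your device's BLE address
GESTURE_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder UUID

# Hypothetical mapping from gesture labels (as sent by the Arduino) to keys.
GESTURE_TO_KEY = {"gesture_1": "a", "gesture_2": "w", "gesture_3": "s"}

def on_notify(_sender, data: bytearray):
    # Decode the received gesture label and press the configured key.
    label = data.decode("utf-8").strip()
    key = GESTURE_TO_KEY.get(label)
    if key:
        pyautogui.press(key)

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(GESTURE_CHAR_UUID, on_notify)
        await asyncio.sleep(60)  # listen for gestures for one minute

asyncio.run(main())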
Hardware:
Arduino Nano RP2040 Connect:
The Arduino Nano RP2040 Connect is built around the Raspberry Pi RP2040 microcontroller, whose dual-core 32-bit Arm® Cortex®-M0+ makes it well suited to Internet of Things projects, with Bluetooth® and WiFi connectivity provided by the U-blox® NINA-W102 module. In addition, it has an LSM6DSOXTR accelerometer-gyroscope, an RGB LED, and an integrated MP34DT06JTR microphone.
MyoWare Sensor:
The MyoWare is a wearable surface EMG sensor that measures, amplifies, and filters the electrical activity produced by muscle contractions, producing an analog signal that the Arduino reads through its ADC.
To build this system, we must follow these steps:
Step One: Circuit diagram
Quickstart
1. Visit Edge Impulse and sign up for an account if you don't already have one. Log in, click “Create new project”, and name your project.
2. Visit the Edge Impulse Firmware section for RP2040-based devices and download the latest firmware version compatible with your board.
3. Connect your RP2040 board to the computer in mass storage mode (hold down the BOOTSEL button while plugging the board into the USB port), then copy the downloaded .uf2 file and paste it into the storage drive that appears.
4. If you have not already done so, install the Edge Impulse CLI by following the instructions, then run the following command in your terminal to start the Edge Impulse daemon and connect your device:
edge-impulse-daemon
5. Follow the on-screen instructions to connect your device to your Edge Impulse project.
Placement of EMG electrodes
For detailed guidance on how to position the device for optimal performance, please refer to the User Manual. There, you'll find step-by-step instructions on identifying the correct muscles for sensor placement, ensuring accurate and reliable gesture recognition.
Data Acquisition:
Once you have uploaded and labeled all the data through the data acquisition window in Edge Impulse, the captured and labeled samples are listed in the left section (highlighted in red). The upper right section (highlighted in yellow) lets you capture and label data in real time, and the lower right section (highlighted in green) displays a time-domain graph of the captured sample, providing an immediate visualization of the acquired data.
Create Impulse:
For the Advanced Gesture Typing Solution project, the impulse was created with Edge Impulse's EON Tuner tool, which analyzed the project data and proposed several model architectures; the one with the highest accuracy (88%) was chosen. EON Tuner configured a 4000 ms data acquisition window with a 2000 ms increment and a 200 Hz sampling rate, suited to capturing the EMG signals accurately. It also selected and tuned the “Spectral Analysis” processing block to extract the most relevant spectral features from the EMG signals, and parameterized the “Keras Classification” learning block to train a neural network capable of classifying the gestures with high accuracy. Once these parameters were defined, the impulse was saved with the “Save Impulse” button, allowing training and validation of the model to continue within the project.
Through further experimentation, setting the training cycles to 50 and the learning rate to 0.005, the model's accuracy was raised to 93%.
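For illustration, here is a minimal sketch of what a comparable classifier could look like in Keras. Only the six gesture classes, the 50 training cycles, the 0.005 learning rate, and the 600-sample dataset size come from this project; the layer sizes and NUM_FEATURES are placeholder assumptions, since Edge Impulse generates the actual architecture:

import numpy as np
import tensorflow as tf

NUM_FEATURES = 32  # placeholder: number of spectral features per window
NUM_CLASSES = 6    # the six gestures used in this project

# Small dense network, comparable in spirit to the Keras Classification block.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.005),  # from the project
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Dummy data stands in for the extracted spectral features.
X = np.random.rand(600, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=600)
model.fit(X, y, epochs=50, validation_split=0.2)  # 50 training cycles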
Model test accuracy & Integration:
The machine learning model was evaluated on 90 samples held out from the 600 used for training and validation. This evaluation yielded an accuracy of 83.33% (75 of the 90 samples classified correctly), indicating that the model generalizes reasonably well to unseen data.
Model deployment:
To download the machine learning model, follow these steps:
1. In the left pane, select Deployment.
2. In the search section, select Arduino Library.
3. Click the “Build” button to start the download.
Clone Edge Impulse project:
If you want to download the created model, follow the steps below:
Access Edge Impulse, log in with your account, open the project “Advanced-gesture-typing-solution_Buil2gether_2.0”, and click “Clone project” to duplicate it in your account. Then open the cloned project, go to the “Deployment” tab, select “Arduino library”, and click “Build” to download the generated files.
Step Three: Import the model into the Arduino IDE
If you don't have the Arduino IDE installed, download it from Arduino and install it on your computer. Run the Arduino IDE, go to Sketch > Include Library > Add .ZIP Library, navigate to the location of the downloaded Edge Impulse files, and select the ZIP file to add it as a new library.
The system requires the model previously downloaded from Edge Impulse and the main program, Main.ino.
1. Download and unzip the project bundle from the following link on GitHub: Advanced-gesture-typing-solution-main.zip.
2. In the Arduino IDE, go to File > Open, navigate to the download folder, and select the main project file "Main.ino".
3. In the Arduino IDE, go to Sketch > Include Library and select "Project_name_Inferencing".
4. Download and install the Arduino Mbed OS Nano Boards package through the Arduino IDE Boards Manager. Then, in the Tools > Board menu, select the Arduino Nano RP2040 Connect board. Connect your Arduino to the computer with a USB cable and choose the correct port under Tools > Port. Finally, click the Upload button to compile and upload the program to the board.
If you don't have Python installed, download it from Python and install it on your computer. Open a terminal and execute the following command:
pip3 install pyautogui bleak PySide6
This application was developed in Python for compatibility with the main operating systems: Windows, Linux, and macOS. To learn how to configure and use the software, please review the User Manual.
Open a terminal, navigate to the directory containing the Main.py file, and execute the following command:
python3 Main.py
Using 3D modeling, a custom enclosure was designed to house the MyoWare sensor and the Arduino Nano RP2040 Connect.
Once the 3D model was designed, it was fabricated on a high-precision 3D printer. This produced a customized housing that fits the MyoWare sensor and the Arduino Nano RP2040 Connect snugly, ensuring a secure assembly. An elastic band was added to fasten the device to the user's arm.
To connect the device, first press the “Scan” button and select the correct device from the list; if you select the wrong device, the connection will not complete successfully. Once connected, you just need to execute the appropriate commands to start typing on the computer using muscle gestures.
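For reference, the “Scan” step is conceptually similar to this minimal bleak sketch, which discovers nearby Bluetooth LE devices so the right one can be chosen; the five-second timeout is an arbitrary choice for the example:

import asyncio
from bleak import BleakScanner

async def scan():
    # Discover nearby BLE devices and print their addresses and names.
    devices = await BleakScanner.discover(timeout=5.0)
    for d in devices:
        print(f"{d.address}  {d.name or '(unknown)'}")

asyncio.run(scan())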