Hello, and thanks for checking out OmniArm! Below is a comprehensive video demonstration that covers everything in a little under 5 minutes. For a more detailed overview of this project, feel free to keep reading!
The Problem with Prosthetics
In 2017, an estimated 57.7 million people worldwide were living with an amputated limb.
For many of these individuals, technologies like prosthetic arms can be life-changing. However, extremely high price barriers often prevent amputees from being able to utilize them.
While there are some prosthetic arms on the market that are somewhat affordable, these are usually very limited in functionality, often restricted to grabbing objects with purely mechanical systems. The arms that do offer an array of advanced functionalities (such as myoelectric control, five-finger control, and sensory feedback) can easily cost between $50,000 and $100,000, making them inaccessible to most.
Another problem besides high prices plagues myoelectric prosthetics: rehabilitation. Learning to properly use one of these arms can easily take 6 to 12 months of continuous rehabilitation. Oftentimes, insurance doesn't cover rehabilitation, making it a very expensive and time-consuming process. This also means that users cannot effectively utilize their arm until many months into rehabilitation, which is very inconvenient for them.
The Solution: OmniArm
After learning about the problems with prosthetic arms, I was inspired to develop an innovative solution that addresses these shortcomings. I call it OmniArm.
OmniArm is not only extremely affordable and accessible, but also comes embedded with a vast array of advanced functionalities that address the most pressing issues with prosthetic arms. I developed OmniArm after speaking with a few individuals with prosthetic arms whom I was fortunate enough to get in touch with, and I integrated their feedback into my solution.
So what makes OmniArm such a great solution?
Powerful Control Options
Flexible Myoelectric Control
OmniArm can be controlled through myoelectric (muscle) activity. However, unlike nearly all myoelectric arms on the market, which have an EMG sensor at the point of attachment to the body, OmniArm's EMG muscle sensor can be placed anywhere on the body. The sensor streams EMG data to the arm over Bluetooth Low Energy, which enables this flexibility. This means that the user can use virtually any muscle in their body to control the arm. Additionally, this enables the user to remotely control the arm without it being directly attached to their body (which is pretty awesome).
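To give a feel for the arm-side logic, here is a minimal sketch of how a stream of EMG readings might be turned into a discrete "flex" trigger. The threshold values, moving-average window, and class name are illustrative assumptions, not OmniArm's actual calibration.

```python
# Sketch: detect a deliberate muscle flex from a stream of EMG samples.
# FLEX_THRESHOLD / RELEASE_THRESHOLD are hypothetical ADC levels; the gap
# between them provides hysteresis so a single flex doesn't fire twice.

from collections import deque

FLEX_THRESHOLD = 600     # assumed level indicating a deliberate flex
RELEASE_THRESHOLD = 400  # lower release level prevents trigger chatter

class FlexDetector:
    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)  # short smoothing window
        self.flexed = False

    def update(self, reading: int) -> bool:
        """Feed one EMG sample; return True only on a new flex event."""
        self.samples.append(reading)
        avg = sum(self.samples) / len(self.samples)
        if not self.flexed and avg >= FLEX_THRESHOLD:
            self.flexed = True
            return True          # rising edge: trigger the selected movement
        if self.flexed and avg <= RELEASE_THRESHOLD:
            self.flexed = False  # muscle relaxed; ready for the next flex
        return False
```

In this scheme, each BLE notification from the sensor would be fed through `update()`, and a `True` result would fire whichever pre-built movement is currently selected.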
AI-Assisted Control
As discussed earlier, one of the biggest problems with myoelectric prosthetic arms is rehabilitation. Users need to learn how to make unnatural muscle movements to control their arm to any reasonable degree, which is very time-consuming, expensive, and inefficient.
OmniArm features a built-in voice assistant to solve this problem. The user can easily switch between pre-built arm movements via voice command, then trigger the selected movement through a very natural myoelectric control system (simply by flexing the chosen muscle). This enables the user to perform hundreds of different hand movements right off the bat, without even a single minute of rehabilitation.
Furthermore, there are times when it's inconvenient for the user to rely on myoelectric control. For example, if the user were holding a bag for an extended period, it would be very difficult and tiring to continuously flex a muscle to maintain that hand position. With OmniArm's voice assistant, however, the user can command the arm to hold a given position with a simple voice command, with no myoelectric input needed.
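The command layer described above can be sketched as a small lookup: the recognized phrase either selects a pre-built grip (to be fired by the next flex) or puts the arm in a hold mode. The grip names, per-finger curl values, and phrases below are made up for illustration; they are not OmniArm's actual command set.

```python
# Hypothetical table of pre-built movements -> per-finger curl
# (0.0 = fully open, 1.0 = fully closed). Names are examples only.
GRIPS = {
    "fist":  {"thumb": 1.0, "index": 1.0, "middle": 1.0, "ring": 1.0, "pinky": 1.0},
    "point": {"thumb": 1.0, "index": 0.0, "middle": 1.0, "ring": 1.0, "pinky": 1.0},
    "open":  {"thumb": 0.0, "index": 0.0, "middle": 0.0, "ring": 0.0, "pinky": 0.0},
}

def parse_command(phrase: str):
    """Map a recognized phrase to an (action, grip_name) pair."""
    words = phrase.lower().split()
    if "hold" in words:
        return ("hold", None)        # lock current position, ignore EMG input
    for name in GRIPS:
        if name in words:
            return ("select", name)  # arm this grip for the next muscle flex
    return ("unknown", None)
```

A "select" result would arm the grip for the next flex event, while "hold" would freeze the servos regardless of EMG activity, matching the bag-holding scenario above.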
Responsive Integrated Somatosensory Feedback
Having a sense of touch in your prosthetic arm is a feature available on only the most premium, expensive options on the market. However, I've managed to integrate these somatosensory functions into OmniArm with the power of tinyML and haptic feedback.
A few months ago, I developed a device (called SensiGlove) worn over a prosthetic hand like a glove that instantly gives that arm a sense of touch. SensiGlove uses machine learning and capacitive touch sensors to deliver a sense of touch through haptic feedback. It is fully compatible with OmniArm and enables the user to distinguish what type of material they are touching and where on the palm the contact is occurring. Please feel free to check out this link for more details!
Affordability
OmniArm has a large array of advanced features. However, how much did it cost to implement all of this?
Altogether, I built OmniArm plus SensiGlove for just under $130. This price would be significantly reduced if I developed my own PCBs. That makes it nearly a thousand times less expensive than the $100,000 myoelectric prosthetic arms I mentioned earlier.
Build Process and Hardware Overview
OmniArm Body
The design for OmniArm was based on two different open-source robotic and prosthetic arm projects. I decided to take the best features from each and combine them with my own design elements to create a practical, functional, and aesthetic arm.
The servo bed and forearm components were derived from the InMoov Project. This was integrated because the servo bed allowed for placement of 5 servo motors, each of which controlled an individual finger.
For the fingers and palm, however, I derived inspiration from another source. The InMoov hand was not designed for use as a prosthetic arm. Instead, I integrated the design of the fully-mechanical FlexyArm, which was designed as a functional, low-cost, mechanical prosthetic.
The pulley system is where I made significant, individualized changes. I found that the InMoov pulley design lacked the necessary torque to pull the fingers back. I created a distinct pulley system for the fingers that was strong and nimble.
All of the arm body components were printed with PLA filament on a Creality 3D printer.
Bluetooth-Enabled EMG Sensor
The EMG muscle sensor was built from three components. A MyoWare muscle sensor connected to 3M electrodes sensed electrical activity in the targeted muscle. This was connected to an Adafruit Bluefruit Feather microcontroller, which read the sensor data and streamed it via Bluetooth Low Energy to the Raspberry Pi embedded in OmniArm. A small 3.7 V LiPo battery powered this circuit.
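The wire format between the Feather and the Pi isn't specified above, so the following sketches one plausible scheme: each ADC sample packed as a little-endian unsigned 16-bit integer, with notifications allowed to batch several samples. Both halves are shown in Python for illustration; on the Feather itself the encoding side would live in the Arduino C firmware.

```python
# Sketch of an assumed BLE payload format for the EMG stream:
# little-endian uint16 per sample, possibly several samples per notification.

import struct

def encode_sample(reading: int) -> bytes:
    """Pack one 10-bit ADC reading (0-1023) into 2 bytes for transmission."""
    return struct.pack("<H", reading)

def decode_stream(payload: bytes) -> list:
    """Unpack a BLE notification that may batch several samples."""
    count = len(payload) // 2
    return list(struct.unpack("<" + "H" * count, payload))
```

On the Pi side, each decoded sample would then be handed to the flex-detection logic that drives the fingers.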
OmniArm Control Circuitry
A Raspberry Pi 3B+ serves as the brains of OmniArm. It is connected via I2C to an Adafruit PWM HAT for precise control of the 5 servo motors embedded in the arm (each of which controls an individual finger). The Raspberry Pi and the PWM HAT are both powered by rechargeable nickel-metal hydride batteries.
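As a small illustration of the servo side, here is how a per-finger curl value might be converted into a servo angle. Because the pulley routing can reverse a servo's direction for some fingers, the sketch includes an inversion flag; the 0-180° range and which fingers are inverted are assumptions for illustration, not measurements from the actual arm.

```python
# Sketch: map a curl command in [0.0, 1.0] (open -> closed) onto a servo angle.
# Whether a given finger's servo is "inverted" depends on pulley routing.

def finger_angle(curl: float, inverted: bool = False) -> float:
    """Return a 0-180 degree angle for the given curl, clamping bad input."""
    curl = max(0.0, min(1.0, curl))   # clamp out-of-range commands
    angle = curl * 180.0
    return 180.0 - angle if inverted else angle

# On the Pi, the result would be applied through Adafruit's ServoKit library,
# e.g. kit.servo[channel].angle = finger_angle(curl) for each finger's channel.
```

Keeping this mapping as a pure function makes it easy to test off-hardware before wiring it to the PWM HAT.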
All of OmniArm's code is written in Python, while the Bluetooth EMG Sensor code is written in Arduino C.
The text-to-speech and speech-to-text features in the integrated voice assistant are powered by Google libraries (gTTS, Google Speech). Bluetooth communication between the sensor and the arm is handled by Adafruit's Bluefruit BLE libraries, and Adafruit's ServoKit library allows for integrated control of the servo motors.
Conclusion
Thank you so much for checking out this project. I had a lot of fun developing it for the China-US Young Maker's Contest!
If you have any questions, please feel free to message me!