I made an Alexa-controlled robot arm. You can get cheap robotic arm kits for $30-$40 these days. I chose this one from OWI. The kit build is time-consuming (about 5 hours) but pretty straightforward.
The arm is fairly fast but not very strong. I was hoping it could pick up a beer or a cellphone, but it can only handle very light objects such as pens (or ribbons for cats). However, the concept works, and it would be easy to upgrade the hardware now that I have the code ready.
Hardware

The controller is just 10 switches for the 5 motors in 2 directions, plus 1 switch for the LED light, so it was easy to unplug them and control each one with a relay instead. The motors use a 3V signal, but a 12V relay will work (you just need a 12V power supply for the relay coils themselves). I used a 16-channel relay board but only hooked up 11 of the channels.
There are 8 pins on the arm's connector, and I tested them with a multimeter to see which ones did what. The manual labels the first pin +3V, the last pin -3V, and the middle ones 1-6; I just numbered them all 1-8. The pairs that control the motors and the light are: 2-1, 2-8, 3-1, 3-8, 4-1, 4-8, 5-1, 5-8, 6-1, 6-8, and I believe the LED light is 1-8. You can test each pair by shorting it with a jumper, or plug it into a channel on the relay board for control from the Arduino.
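To keep the wiring straight, it helps to write those pairs down as data. Here's a sketch of that table in node; the relay channel numbers are placeholders for whatever order you happen to plug the pairs into the 16-channel board:

```javascript
// Pin pairs on the arm's 8-pin connector (my 1-8 numbering).
// Shorting a pair runs one motor in one direction; the 1-8 pair
// appears to be the LED. Channel assignments here are arbitrary --
// they just reflect the order the pairs were wired to the relays.
const connections = [
  { pins: [2, 1], channel: 1 },
  { pins: [2, 8], channel: 2 },
  { pins: [3, 1], channel: 3 },
  { pins: [3, 8], channel: 4 },
  { pins: [4, 1], channel: 5 },
  { pins: [4, 8], channel: 6 },
  { pins: [5, 1], channel: 7 },
  { pins: [5, 8], channel: 8 },
  { pins: [6, 1], channel: 9 },
  { pins: [6, 8], channel: 10 },
  { pins: [1, 8], channel: 11 }, // LED (I believe)
];

// Look up which relay channel shorts a given pin pair, in either order.
function channelFor(a, b) {
  const hit = connections.find(
    (c) =>
      (c.pins[0] === a && c.pins[1] === b) ||
      (c.pins[0] === b && c.pins[1] === a)
  );
  return hit ? hit.channel : null;
}
```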
I set up a Lambda skill by following the Alexa Controlled LEDs Through Arduino Yún Tutorial. Here is my Lambda function. (Replace {mydomain} with your publicly accessible or dynamic DNS domain, {myport} with whatever port you run the node script below on, and the token with any URL-friendly string.)
I'll admit that I never got the Yún talking to my IoT shadow correctly, so I wrote a simple node script that listens for requests on a dedicated port, checks for a special token, and forwards those requests to the Yún. (Replace {myport} with any port number and {mytoken} with the same token from above.) Later I'd like to beef up the security with LWA, POST requests, and Amazon's secure certificates.
Lastly, here is my Arduino code for receiving commands and executing them, based on the Yún Bridge example.
I made several directives -- "Open", "Close", "Forward", "Back", "Tilt Forward", "Tilt Back", "Left", "Right", "Up", "Down", and "Play with Cats". There are also stubs for "Hold This", "Let Go", "Chop", and "Throw this away" for a future date.
The arm has no sensors, so there's no dynamic feedback, and "Stop" is helpful if a motion pushes it too far. Initially I had on and off commands for everything, like "Turn on Open" and "Turn off Open", but the delay was so bad that it worked better to run only short routines that stop on their own.
I'd definitely like to work on the delay: it can take 5 or 6 seconds to execute a command through the proxy, and I'd like to narrow that down. Plugging the Yún into Ethernet and using a static IP would certainly help.
With less delay, more reach, and stronger motors, I could picture this project being helpful for people with disabilities or with the use of only one hand. For now, though, cat entertainment it is (and don't worry, I don't let my cat eat ribbon unsupervised).