I've been building this project as a way to learn computer vision. And I suppose because I have a death wish against my feet.
I've been documenting the build progress on my YouTube channel: https://www.youtube.com/channel/UCcT3iGeJJfpGU1QehvG48tg/videos
If you want to see the final testing, watch this video; below I'll go into more detail about the build process.
I started with CAD, like most of my projects, and ended up with an assembly of the design in SolidWorks:
I then printed it and assembled it. (What I don't show in the video are the several failed prints and design mistakes where the fit between the servos and the body was slightly off.)
From the photo above you can see three servos (black), the body (grey) and a rack gear (beige). The larger servo aims the robot by rotating the body around its stand, which is fixed to the ground.
The two smaller servos draw back the rack gear by turning a spur gear (not shown above). The groove in the back of the rack is for a rubber band, which pulls the rack so that it snaps back in and shoots out a lego piece. The rack is released when the spur gear reaches a section with no teeth.
The lego pieces are stored in a 'magazine' (the tall part of the body), and fall into the chamber. The fully assembled robot can be seen below, as well as the spur gear as it pulls back the rack.
Having assembled the hardware, I had to work out how to make it track my feet. I've already made a face-tracking nerf gun (that shot me in the face). Surprisingly though, tracking feet with Haar cascades, the same technique I used for the face tracking, turned out to be far harder. For one, the only cascade I could find was for legs, and it was far more temperamental, I suppose because legs have fewer distinct features.
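For reference, this is roughly what the Haar-cascade approach looks like in OpenCV. It's a sketch rather than my actual source; the cascade file here is OpenCV's stock frontal-face model (what I used for the nerf gun), and a leg cascade would be loaded the same way.

```python
import cv2

# Load a bundled Haar cascade (frontal face shown here as an example;
# a leg/lower-body cascade XML would be loaded the same way).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns one (x, y, w, h) box per detection
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                                 minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```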
In the end I wore red socks and used colour detection in OpenCV to track them (I know it's kind of cheating, but at least I didn't have to show my bare feet on the internet). If you're interested in seeing the source code and circuit schematics, let me know in the comments.
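Since I haven't posted the source yet, here's a rough sketch of the kind of colour detection I mean: threshold the frame in HSV for red, take the largest blob, and treat its bounding-box centre as the feet. The threshold values and camera index are placeholders, not my actual numbers.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges (values are placeholders)
    mask = cv2.bitwise_or(
        cv2.inRange(hsv, np.array([0, 120, 70]),   np.array([10, 255, 255])),
        cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255])))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)      # biggest red blob = the socks
        x, y, w, h = cv2.boundingRect(c)
        cx, cy = x + w // 2, y + h // 2             # centre of the feet (the green square)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("feet", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```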
Either way, you can see the foot tracking in action below. The Arduino code was quite simple after that: the Python script sent it the coordinates of the centre of the feet (the green square), and the servo moved until that centre sat between the white bands marking the centre of the image. Once it was within them, the robot would fire.
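The Python side of that link is just streaming the centre coordinates over serial, something along these lines. The port, baud rate, message format, frame width and dead-band width are all assumptions for the sketch, not my actual code.

```python
import serial

FRAME_W = 640     # assumed camera frame width
DEADBAND = 40     # assumed half-width of the "white bands" around the image centre

# Port and baud rate are placeholders
arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

def update_aim(cx, cy):
    """Send the tracked centre to the Arduino as a simple 'x,y' line.

    The Arduino firmware (not shown) parses the line, nudges the aiming servo
    until cx falls inside the dead-band around the frame centre, then fires.
    """
    arduino.write(f"{cx},{cy}\n".encode())
    # The same dead-band test the firmware would make:
    return abs(cx - FRAME_W // 2) < DEADBAND   # True => centred, OK to fire
```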
So it works. Very badly. But I think maybe it's not the best idea to spend my time trying to improve such a stupid robot.
Thanks for reading! Again, check out the full video if you want to see me stand on many, many lego bricks, and go support the channel!
https://www.youtube.com/watch?v=I6gpKFjL6_8&t=4s&ab_channel=AdamBeedle