A privacy-focused event companion for Embedded Vision Summit 2022! This little bot will take your photo, but only when the QR code for this project is in the frame. :)
This bot runs on the Arduino Nicla Vision, with an Adafruit Gemma M0 running the ear motors. The whole thing started when I realized that my Companion Core PCB fit perfectly inside a crappy phone-holder I had, and would work well with a little tripod – AND I could stick it in my shirt pocket! THEN I realized that both pieces fit well with the head assembly I'd been designing for another robot, and this prototype just kinda came together.
It also happened to be the fourth anniversary of finishing my first robot familiar, Archimedes, on the eve of Bay Area Maker Faire 2018! So, this felt pretty magical. :)
Vision system

The Nicla is set up to read any barcodes in the frame, using the OpenMV IDE and compatible MicroPython code, following this code from Adrian Rosebrock: https://pyimagesearch.com/2018/03/19/reading-barcodes-with-python-and-openmv – You can find my mashed-up version in the Code section below.
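In outline, the trigger check is simple: decode any QR payloads in the frame and fire only when the project URL is among them. Here's a plain-Python sketch of just that decision step (on the Nicla, OpenMV's `img.find_qrcodes()` supplies the decoded payload strings; the function name and exact matching rule here are my own illustration, not the project's actual code):

```python
# Plain-Python sketch of the camera-trigger decision. On the Nicla Vision,
# OpenMV's img.find_qrcodes() returns code objects whose .payload() gives
# the decoded string; this works with those decoded strings directly.

TARGET_URL = "https://bit.ly/furrybot"  # the QR payload that triggers a photo

def should_take_photo(payloads):
    """Return True if any decoded QR payload matches the project URL."""
    return any(p.strip() == TARGET_URL for p in payloads)
```

In the real loop this would run once per frame, lighting the LED and saving a photo only on a match.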
Once it finds a barcode, if it matches this URL (https://bit.ly/furrybot), it illuminates the red LED (both the onboard LED and one I've attached to GPIO), takes a photo, and saves it with an incrementing counter in the filename. On this prototype, it currently resets the counter each time it wakes up, so I'll need to rewrite that to prevent new photos from overwriting the older ones. As it stands, the photos become readable after the board resets, which is handy because I can pull them onto my iPad by using the board as a USB flash-storage device. I used this to back up the photos during Embedded Vision Summit.
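One common fix for that overwrite bug is to scan the filesystem on boot and resume the counter from the highest existing index instead of restarting at zero. Here's a hedged sketch of that approach (the filename pattern and directory layout are my assumptions, not the project's; `os.listdir` behaves the same way under MicroPython's `uos`):

```python
import os
import re

def next_photo_index(directory, prefix="photo_", ext=".jpg"):
    """Scan existing photos and return the next unused index, so new
    shots never overwrite older ones after the board resets."""
    pattern = re.compile(re.escape(prefix) + r"(\d+)" + re.escape(ext) + "$")
    highest = -1
    for name in os.listdir(directory):
        m = pattern.match(name)
        if m:
            highest = max(highest, int(m.group(1)))
    return highest + 1
```

Calling this once at startup, then incrementing in RAM as usual, keeps the counter monotonic across wake-ups without needing any persistent state beyond the photos themselves.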
The Nicla is hooked up to a right-angle Micro USB cable, which I plug into a battery pack stashed in my bag.
The servos are mounted in a "sandwich" between two layers, which are currently 3D-printed but could be laser-cut in the future. They're mounted at 45° angles, which helps with making lifelike animations (as seen in my fennec fox robot, F3NR1R).
This bot started out as the first one to use my Companion Core! However, I couldn't get both of the ear-servos to work at the same time, which I now believe is due to something in the code or power system of the Gemma M0.
Still, while ruling out hardware issues, I decided to cobble together a little protoboard so I could make sure it wasn't a problem with the Core. This board connects the Gemma (powered by its own LiPo) to the servos. The whole thing is hot-glued to the back of the phone mount.
I wrote the code in MakeCode block script, which you can grab here: https://makecode.com/_KRJWw9Hk97tE
Although both servos have worked at different times, I couldn't get it to be reliable, so for this conference, I dialed it back to flicking just the right-side ear. Room for improvement!!
Microphone

This little 'bot is ALSO inspired by those "dead cat" wind-screening fuzzy sock things they put over microphones, and serves much the same purpose. It holds a cardioid lavalier mic, and can be held like a regular microphone when I want to do interviews. (Unfortunately, if I haven't secured the cables, those can cause noise in the recording... as I found while trying to edit some of my footage! Alas! In future, I'll wind the cables around my arm and try to minimize moving the mic.)
I designed this mounting block in OnShape (project link), and while I had some slicer/printer issues at first, I got a really solid one eventually. Here's a fit test on a failed, crumbly one, and you can see the final version mounted below the Nicla in the photo under "Vision System":
It slips between the bars of the phone holder, and straddles the outside of the servo sandwich.
Next steps

Next, I've gotta fix that photo file-naming issue! And figure out how to get both ears working... I'd also like to get them running on the Nicla Vision itself, but the documentation for that is scant.
I also need to figure out framing on the photos. Unfortunately, there's no image preview (unless the board is hooked up to a laptop), so it's really hard to get both A) the QR code and B) a person in the frame! There's a lot of dancing around and hoping, and the results are... interesting. A bit like a wildlife camera trap 😅
Here are some shots from the conference! Judge for yourself!
Short link to this page: https://bit.ly/furrybot