Voice technology is all around us, listening from our phones, our computers, and even our remote controls.
Unfortunately, these devices lack a critical element that I believe is essential for a home automation assistant: an actual presence.
Additionally, most personal assistant technologies rely on the cloud to function. Is it really a "personal" assistant if it requires constant connectivity and data transfer to an external system?
In the following guide, I will walk through the Olive Personal Assistant, built with the goal of providing both presence and privacy.
Designing for Expressiveness
They say that the eyes are the window to the soul, and I think for robots this is especially true. In a robot, the eyes are one of the few places where intent and mood can be truly expressed.
Like a human eye, Olive's iris can dilate or contract to show surprise or intensity, but unlike the human eye, a robot can also express mood in other ways!
Our first step is to find a suitable iris that can be precisely controlled. There are a few options for this:
- Source an iris mechanism from a camera or other piece of equipment (like surplus X-Ray machine parts!). Challenging, but can save time if you're lucky.
- Purchase an iris mechanism from an optics company.
- Print your own. Tricky but potentially rewarding. If you have a 3D Printer handy, there are some cool iris designs available for free. You might even be able to modify the designs to make things like servo control easier.
I was fortunate with the first two options. For the main Olive prototype, I was able to purchase an absolutely fascinating iris mechanism for $30 that was once part of an X-Ray machine (the collimator, specifically):
Here you can see it in motion:
X-Rays? Am I going to grow a third ear?
Well, maybe, but not because of this. A collimator doesn't generate X-Rays; it's just the part that shapes the X-Ray beam within the machine. The iris opens or closes to control the size of the beam. A part with such a cyberpunk history is just what our robot needs.
For the second "Mini" prototype, I already had this surplus Iris Diaphragm from Edmund Optics:
NOTE: The Olive Personal Assistant is designed to seamlessly integrate with any existing Snips setup, using the MQTT events integration.
With that being said, you'll want to get a basic Snips installation working on your Raspberry Pi(s). I recommend reading through the Quick Start guide first:
https://docs.snips.ai/getting-started/quick-start-raspberry-pi
If you want multiple units like I have, set up one full Raspberry Pi as a base station and Raspberry Pi Zero W's as your "Satellite" units. The satellites only run the microphone and speakers, while the rest of the Snips core components run on the base station:
https://docs.snips.ai/articles/platform/satellites
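Because everything in Snips flows over MQTT, any client on the network can watch for its events. Below is a minimal sketch of how Olive's ESP8266 could listen for recognized intents on the base station's broker. The WiFi credentials, broker address, and client name are placeholders for your own setup; the `hermes/intent/#` topic follows Snips' MQTT event conventions.

```cpp
// Hypothetical listener: watch for Snips intent events on the base station's
// MQTT broker from an ESP8266. Credentials and the broker IP are placeholders.
#include <ESP8266WiFi.h>
#include <PubSubClient.h>

const char* WIFI_SSID = "your-ssid";
const char* WIFI_PASS = "your-password";
const char* MQTT_HOST = "192.168.1.50";   // Raspberry Pi base station
const uint16_t MQTT_PORT = 1883;          // default Snips broker port

WiFiClient wifi;
PubSubClient mqtt(wifi);

// Called for every message on a subscribed topic.
void onMessage(char* topic, byte* payload, unsigned int length) {
  Serial.print("Snips event on ");
  Serial.println(topic);                  // e.g. hermes/intent/<user>:SomeIntent
  // The payload is JSON; parse it (e.g. with ArduinoJson) to read the slots.
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  mqtt.setServer(MQTT_HOST, MQTT_PORT);
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected()) {
    if (mqtt.connect("olive-eye")) {
      mqtt.subscribe("hermes/intent/#");  // every recognized intent
    } else {
      delay(1000);                        // wait a moment before retrying
      return;
    }
  }
  mqtt.loop();
}
```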
Now, Let's Build!
It's time to breathe some life into our first prototype!
We'll need to figure out:
- A controller system for the iris motors.
- Lighting effects to add more color to our expression options.
- Some way to integrate Olive with our Home Automation System.
I ran into several refactorings and blind alleys while building this project, so don't feel discouraged if you hit setbacks in yours as well; it happens to all of us.
For the microcontroller, I chose an ESP8266 board (the D-Duino) made by the talented Travis Lin at DongsenTech. The ESP chips have built-in WiFi and can be flashed with the Arduino IDE, so programming and network access are relatively straightforward (see the code attached near the end of this project).
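Before wiring everything together, it helps to bench-test the iris drive on its own. Here is a minimal sketch assuming a small hobby servo is linked to the iris lever; the pin number and the open/closed angles are guesses you would calibrate to your own mechanism, and a different motor and driver would need different code.

```cpp
// Hypothetical bench test for the iris drive, assuming a hobby servo moves the
// iris lever. Pin and angle limits are placeholders to calibrate by hand.
#include <Servo.h>

const int SERVO_PIN = 14;      // D5 on many ESP8266 dev boards
const int ANGLE_CLOSED = 20;   // iris fully contracted
const int ANGLE_OPEN = 160;    // iris fully dilated

Servo irisServo;

void setup() {
  irisServo.attach(SERVO_PIN);
}

void loop() {
  // Sweep slowly between "pupil" sizes to check for binding in the leaves.
  for (int angle = ANGLE_CLOSED; angle <= ANGLE_OPEN; angle++) {
    irisServo.write(angle);
    delay(15);
  }
  for (int angle = ANGLE_OPEN; angle >= ANGLE_CLOSED; angle--) {
    irisServo.write(angle);
    delay(15);
  }
}
```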
For the lighting, I found that an Adafruit NeoPixel 16-LED ring fits perfectly within the iris opening, so let's wire it up and mount it:
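Before the ring disappears inside the eye, a quick bench test confirms the wiring and data direction. Here is a minimal sketch, assuming the ring's data line is on GPIO2; adjust the pin to whichever free GPIO you actually used on the D-Duino.

```cpp
// Quick bench test for the 16-LED NeoPixel ring. The data pin is a placeholder.
#include <Adafruit_NeoPixel.h>

const int RING_PIN = 2;        // GPIO2 / D4 on many ESP8266 boards
const int RING_LEDS = 16;

Adafruit_NeoPixel ring(RING_LEDS, RING_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  ring.begin();
  ring.setBrightness(64);      // keep it easy on the eyes while testing
  ring.show();                 // start with all pixels off
}

void loop() {
  // Chase a single green pixel around the ring.
  for (int i = 0; i < RING_LEDS; i++) {
    ring.clear();
    ring.setPixelColor(i, ring.Color(0, 150, 40));
    ring.show();
    delay(60);
  }
}
```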
A hole is drilled in the side for a small cable to be inserted and attached to the Neopixels.
Iris Leaves reassembled:
A foam ring and magnifying glass lens are added to give that shiny eye effect:
And the retention ring is replaced carefully:
Iris is mounted back into chassis, and chassis mounted to the case:
Initial wiring for motor control:
Packaging of components and addition of light+diffuser behind the iris:
Here are the completed prototypes, and some of what they can do.
Let's Talk About Space
Thanks to the Snips SDK integration, Olive has the ability to interpret conversational speech. I have set up the agent to recognize when the speaker is talking about space:
Lighting Control
Olive is not just good for banter; she can also control various home systems, such as lighting.
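As one illustration of how a lighting command could be wired up, the sketch below listens for hypothetical TurnOnLights/TurnOffLights intents and toggles a relay pin. The intent names, relay pin, credentials, and broker address are all placeholders; in practice you might instead republish the command to your home automation system's own MQTT topics.

```cpp
// Hypothetical example of acting on a lighting intent. It reuses the same
// WiFi/MQTT boilerplate as the listener sketch above; the ":TurnOnLights"
// intent suffix, relay pin, and network details are placeholders.
#include <ESP8266WiFi.h>
#include <PubSubClient.h>
#include <string.h>

const int RELAY_PIN = 5;                  // drives a lamp relay (placeholder)

WiFiClient wifi;
PubSubClient mqtt(wifi);

void onIntent(char* topic, byte* payload, unsigned int length) {
  // Snips publishes each recognized intent on hermes/intent/<intentName>.
  if (strstr(topic, ":TurnOnLights") != NULL)  digitalWrite(RELAY_PIN, HIGH);
  if (strstr(topic, ":TurnOffLights") != NULL) digitalWrite(RELAY_PIN, LOW);
  // Room, color, and other slots live in the JSON payload (ArduinoJson helps).
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  WiFi.begin("your-ssid", "your-password");
  while (WiFi.status() != WL_CONNECTED) delay(250);
  mqtt.setServer("192.168.1.50", 1883);   // Snips base station
  mqtt.setCallback(onIntent);
}

void loop() {
  if (!mqtt.connected()) {
    if (mqtt.connect("olive-lights")) {
      mqtt.subscribe("hermes/intent/#");
    } else {
      delay(1000);
      return;
    }
  }
  mqtt.loop();
}
```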
Conclusion
Thanks for reviewing my project; it has been one of the most fun builds in recent memory. Now, not only do I have a cool voice assistant, I have a proper robotic sidekick with a real presence in my home.