Evie is meant to give a much-needed face to Alexa. As it is, Alexa is capable of many different things, but she has no way to express herself. Evie is a companion robot that solves this.
BACKGROUND
Evie's design is an upgrade of a robot I built in 2014 as part of a MINDSTORMS community challenge on their old web page. Here's a picture:
Back then, I'd called it 3MOJIBOT. It could change its expression based on color or touch input (the 'nose' is the color sensor, and the tire in the back is attached to the touch sensor) using its 'mouth' and 'eyebrows.' After looking at the design, I changed its concept from a human to a dog; it seemed more fitting for the robot.
PRESENT-DAY
Evie, in many ways, is an upgrade of 3MOJIBOT. The basic physical structure is similar, but the mechanism for the mouth is different (and more complex). Evie also doesn't use sensor feedback for her 'facial' reactions; she relies solely on Alexa's input.
The robot uses four large motors: one for the mouth, one for each eye, and one for the neck (more on that later). She also has a color sensor. Evie is a self-contained robot, mounted on an Echo Dot. The Echo Dot is modified so that the EV3 color sensor can detect ambient light from Alexa's light ring more easily: with the help of a soldering iron, I cut little rectangles in Alexa's plastic body around the light ring so the light passes through at a high enough intensity for the EV3 color sensor to detect.
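To make the code snippets below easier to follow, here is a minimal sketch of how that hardware might be declared with the ev3dev2 Python library (which the rest of the code appears to use). The port assignments are assumptions; only the attribute names come from Evie's actual code.

from ev3dev2.motor import LargeMotor, OUTPUT_A, OUTPUT_B, OUTPUT_C, OUTPUT_D
from ev3dev2.sensor import INPUT_1
from ev3dev2.sensor.lego import ColorSensor

class Evie:
    def __init__(self):
        # Hypothetical port assignments; the real build may differ.
        self.left_motor = LargeMotor(OUTPUT_A)    # left eye
        self.right_motor = LargeMotor(OUTPUT_B)   # right eye
        self.center_motor = LargeMotor(OUTPUT_C)  # mouth
        self.neck_motor = LargeMotor(OUTPUT_D)    # neck turntable
        self.colour = ColorSensor(INPUT_1)        # watches Alexa's light ring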
A LITTLE BIT MORE DETAIL
Evie is built around the EV3 brick. Evie's mouth has two straight pieces, each driven through a pair of Lego bevel gears on either side of a single large motor. This makes the mouth's movement more natural and expressive.
Evie's eyes are pretty straightforward; each can rotate so that its angled Lego piece can be oriented to any angle between 0 and 359 degrees. I was thinking of using the same mechanism as the mouth for the eyes, but realized that having the eyes as they are now was more expressive (for example, I could turn the eyes sideways).
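As an illustration, here is a hedged sketch of how an eye could be set to an absolute angle and the mouth opened and closed with ev3dev2. set_eye_angle and open_and_close_mouth are hypothetical helpers, and the speeds are assumptions; the 25-degree mouth travel matches the skill handler shown later.

from ev3dev2.motor import LargeMotor, OUTPUT_A, OUTPUT_C

def set_eye_angle(eye_motor, degrees):
    # Rotate an eye to an absolute position; on the EV3 large motor,
    # one tacho count equals one degree, so position maps directly.
    eye_motor.on_to_position(speed=25, position=degrees % 360)

def open_and_close_mouth(mouth_motor):
    # A single motor drives both bevel-geared mouth pieces at once.
    mouth_motor.on_for_degrees(speed=50, degrees=25)   # open
    mouth_motor.on_for_degrees(speed=50, degrees=-25)  # close

# Example: turn the left eye sideways, then open and close the mouth.
set_eye_angle(LargeMotor(OUTPUT_A), 90)
open_and_close_mouth(LargeMotor(OUTPUT_C))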
Evie also has a neck, which is used when the Alexa wake-word is active:
self.neck_motor.on(15)
while True:
    # Poll the colour sensor until Alexa's light ring is detected.
    refl = self.colour.ambient_light_intensity
    print("Col Val:{}".format(refl), file=sys.stderr)
    if refl >= 5:
        # Stop searching once the ring light is seen.
        self.neck_motor.off()
        break
The neck is driven by a large motor connected to a turntable. Evie looks for the cyan colour of Alexa's light ring and stops when she detects it. As described above, Evie's colour sensor is used for the neck; it detects the cyan colour in ambient light mode. Originally, I had thought of using one of the Alexa Gadgets interfaces for this, but the StateListener interface did not give me the information I needed.
Evie is equipped with a smartphone stand on the top of her 'head.'
EVIE'S SKILL
Evie's corresponding Alexa skill is a demonstration of the expressions she can make, not all of which are used in her main code. The skill allows the user to ask for a specific expression, and if Evie can make it, she will! She'll hold the expression for about five seconds before returning to neutral, and Alexa will prompt for a new one. If each eye has at least four different positions and the mouth three, then Evie can form a total of 4 × 4 × 3 = 48 expressions. The skill only scratches the surface of these.
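That count is easy to verify by enumerating the combination space. The snippet below is illustrative only; the specific eye angles and mouth states are assumptions, not Evie's actual positions.

from itertools import product

EYE_POSITIONS = [0, 90, 180, 270]                # four assumed angles per eye
MOUTH_POSITIONS = ["open", "neutral", "closed"]  # three assumed mouth states

expressions = list(product(EYE_POSITIONS, EYE_POSITIONS, MOUTH_POSITIONS))
print(len(expressions))  # 4 * 4 * 3 = 48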
def on_custom_evie_gadget_control(self, directive):
    """
    Gathers the requested emotion from Alexa and displays it accordingly.
    """
    try:
        payload = json.loads(directive.payload.decode("utf-8"))
        print("Emotion: {}".format(payload), file=sys.stderr)
        response = payload["type"]
        emotive = payload["emotion"]
        if response == "emotion":
            print("Emotion: ({})".format(emotive), file=sys.stderr)
            # left/right motors drive the eyes; the center motor drives the
            # mouth. Each branch makes an expression, holds it, then reverses
            # the moves to return to neutral.
            if emotive in Emotion.KIND.value:
                self.left_motor.on_for_degrees(100, 180)
                self.right_motor.on_for_degrees(100, 180)
                self.center_motor.on_for_degrees(100, 25)
                time.sleep(1)
                self.left_motor.on_for_degrees(100, -180)
                self.right_motor.on_for_degrees(100, -180)
                self.center_motor.on_for_degrees(100, -25)
            elif emotive in Emotion.CONTENT.value:
                # Mouth only; the eyes stay put.
                self.center_motor.on_for_degrees(100, 25)
                time.sleep(1)
                self.center_motor.on_for_degrees(100, -25)
            elif emotive in Emotion.TIRED.value:
                self.left_motor.on_for_degrees(100, 180)
                self.right_motor.on_for_degrees(100, 180)
                self.center_motor.on_for_degrees(100, -25)
                time.sleep(1)
                self.left_motor.on_for_degrees(100, -180)
                self.right_motor.on_for_degrees(100, -180)
                self.center_motor.on_for_degrees(100, 25)
            elif emotive in Emotion.DISAPPOINTED.value:
                self.center_motor.on_for_degrees(100, -25)
                time.sleep(1)
                self.center_motor.on_for_degrees(100, 25)
            elif emotive in Emotion.ASLEEP.value:
                # Eyes only; the mouth stays put.
                self.left_motor.on_for_degrees(100, -180)
                self.right_motor.on_for_degrees(100, -180)
                time.sleep(1)
                self.left_motor.on_for_degrees(100, 180)
                self.right_motor.on_for_degrees(100, 180)
            elif emotive in Emotion.NEUTRAL.value:
                print("This is my neutral face!", file=sys.stderr)
    except KeyError:
        print("No Emotion Specified: {}".format(directive), file=sys.stderr)
WHY EVIE?
Smart-home assistants make life a lot easier by doing the little things we need. However, I always found it disconcerting that they never had a face, especially since an assistant's voice can be monotonous. Even now, with smart devices available with displays, they have no face or personality. As the future turns towards AI, it makes more and more sense for smart assistants to become more humanoid. Evie is a step in that direction, and she demonstrates the possibilities that can be explored for smart-home assistants.