My Alexa Mindstorms Idea
One day, when my family and I came home after being away for a few hours,
we found that my dress shoes inside the house had been chewed up and
torn by my dog. That was when I realized how big an issue it is to
leave a pet home alone: they get hungry and anxious. To solve this
problem, I wanted to build a LEGO Mindstorms creation that could be
controlled from outside the house. I ran into a dilemma when I wanted to
access it remotely, so I thought: why not use Alexa? Most headphones
and some phones have built-in Alexa features, and Alexa can already be
used to control things such as the lighting in your house while you are away.
Things I wanted to achieve:
- A dog feeder, so food can be given while no one is at home.
- A treat dispenser, to reward good behavior or engage the dog.
- A pet corrector (a spray bottle that produces a sound to stop bad pet behavior).
My ideas
All of the parts would be built inside a dog-shaped Mindstorms robot:
- The mouth contains the pet corrector and its mechanism, so that the sound can come straight out.
- The back contains a treat dispenser that releases treats with a conveyor belt, plus housing to hold them.
- The stomach contains the food bowl, which extends out when you command it to.
Here is a Demonstration of it Working
You should reference the Alexa - EV3Dev setup missions here. They give a thorough overview of what you need and how to use it.
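For context, those setup missions have you register the EV3 brick as an Alexa Gadget using an INI file stored on the brick. A sketch of what that file looks like in the missions (the ID and secret are placeholders you get from the Alexa developer console; this reflects the mission setup, not anything specific to my build):

```ini
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET

[GadgetCapabilities]
; Lets the skill send Custom.Mindstorms.Gadget directives to the brick
Custom.Mindstorms.Gadget = 1.0
```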
Original Drawings
1st Prototypes
These original ideas had many problems, such as reliability and size. The treat dispenser would often misfire, the pet corrector wouldn't let out the whole noise and would often bend, and the food dispenser would jam easily. I decided to address these issues in my 2nd prototypes.
2nd Prototypes
Pros
My second prototypes still have one or two problems here and there, but the pet corrector works much better and more consistently, and the treat dispenser is more reliable. The food giver now has a smaller, less bulky bowl and works much better, and the treat giver now consistently gives treats one at a time.
Things that still need to be improved:
The walls of the treat dispenser need to be fixed, as they are easy to break, and the treat holder needs some work, along with the color sensor mount. Also, the moving mechanism of the food dispenser tends to break. It seems to be fatigue, as the axles have been bent for a while.
3rd Prototype
In my 3rd prototype, I have started joining everything together into the intended dog shape that I outlined in my drawings. The food giver and treat dispenser are joined together in a box-like design that acts as the body of the dog.
In this model, the pet corrector didn't make the cut, as it was a very large item that required a lot of torque to press properly. I am continuing to improve on it, but sadly, due to time restrictions, I won't be able to add that feature for the contest.
Final Physical Product
The second part of this project is the coding. For this, I used Python on the LEGO Mindstorms brick running ev3dev, and I used Node.js for the Alexa skill.
Python Code
Iteration 1
import sys
from enum import Enum

from agt import AlexaGadget
from ev3dev2.led import Leds
from ev3dev2.motor import (LargeMotor, MediumMotor, OUTPUT_A, OUTPUT_B,
                           OUTPUT_C, SpeedPercent, SpeedRPM)
from ev3dev2.sound import Sound

#.....................................................................................
# Command values
#.....................................................................................
class Command(Enum):
    """
    The list of preset commands and their invocation variations.
    These variations correspond to the skill slot values.
    """
    treat = ['treat']
    dinner = ['dinner']
    PetCorrector = ['Pet Corrector']
    Dinnerend = ['end dinner']

#.....................................................................................
# Define
#.....................................................................................
class MindstormsGadget(AlexaGadget):
    """
    A Mindstorms gadget that performs movement based on voice commands.
    Three commands are available: Treat, Dinner, and Pet Corrector.
    """

    def __init__(self):
        """
        Performs Alexa Gadget initialization routines and ev3dev resource allocation.
        """
        super().__init__()

        # Ev3dev initialization
        self.leds = Leds()
        self.sound = Sound()
        self.Foodgiver = LargeMotor(OUTPUT_B)
        self.Treatgiver = LargeMotor(OUTPUT_C)
        self.PetCorrector = MediumMotor(OUTPUT_A)

    #.....................................................................................
    # Pet program
    #.....................................................................................
    def _activate(self, command):
        """
        Handles preset commands.
        :param command: the preset command
        """
        print("Activate command: ({}, {})".format(command, 200), file=sys.stderr)
        if command in Command.treat.value:
            self.Treatgiver.on_for_rotations(SpeedPercent(75), 95/360)
        if command in Command.dinner.value:
            self.Foodgiver.on_for_rotations(SpeedPercent(100), 1560/360)
        if command in Command.Dinnerend.value:
            self.Foodgiver.on_for_rotations(SpeedPercent(100), -1560/360)
        if command in Command.PetCorrector.value:
            # Bug: self.drive is never initialized, which is part of why the
            # pet corrector did not work in this iteration.
            self.drive.on_for_seconds(SpeedRPM(200), 3)
In this code, everything worked except for the pet corrector. There was also a problem with the treats: you had to say "treat" many times before the first treat came out.
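The fix for the unreliable treats was to keep the conveyor running until the color sensor actually sees a treat, instead of turning the motor a fixed amount. A hardware-free sketch of that idea (the sensor readings and the threshold of 10 stand in for the EV3 ColorSensor's reflected_light_intensity; the helper name is my own):

```python
def dispense_one_treat(sensor_reads, threshold=10):
    """Keep the conveyor running while the reflected-light reading is at or
    below the threshold (no treat in front of the sensor), and stop as soon
    as a treat is detected. Returns how many motor steps were needed.
    """
    steps = 0
    for reading in sensor_reads:
        if reading > threshold:
            break        # treat detected: stop the Treatgiver motor
        steps += 1       # no treat yet: keep the conveyor moving
    return steps

# A treat appears in front of the sensor after four low readings:
dispense_one_treat([3, 4, 2, 5, 60])  # -> 4
```

This is exactly the structure of the `while self.color.reflected_light_intensity <= 10` loop in Iteration 2 below, just decoupled from the hardware so it can be reasoned about on its own.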
Iteration 2
import sys
import time
from enum import Enum

from agt import AlexaGadget
from ev3dev2.led import Leds
from ev3dev2.motor import (LargeMotor, OUTPUT_A, OUTPUT_B, OUTPUT_C,
                           SpeedPercent)
from ev3dev2.sensor.lego import ColorSensor
from ev3dev2.sound import Sound

#.....................................................................................
# Command definitions
#.....................................................................................
class Command(Enum):
    """
    The list of preset commands and their invocation variations.
    These variations correspond to the skill slot values.
    """
    treat = ['treat']
    dinner = ['dinner']
    PetCorrector = ['No']
    Dinnerend = ['end dinner']

#.....................................................................................
# Setup
#.....................................................................................
class MindstormsGadget(AlexaGadget):
    """
    A Mindstorms gadget that performs movement based on voice commands.
    Three commands are available: Treat, Dinner, and Pet Corrector.
    """

    def __init__(self):
        """
        Performs Alexa Gadget initialization routines and ev3dev resource allocation.
        """
        super().__init__()

        # Ev3dev initialization
        self.leds = Leds()
        self.sound = Sound()
        self.Foodgiver = LargeMotor(OUTPUT_B)
        self.Treatgiver = LargeMotor(OUTPUT_C)
        self.color = ColorSensor()
        self.PetCorrector = LargeMotor(OUTPUT_A)

    #.....................................................................................
    # My code for the Pet Nanny
    #.....................................................................................
    def _activate(self, command):
        """
        Handles preset commands.
        :param command: the preset command
        """
        print("Activate command: ({}, {})".format(command, 200), file=sys.stderr)
        if command in Command.treat.value:
            # Run the conveyor until the color sensor sees a treat,
            # then push it the rest of the way out.
            while self.color.reflected_light_intensity <= 10:
                self.Treatgiver.on(SpeedPercent(-60))
            self.Treatgiver.off()
            self.Treatgiver.on_for_rotations(SpeedPercent(-100), 100/360)
        if command in Command.dinner.value:
            self.Foodgiver.on_for_rotations(SpeedPercent(100), 1560/360)
        if command in Command.Dinnerend.value:
            self.Foodgiver.on_for_rotations(SpeedPercent(100), -1560/360)
        if command in Command.PetCorrector.value:
            # Press the pet corrector, wait, then release it.
            self.PetCorrector.on_for_seconds(SpeedPercent(100), 3)
            time.sleep(1)
            self.PetCorrector.on_for_seconds(SpeedPercent(-100), 3)
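Neither iteration shows how `_activate` gets called. In the setup missions, the skill sends a `Custom.Mindstorms.Gadget` control directive to the EV3, and the gadget decodes the directive's JSON payload to pick a command. A minimal, hardware-free sketch of that payload decoding (the payload shape follows the mission code; the helper name `parse_control_payload` is my own):

```python
import json

def parse_control_payload(raw: bytes):
    """Decode the directive payload sent by the skill and extract the
    command name, assuming the mission-style payload shape:
    {"type": "command", "command": "<slot value>"}.
    Returns None for payloads that are not command requests.
    """
    payload = json.loads(raw.decode("utf-8"))
    if payload.get("type") == "command":
        return payload.get("command")
    return None

# The payload the skill would send for "Alexa, give a treat":
parse_control_payload(b'{"type": "command", "command": "treat"}')  # -> "treat"
```

On the real brick, this decoding happens inside the gadget's directive callback, which then passes the extracted command string to `_activate`.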
Skill Code
Model code
Iteration 1
Model.json
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "pet.nanny",
      "intents": [
        {
          "name": "AMAZON.CancelIntent",
          "samples": []
        },
        {
          "name": "AMAZON.HelpIntent",
          "samples": []
        },
        {
          "name": "AMAZON.StopIntent",
          "samples": []
        },
        {
          "name": "AMAZON.NavigateHomeIntent",
          "samples": []
        },
        {
          "name": "SetCommandIntent",
          "slots": [
            {
              "name": "Command",
              "type": "CommandType"
            }
          ],
          "samples": [
            "give {Command}",
            "spray {Command}",
            "give a {Command}",
            "activate {Command}",
            "{Command}"
          ]
        }
      ],
      "types": [
        {
          "name": "CommandType",
          "values": [
            {
              "name": {
                "value": "treat"
              }
            },
            {
              "name": {
                "value": "dinner"
              }
            },
            {
              "name": {
                "value": "end dinner"
              }
            }
          ]
        }
      ]
    }
  }
}
Iteration 2
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "pet.nanny",
      "intents": [
        {
          "name": "AMAZON.CancelIntent",
          "samples": []
        },
        {
          "name": "AMAZON.HelpIntent",
          "samples": []
        },
        {
          "name": "AMAZON.StopIntent",
          "samples": []
        },
        {
          "name": "AMAZON.NavigateHomeIntent",
          "samples": []
        },
        {
          "name": "SetCommandIntent",
          "slots": [
            {
              "name": "Command",
              "type": "CommandType"
            }
          ],
          "samples": [
            "give {Command}",
            "spray {Command}",
            "give a {Command}",
            "activate {Command}",
            "{Command}"
          ]
        }
      ],
      "types": [
        {
          "name": "CommandType",
          "values": [
            {
              "name": {
                "value": "treat"
              }
            },
            {
              "name": {
                "value": "dinner"
              }
            },
            {
              "name": {
                "value": "Corrector"
              }
            },
            {
              "name": {
                "value": "end dinner"
              }
            }
          ]
        }
      ]
    }
  }
}
Lambda
The Lambda code that I have used is very similar to that of mission 3.
Iteration 1
- Index.js
The code prompt wouldn't transfer the indents well, so I have linked it here.
- Package.json
{
  "name": "agt-mindstorms",
  "version": "1.1.0",
  "description": "A skill using AGT with Lego Mindstorms that takes care of your pet when you are away from home",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Amazon Alexa",
  "license": "ISC",
  "dependencies": {
    "ask-sdk-core": "^2.6.0",
    "ask-sdk-model": "^1.18.0",
    "aws-sdk": "^2.326.0",
    "request": "^2.81.0"
  }
}
- util.js
The code prompt wouldn't transfer the indents well, so I have linked it here.
- common.js
The code prompt wouldn't transfer the indents well, so I have linked it here.
Notes
For the build instructions, I couldn't use modeling software for the base, as I used pieces from the Spike Prime set, but the parts that were affected by this will have many detailed pictures to guide you through building them.
The code I have included here is the final version.