As the owner of a cat with an insatiable appetite, I wanted a simple and convenient way to make sure my pet is always fed. By integrating LEGO MINDSTORMS EV3 with Alexa, we can make this process both smart and efficient.
Equipment needed: To build this project, we need the LEGO Education EV3 Core Set (45544) and the LEGO Education EV3 Expansion Set (45560), as the frame requires many pieces.
This project requires a custom structure. We will walk through the build step by step so that you can modify it to fit your pet bowl's size and measurements. The LEGO build instructions can also be found in the attachment section - they illustrate the whole LEGO P3T FEEDER build process.
We first need to create a structure for slow spinning (which will play a part in dispensing the food). A structure as shown below can do the job.
Make sure the spinner fits the bowl; if it does not fit, we can extend the spinner itself. Once we achieve a perfect fit, use another 3m Beam to lock the spinner in position.
Once that is settled, we will use the Medium Motor for the dispensing function - it both drives the spinner and reports the exact rotation of the feeder to ev3dev. Mounting the L-shaped beam at the bottom is also the better option here, since it can scrape the bottom of the plate.
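Because the Medium Motor's rotation can be commanded in exact degrees, it helps to know roughly how much food one full turn of the spinner dispenses. The sketch below is a back-of-the-envelope calculation, not part of the project code: GRAMS_PER_ROTATION is a hypothetical calibration value you would measure yourself by running the spinner one full turn and weighing what drops out.

```python
# Rough sketch: convert a desired portion size into motor degrees.
# GRAMS_PER_ROTATION is a made-up calibration value - measure your own
# by running the spinner one full turn and weighing the dispensed food.
GRAMS_PER_ROTATION = 12.0  # assumption: one 360-degree turn drops ~12 g

def portion_to_degrees(grams):
    """Return the motor rotation (in degrees) needed for the given portion."""
    rotations = grams / GRAMS_PER_ROTATION
    return round(rotations * 360)

print(portion_to_degrees(24))  # 24 g -> two full turns -> 720 degrees
```

With a calibration like this, the 720 degrees used later in `_feed_handler` corresponds to a two-rotation serving.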
The legs are built following the 6-page guide on LEGO Education Instructions. For the middle part, we will be using a 15m beam instead so as to raise the structure higher.
With the Core Set and Expansion Set, we should have enough to build out 4 legs:
The legs both support the structure and lock it in place so that it can hold a bowl. They require about 8x 15m beams (another reason why the Expansion Set is necessary).
When the build is complete, we should have a stable structure like the one below:
We can install the EV3 brick on one side to hold the structure as shown. The Color Sensor will be connected to Input 1 and the Medium Motor to Motor A.
Now we can cut a hole in the plate itself to create an opening for the food to drop through.
A cone funnel can also be implemented to avoid spillage.
We can adjust the height a little bit here - play with the structure as needed. When the process is complete, we will have built an apparatus that looks like the following:
Now you can officially test whether the feeder works by controlling the motor. This also serves as a first test of the structure and basic operation.
However, this structure still presents an issue - the pet's capacity and comfort when eating from the bowl. We can add more LEGO pieces to improve the feeder. First, let's merge the two biggest wing pieces, which creates something like what you see below:
Then, we can merge 8x 11m beams and 2x 9m beams to create a platform:
And 2x 13m beams to block the sides, preventing spillage:
Put this at the bottom where the food trickles out of the cone:
The color sensor will be placed on a 15m beam so it can be attached to the platform itself.
Now we have an improved-upon, final P3T Feeder - an apparatus that is much more pet-friendly and user-friendly.
If you move the brick to the top, you can see the LED indicator from a bird's-eye view - this makes it more user-friendly, since humans naturally look at pets and pet feeders from above.
At this point, we can also place the brick on top so as to make it a little easier to take videos. When all is said and done, we now have our P3T Feeder!
As previously mentioned, the entire LEGO Build Instructions for this project can be found in the attachment section. We used Studio 2.0 to create it.
In order to complete this project, we first need to complete the Setup and all of the missions from the AlexaGadgets guides. They walk us through setting up all of the devices as well as the Alexa skill, and provide insight into how everything fits together. The guides are located at:
https://www.hackster.io/alexagadgets
- Setup helps us set up the ev3dev environment
- Mission 1 helps us pair the EV3 brick with an Amazon Echo
- Mission 2 uses the music tempo
- Mission 3 sets up the Alexa skill on the server side
- Mission 4 focuses on EventsReceivedRequestHandler
When these missions are complete, we will modify Mission 4's code for our project.
Step 3: Writing the Alexa Server App
This step follows on from Mission 4, so we assume the entire Alexa skill with the EV3 is already running. First, we change the invocationName from "mindstorms" to "pet feeder". Then we focus on two intents: FeedIntent activates the feeder, and AutoIntent toggles automatic feeding, which detects whether the bowl is empty via the Color Sensor.
{
    "interactionModel": {
        "languageModel": {
            "invocationName": "pet feeder",
            "intents": [
                {
                    "name": "AMAZON.CancelIntent",
                    "samples": []
                },
                {
                    "name": "AMAZON.HelpIntent",
                    "samples": []
                },
                {
                    "name": "AMAZON.StopIntent",
                    "samples": []
                },
                {
                    "name": "AMAZON.NavigateHomeIntent",
                    "samples": []
                },
                {
                    "name": "FeedIntent",
                    "slots": [],
                    "samples": [
                        "Feed the cat",
                        "Feed the dog",
                        "Feed my pet"
                    ]
                },
                {
                    "name": "AutoIntent",
                    "slots": [
                        {
                            "name": "OnOff",
                            "type": "OnOffType"
                        }
                    ],
                    "samples": [
                        "Turn {OnOff} auto feeding",
                        "Turn {OnOff} auto",
                        "Turn {OnOff} auto mode"
                    ]
                }
            ],
            "types": [
                {
                    "name": "OnOffType",
                    "values": [
                        {
                            "name": {
                                "value": "on"
                            }
                        },
                        {
                            "name": {
                                "value": "off"
                            }
                        }
                    ]
                }
            ]
        }
    }
}
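Before uploading the interaction model to the Alexa developer console, it can be handy to sanity-check it programmatically. A minimal sketch in Python (the embedded JSON is a trimmed copy of the model above, keeping only the two custom intents):

```python
import json

# Trimmed copy of the interaction model above (custom intents only).
MODEL = """
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "pet feeder",
      "intents": [
        {"name": "FeedIntent", "slots": [],
         "samples": ["Feed the cat", "Feed the dog", "Feed my pet"]},
        {"name": "AutoIntent",
         "slots": [{"name": "OnOff", "type": "OnOffType"}],
         "samples": ["Turn {OnOff} auto feeding", "Turn {OnOff} auto",
                     "Turn {OnOff} auto mode"]}
      ]
    }
  }
}
"""

# Parse the model and list each intent with its sample utterances.
model = json.loads(MODEL)["interactionModel"]["languageModel"]
print("Invocation:", model["invocationName"])
for intent in model["intents"]:
    print(intent["name"], "->", intent["samples"])
```

A check like this catches malformed JSON or a missing slot reference before the console does.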
Under index.js, we first register the two new intent handlers, FeedIntentHandler and AutoIntentHandler:
// The SkillBuilder acts as the entry point for your skill, routing all request and response
// payloads to the handlers above. Make sure any new handlers or interceptors you've
// defined are included below. The order matters - they're processed top to bottom.
exports.handler = Alexa.SkillBuilders.custom()
    .addRequestHandlers(
        LaunchRequestHandler,
        FeedIntentHandler,
        AutoIntentHandler,
        EventsReceivedRequestHandler,
        ExpiredRequestHandler,
        Common.HelpIntentHandler,
        Common.CancelAndStopIntentHandler,
        Common.SessionEndedRequestHandler,
        Common.IntentReflectorHandler, // make sure IntentReflectorHandler is last so it doesn't override your custom intent handlers
    )
    .addRequestInterceptors(Common.RequestInterceptor)
    .addErrorHandlers(
        Common.ErrorHandler,
    )
    .lambda();
Inside FeedIntentHandler, we send a directive with both type and command set to "feed" down to the EV3; the speechOutput is "Dispensing food", which Alexa speaks back to us.
// Construct and send a custom directive to the connected gadget with data from
// the FeedIntent.
const FeedIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'FeedIntent';
    },
    handle: function (handlerInput) {
        const attributesManager = handlerInput.attributesManager;
        let endpointId = attributesManager.getSessionAttributes().endpointId || [];
        let speed = attributesManager.getSessionAttributes().speed || "50";

        // Construct the directive with the payload containing the feed parameters
        let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
            {
                type: 'feed',
                command: 'feed',
                speed: speed
            });

        let speechOutput = 'Dispensing food';
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC)
            .addDirective(directive)
            .getResponse();
    }
};
In AutoIntentHandler, we can turn auto mode on or off - the choice is captured by the OnOff slot.
// Construct and send a custom directive to the connected gadget with data from
// the AutoIntentHandler.
const AutoIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'AutoIntent';
    },
    handle: function (handlerInput) {
        let onoff = Alexa.getSlotValue(handlerInput.requestEnvelope, 'OnOff');
        if (!onoff) {
            return handlerInput.responseBuilder
                .speak("Can you repeat that?")
                .withShouldEndSession(false)
                .getResponse();
        }

        const attributesManager = handlerInput.attributesManager;
        let endpointId = attributesManager.getSessionAttributes().endpointId || [];
        let speed = attributesManager.getSessionAttributes().speed || "50";

        // Construct the directive with the payload containing the auto-mode parameters
        let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
            {
                type: 'auto',
                command: onoff,
                speed: speed
            });

        let speechOutput = 'Turning auto feeding ' + onoff;
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC)
            .addDirective(directive)
            .getResponse();
    }
};
Finally, we have EventsReceivedRequestHandler, which runs when the EV3 brick pushes information back; in this case, we check that the event name is "Feeder" before loading the speech.
const EventsReceivedRequestHandler = {
    // Checks for a valid token and endpoint.
    canHandle(handlerInput) {
        let { request } = handlerInput.requestEnvelope;
        console.log('Request type: ' + Alexa.getRequestType(handlerInput.requestEnvelope));
        if (request.type !== 'CustomInterfaceController.EventsReceived') return false;

        const attributesManager = handlerInput.attributesManager;
        let sessionAttributes = attributesManager.getSessionAttributes();
        let customEvent = request.events[0];

        // Validate event token
        if (sessionAttributes.token !== request.token) {
            console.log("Event token doesn't match. Ignoring this event");
            return false;
        }

        // Validate endpoint
        let requestEndpoint = customEvent.endpoint.endpointId;
        if (requestEndpoint !== sessionAttributes.endpointId) {
            console.log("Event endpoint id doesn't match. Ignoring this event");
            return false;
        }
        return true;
    },
    handle(handlerInput) {
        console.log("== Received Custom Event ==");
        let customEvent = handlerInput.requestEnvelope.request.events[0];
        let payload = customEvent.payload;
        let name = customEvent.header.name;

        let speechOutput;
        if (name === 'Feeder') {
            speechOutput = payload.speech;
        }
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC, "REPLACE_ALL")
            .getResponse();
    }
};
Step 4: ev3dev2 Python code
Now that the Alexa portion is complete, we can work on the ev3dev Python portion. We will closely follow Mission 4's Python code, walk through the differences, and you can try the code yourself in the code section.
The project uses the Custom.Mindstorms.Gadget namespace, which is declared inside petfeeder.ini:
[GadgetCapabilities]
Custom.Mindstorms.Gadget = 1.0
First, we need to strip out the parts of Mission 4 we do not use; this project only needs the Medium Motor and the Color Sensor.
import os
import sys
import time
import logging
import json
import random
import threading
from agt import AlexaGadget
from ev3dev2.led import Leds
from ev3dev2.sound import Sound
from ev3dev2.sensor.lego import ColorSensor
from ev3dev2.motor import OUTPUT_A, MediumMotor, SpeedPercent
Inside __init__, we set up feeder as a MediumMotor and a ColorSensor to check the color of the bowl. _autofeed_thread will run in the background.
def __init__(self):
    """
    Performs Alexa Gadget initialization routines and ev3dev resource allocation.
    """
    super().__init__()

    # Robot state
    self.auto_mode = False
    self.feeder = MediumMotor(OUTPUT_A)
    self.sound = Sound()
    self.leds = Leds()
    self.color = ColorSensor()

    # Start threads
    threading.Thread(target=self._autofeed_thread, daemon=True).start()
Inside _autofeed_thread, we check self.auto_mode. When auto mode is on and the sensor sees green or blue (the color of our demo plate, meaning the food is gone), the feeder dispenses one cycle (360 degrees) and uses send_custom_event to have Alexa say: "Food is low, automatic dispensed". That event is handled by EventsReceivedRequestHandler on the server side.
def _autofeed_thread(self):
    """
    Monitors the bowl color and dispenses food when auto mode is on.
    """
    while True:
        print(self.color.color)
        # Green or blue means the bare plate is showing, i.e. the bowl is empty
        if self.color.color in (self.color.COLOR_GREEN, self.color.COLOR_BLUE):
            self.leds.set_color("LEFT", "RED")
            self.leds.set_color("RIGHT", "RED")
        else:
            self.leds.set_color("LEFT", "GREEN")
            self.leds.set_color("RIGHT", "GREEN")

        if self.auto_mode:
            # While the plate color is visible, keep dispensing food
            if self.color.color in (self.color.COLOR_GREEN, self.color.COLOR_BLUE):
                self._send_event({'speech': 'Food is low, automatic dispensed'})
                self.feeder.on_for_degrees(SpeedPercent(30), 360)

        time.sleep(1)
def _send_event(self, payload):
    """
    Sends a custom Feeder event back to the Alexa skill.
    :param payload: the JSON payload to send with the event
    """
    self.send_custom_event('Custom.Mindstorms.Gadget', "Feeder", payload)
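Because the auto-feed decision depends only on the auto-mode flag and the sensor reading, you can exercise it off-device before deploying to the brick. This is a sketch under the assumption that your plate reads as green or blue, as ours does; FakeColorSensor is an invented stand-in for ev3dev2's ColorSensor, not part of the library:

```python
# Off-device sketch of the auto-feed decision: dispense only when auto mode
# is on AND the sensor sees the bare plate (green or blue in our demo).
# FakeColorSensor is a hypothetical test double for ev3dev2's ColorSensor.
class FakeColorSensor:
    COLOR_BLUE, COLOR_GREEN, COLOR_BROWN = 2, 3, 7  # ev3dev2-style constants

    def __init__(self, color):
        self.color = color

def should_dispense(auto_mode, sensor):
    """Mirror of the condition checked inside _autofeed_thread."""
    plate_visible = sensor.color in (sensor.COLOR_GREEN, sensor.COLOR_BLUE)
    return auto_mode and plate_visible

empty_bowl = FakeColorSensor(FakeColorSensor.COLOR_GREEN)
full_bowl = FakeColorSensor(FakeColorSensor.COLOR_BROWN)  # kibble covers the plate

print(should_dispense(True, empty_bowl))   # True: auto on, plate visible
print(should_dispense(True, full_bowl))    # False: food still covers the plate
print(should_dispense(False, empty_bowl))  # False: auto mode is off
```

Factoring the condition into a small function like this also makes it easy to tweak the trigger colors for a different plate.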
In on_custom_mindstorms_gadget_control, we handle the two intents coming from Alexa: FeedIntent arrives with control_type "feed", and AutoIntent with control_type "auto".
def on_custom_mindstorms_gadget_control(self, directive):
    """
    Handles the Custom.Mindstorms.Gadget control directive.
    :param directive: the custom directive with the matching namespace and name
    """
    try:
        payload = json.loads(directive.payload.decode("utf-8"))
        print("Control payload: {}".format(payload), file=sys.stderr)
        control_type = payload["type"]
        if control_type == "feed":
            self._feed_handler()
        elif control_type == "auto":
            self._auto_handler(payload["command"])
    except KeyError:
        print("Missing expected parameters: {}".format(directive), file=sys.stderr)
Inside _feed_handler, once we receive the command from Alexa, we spin the feeder twice (720 degrees) to dispense food into the bowl.
def _feed_handler(self):
    # Turn 720 degrees (two full rotations) to dispense food
    self.feeder.on_for_degrees(SpeedPercent(30), 720)
Inside _auto_handler, we set self.auto_mode on or off - the flag is then picked up by _autofeed_thread, as explained earlier.
def _auto_handler(self, onoff):
    if onoff == "on":
        self.auto_mode = True
    else:
        self.auto_mode = False
By the final stage, we will be able to dispense food:
You can test the functions with the following phrases:
To access the app: "Alexa, open Pet Feeder"
To dispense food: "Alexa, feed my pet" or "Alexa, feed the cat"
To turn auto mode on: "Alexa, turn on auto feeding"
To turn auto mode off: "Alexa, turn off auto mode"
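Each of these phrases ultimately arrives at the EV3 as a small JSON payload. The mapping below is illustrative (the payload shapes mirror what FeedIntentHandler and AutoIntentHandler construct above; the phrase keys themselves are just the test phrases, not anything the code parses):

```python
# Illustrative mapping from test phrases to the directive payloads the
# skill sends (payload shapes taken from the intent handlers above).
PHRASE_TO_PAYLOAD = {
    "Alexa, feed my pet":          {"type": "feed", "command": "feed", "speed": "50"},
    "Alexa, turn on auto feeding": {"type": "auto", "command": "on",   "speed": "50"},
    "Alexa, turn off auto mode":   {"type": "auto", "command": "off",  "speed": "50"},
}

for phrase, payload in PHRASE_TO_PAYLOAD.items():
    print(phrase, "->", payload)
```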
We demonstrated these commands and functions in the video below:
Step 6: Using it with your time
Although Toby was initially camera-shy during our demo shoot, we caught him happily using it later on. He loves his new LEGO P3T Feeder!
He was so immersed in his meal that he didn't realize we were there!
Once he warmed up to the camera a bit, Toby graciously struck a pose in front of his new favorite toy :)