Welcome to the world of LEGO MINDSTORMS EV3 and voice with Alexa! These instructions are part of a series created to show you how to connect your EV3 Brick to Alexa, and build custom voice-based interactions that incorporate EV3 motors and sensors. With this knowledge, you can build and submit your own creation to the LEGO MINDSTORMS Voice Challenge – Powered by Alexa. Don’t be shy bringing in other LEGO elements — let your imagination run wild!
Here is what is included in this series:
- Setup: Get your EV3 development environment set up.
- Mission 1: Get your EV3 Brick connected to a compatible Echo device, and reacting to the wake word.
- Mission 2: Build EV3RSTORM's legs, and react to music played from your Echo device.
- Mission 3: Add arms and a cannon to EV3RSTORM, and react to voice commands using an Alexa skill you create.
- Mission 4: Give EV3RSTORM eyes (IR Sensor), and make your Alexa skill react anytime an intruder is detected. (you are here)
Helpful resources
Throughout these missions, several resources will be referenced along the way. These references will come in handy as you learn more about the different ways you can create your own LEGO MINDSTORMS EV3 creation that works with Alexa:
- Alexa Gadgets Toolkit Overview
- Alexa Skills Kit Documentation
- EV3 Python API Documentation
- EV3Dev Python Robot Examples
In Mission 3, you created an Alexa skill that sent custom commands to your EV3 Brick to make EV3RSTORM move. In Mission 4, you will explore how to get an Alexa skill to react to an intruder detected by EV3RSTORM’s proximity sensor. You will:
- Add the head (Infrared Sensor) to EV3RSTORM
- Modify the Alexa skill you created in Mission 3
- Send data from EV3RSTORM’s proximity sensor to your skill
- Make your Alexa skill react to the detected intruder
In the end, you’ll be able to interact with EV3RSTORM like this:
The code for this mission can be found in the `alexa-gadgets-mindstorms/mission-04` folder in your VS Code workspace. Let’s walk through the steps to complete this mission!
For this mission, you will need to have followed the EV3RSTORM build instructions up to page 117. This will give EV3RSTORM everything he needs for this mission. Make sure you build the Cannon Arm and connect the Infrared Sensor (EV3RSTORM’s eyes). The build should look like this:
For this mission, you can build upon the skill you created for Mission 3. You will need to update the interaction model, and modify the skill code.
Let’s start by updating the interaction model of your skill to accept a new `CommandType` called `sentry`. You can find your skill by signing in to the Alexa Developer Console and selecting your MINDSTORMS skill to edit it:
1. Make sure Build is selected in the top navigation.
2. Click on JSON Editor under the Interaction Model section.
3. Look for `CommandType` in the `types` section (around line 132), and add a new value that looks like this:
{
    "name": {
        "value": "sentry"
    }
}
If you want, you can also copy the entirety of the `model.json` file in the `/alexa-gadgets-mindstorms/mission-04` folder and paste it into the Interaction Model JSON Editor to override what’s there.
4. Once you’ve made those changes, click on Save Model and Build Model. It may take some time for the model to build.
Before you move forward, you will also need to update the following files by copying the entirety of the code from the files within the `/mission-04/skill-nodejs/lambda/` folder and pasting it into the respective files of your skill:
- util.js
- package.json
Once you’ve updated these files, you’re ready to make additional modifications to your skill.
Creating the Event Handler
In order for a skill to receive an event from the `MindstormsGadget`, you will need to add an Event Handler to your skill code. An Event Handler defines how your skill should respond when it receives an event from your EV3 Brick. To create an event handler, you use a `StartEventHandlerDirective`, which can accept a number of different attributes. Learn more about Event Handlers in the documentation.
For this mission, you only need to specify the `token` and `timeout` attributes. The `token` can be any unique string; it is associated with all events forwarded by this event handler. The `timeout` value, in milliseconds, is the duration for which the event handler remains active.
In VS Code, take a look at line 40 in the `alexa-gadgets-mindstorms/mission-04/lambda/index.js` file. A `StartEventHandlerDirective` is created within the `LaunchRequestHandler`:
// Set the token to track the event handler
const token = handlerInput.requestEnvelope.request.requestId;
Util.putSessionAttribute(handlerInput, 'token', token);

let speechOutput = "Welcome, voice interface activated";
return handlerInput.responseBuilder
    .speak(speechOutput + BG_MUSIC)
    .addDirective(Util.buildStartEventHandler(token, 60000, {}))
    .getResponse();
This event handler will be activated when you launch the skill by saying, “Alexa, open mindstorms,” which will then start listening for events coming from your EV3 Brick.
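The `Util.buildStartEventHandler(token, 60000, {})` call above comes from the `util.js` file you copied earlier. Purely for illustration, here is the general shape of the directive it assembles, sketched as a Python dict (field names follow the Alexa Gadgets Toolkit Custom Interface Controller documentation; this is not the actual `util.js` implementation):

```python
def build_start_event_handler(token, duration_ms, payload):
    """Illustrative sketch of a CustomInterfaceController.StartEventHandler
    directive; the real helper lives in util.js."""
    return {
        "type": "CustomInterfaceController.StartEventHandler",
        "token": token,  # unique string; incoming events must echo it back
        "expiration": {
            # Milliseconds the handler stays active (90000 is the maximum)
            "durationInMilliseconds": duration_ms,
            # Optional payload delivered with the Expired event
            "expirationPayload": payload,
        },
    }

directive = build_start_event_handler("request-id-123", 60000, {})
```

The `token` here is why the skill later compares `sessionAttributes.token` against incoming events: only events tagged with the matching token are consumed.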
Handling the Expiration Event
An event handler can only be active for a maximum duration of 90 seconds. When the duration expires, a custom event, `CustomInterfaceController.Expired`, is sent to the skill. Whenever you create an event handler, you should also implement a handler for this event. Within the skill code, the expiration request handler looks like this:
const ExpiredRequestHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'CustomInterfaceController.Expired';
    },
    handle(handlerInput) {
        // Set the token to track the event handler
        const token = handlerInput.requestEnvelope.request.requestId;
        Util.putSessionAttribute(handlerInput, 'token', token);

        const attributesManager = handlerInput.attributesManager;
        let duration = (attributesManager.getSessionAttributes().duration || 0) - 1;
        if (duration > 0) {
            Util.putSessionAttribute(handlerInput, 'duration', duration);

            // Extend the skill session by starting another event handler
            const speechOutput = `${duration} minutes remaining.`;
            return handlerInput.responseBuilder
                .addDirective(Util.buildStartEventHandler(token, 60000, {}))
                .speak(speechOutput)
                .getResponse();
        }
        else {
            // End the skill session
            return handlerInput.responseBuilder
                .speak("Skill duration expired. Goodbye.")
                .withShouldEndSession(true)
                .getResponse();
        }
    }
};
When you interact with the skill, you don't want the event handler to time out. If it did, you wouldn't be able to receive a custom event from EV3RSTORM. The code above extends the event handler when it expires – up to 10 minutes in this example.
Handling a Custom Event
Now that you have a way to receive an event within your skill, you need to tell the skill what to do when it receives one. To handle the custom event, the `EventsReceivedRequestHandler` is added to the Alexa request handlers at line 250 of `index.js`:
exports.handler = Alexa.SkillBuilders.custom()
    .addRequestHandlers(
        LaunchRequestHandler,
        SetSpeedIntentHandler,
        SetCommandIntentHandler,
        MoveIntentHandler,
        // handles custom events from the gadget
        EventsReceivedRequestHandler,
        ExpiredRequestHandler,
        Common.HelpIntentHandler,
        Common.CancelAndStopIntentHandler,
        Common.SessionEndedRequestHandler,
        Common.IntentReflectorHandler, // make sure IntentReflectorHandler is last so it doesn't override your custom intent handlers
    )
    .addRequestInterceptors(Common.RequestInterceptor)
    .addErrorHandlers(Common.ErrorHandler)
    .lambda();
Like all request handlers, you use two functions, `canHandle()` and `handle()`, to define how the handler should respond. Let’s take a look at how these functions are used, starting at line 150 of `index.js`:
Validating the Event
The `canHandle()` function is invoked on every request, and returning true consumes and handles the event. First, the event token is validated to make sure the event is coming from the expected event handler. Then, the endpoint ID is checked to confirm the event is coming from an expected gadget (your EV3 Brick):
canHandle(handlerInput) {
    let { request } = handlerInput.requestEnvelope;
    console.log('Request type: ' + Alexa.getRequestType(handlerInput.requestEnvelope));
    if (request.type !== 'CustomInterfaceController.EventsReceived') return false;

    const attributesManager = handlerInput.attributesManager;
    let sessionAttributes = attributesManager.getSessionAttributes();
    let customEvent = request.events[0];

    // Validate the event token
    if (sessionAttributes.token !== request.token) {
        console.log("Event token doesn't match. Ignoring this event");
        return false;
    }

    // Validate the endpoint
    let requestEndpoint = customEvent.endpoint.endpointId;
    if (requestEndpoint !== sessionAttributes.endpointId) {
        console.log("Event endpoint id doesn't match. Ignoring this event");
        return false;
    }
    return true;
}
In this mission, you only have one connected gadget, so endpoint validation isn’t needed, but it is included as an example.
Parsing the Custom Event
Once you’ve validated the event, the `handle` function processes the event data (`handlerInput`). In order to handle the request, you need to know the contents of the data, including the event name and the payload. Once you have this information, you can implement the rest of the logic. Here's how you obtain the `payload` and `name` from the `handlerInput`:
handle(handlerInput) {
    console.log("== Received Custom Event ==");
    let customEvent = handlerInput.requestEnvelope.request.events[0];
    let payload = customEvent.payload;
    let name = customEvent.header.name;
Handling the Proximity Event
Later on in this mission, you will be writing code that sends a custom event every time EV3RSTORM’s proximity sensor detects an intruder. When this occurs, the event handler in your skill tells Alexa to sound an alert:
let speechOutput;
if (name === 'Proximity') {
    let distance = parseInt(payload.distance);
    if (distance < 10) {
        speechOutput = "Intruder detected! What would you like to do?";
        return handlerInput.responseBuilder
            .speak(speechOutput, "REPLACE_ALL")
            .withShouldEndSession(false)
            .getResponse();
    }
}
Handling the Sentry Event
After receiving the alert from Alexa, you may choose to eliminate the threat by having EV3RSTORM fire the cannon. Once EV3RSTORM fires, an event is sent to have Alexa announce that the threat has been eliminated:
else if (name === 'Sentry') {
    if ('fire' in payload) {
        speechOutput = "Threat eliminated";
    }
}
Handling the Speech Event
Rather than firing, you may choose to have EV3RSTORM speak to scare away the intruder:
else if (name === 'Speech') {
    speechOutput = payload.speechOut;
}
...
return handlerInput.responseBuilder
    .speak(speechOutput + BG_MUSIC, "REPLACE_ALL")
    .getResponse();
Before moving on, copy the entirety of the `/mission-04/index.js` code from your VS Code workspace to the `index.js` file in the Alexa Skill Code Editor. Once you’ve made this update, click Deploy in the upper-right of the Alexa Skill Code Editor.
With your skill modified to support EV3RSTORM’s new functionality, it’s time to review and run the Python code that can send events to your Alexa skill. Using the same VS Code workspace on your computer from Mission 3, open the `mission-04` folder in the `alexa-gadgets-mindstorms` folder. You should see a familiar pair of files: an INI file and a Python file.
As with the other missions, the first thing you need to do is add your Amazon ID and Alexa Gadget Secret for your registered gadget. For other missions, you’ve also needed to add a capability, but since you’re using the same capability from Mission 3, we’re saving you a step and it’s already in the INI file. Within the `alexa-gadgets-mindstorms/mission-04` folder, open the `mission-04.ini` file, which should look like this:
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET
[GadgetCapabilities]
Custom.Mindstorms.Gadget = 1.0
Replace `YOUR_GADGET_AMAZON_ID` and `YOUR_GADGET_SECRET` with the ID and Secret tied to the Alexa Gadget you registered in the Amazon Developer Console.
In your skill code, you added a new `CommandType` value called `sentry`, and since you’ve already set up your EV3 Brick to receive directives of this type, you can make EV3RSTORM react to it. Open the `/alexa-gadgets-mindstorms/mission-04/mission-04.py` file to review the code.
You can handle this new command in the same way as the other directives, in the `on_custom_mindstorms_gadget_control` method on line 101 of `mission-04.py`. This new directive will be received by your EV3 Brick as a new command string.
Depending on the flexibility you want in the voice interaction model, you might need to handle different variations of the `sentry` command. The user might say “sentry”, “guard”, or “sentry mode” to activate proximity detection. To handle these variations, you can use an enumeration, as shown on line 23:
from enum import Enum

class Command(Enum):
    """
    The list of preset commands and their invocation variations.
    These variations correspond to the skill slot values.
    """
    SENTRY = ['guard', 'guard mode', 'sentry', 'sentry mode']
    FIRE_ONE = ['cannon', '1 shot', 'one shot']
    FIRE_ALL = ['all shots', 'all shot']
When you receive the `Custom.Mindstorms.Gadget` directive, you can check the command value from the payload against the `Command` enum like this:
if command in Command.SENTRY.value:
    # Perform sentry mode activation
In this example, `command` can be any string in the list `['guard', 'guard mode', 'sentry', 'sentry mode']`.
Now that you’re getting the directive, you need to put EV3RSTORM into sentry mode when the directive is received.
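To see how the enum matching and the sentry flag fit together, here is a minimal, self-contained sketch of the dispatch logic. The `GadgetSketch` class and its `handle_command` method are hypothetical stand-ins for `MindstormsGadget` and its control callback, stripped of the EV3-specific pose, LED, and motor code so the sketch runs anywhere:

```python
from enum import Enum

class Command(Enum):
    """Preset commands and their spoken variations (skill slot values)."""
    SENTRY = ['guard', 'guard mode', 'sentry', 'sentry mode']
    FIRE_ONE = ['cannon', '1 shot', 'one shot']
    FIRE_ALL = ['all shots', 'all shot']

class GadgetSketch:
    """Hypothetical stand-in for MindstormsGadget; only sentry bookkeeping."""

    def __init__(self):
        self.sentry_mode = False

    def handle_command(self, command):
        """Dispatch one decoded 'command' value from the control directive."""
        if command in Command.SENTRY.value:
            # The proximity thread watches this flag and starts polling
            self.sentry_mode = True
            return "sentry"
        if command in Command.FIRE_ONE.value:
            return "fire one"
        if command in Command.FIRE_ALL.value:
            return "fire all"
        return None  # unrecognized command; ignore
```

In the real `mission-04.py`, the branch for `Command.SENTRY` additionally moves EV3RSTORM into his guard pose and turns the LEDs yellow.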
For EV3RSTORM to detect an intruder, you can use the Infrared Sensor (EV3RSTORM’s eyes) for proximity detection. To implement proximity detection, you must connect the Infrared Sensor to an input port (port 1-4) on the bottom of the EV3 Brick. Then, import the `InfraredSensor` class from the `ev3dev2.sensor.lego` module. Learn more about the Infrared Sensor.
First, initialize the Infrared Sensor:
from ev3dev2.sensor.lego import InfraredSensor

class MindstormsGadget(AlexaGadget):

    def __init__(self):
        super().__init__()
        self.ir = InfraredSensor()
Once the Infrared Sensor is initialized, you can start receiving proximity values. The proximity detection code needs to run on a separate thread because it must poll the sensor repeatedly.
First, you'll create a new thread and set its target to the proximity detection method. Then, define the proximity reading function. The proximity detection code should run only while sentry mode is active, and it should send an event to the skill if an object is within 10% (about 7 cm) of the sensor:
import threading
import time

from ev3dev2.sensor.lego import InfraredSensor

class MindstormsGadget(AlexaGadget):

    def __init__(self):
        super().__init__()
        self.ir = InfraredSensor()
        # Start a new thread to run the proximity detection code
        threading.Thread(target=self._proximity_thread).start()

    def _proximity_thread(self):
        """
        Monitors the distance between the robot and an obstacle when sentry
        mode is activated. If the minimum distance is breached, sends a
        custom event to trigger an action in the Alexa skill.
        """
        count = 0
        # Without this infinite loop, the thread would have to be restarted
        # each time sentry mode is deactivated
        while True:
            while self.sentry_mode:
                distance = self.ir.proximity
                print("Proximity: {}".format(distance))
                # The breach distance is hardcoded to less than 10 and must
                # hold for more than 3 consecutive readings; these values can
                # be tuned to be more or less sensitive
                count = count + 1 if distance < 10 else 0
                if count > 3:
                    print("Proximity breached. Sending event to skill")
                    # Turn the LEDs red to indicate that a breach was detected
                    self.leds.set_color("LEFT", "RED", 1)
                    self.leds.set_color("RIGHT", "RED", 1)
                    # Send the detection event with the distance to the skill
                    self._send_event(EventName.PROXIMITY, {'distance': distance})
                    # For this mission, sentry mode requires re-activation
                    # once triggered
                    self.sentry_mode = False
                # Poll the sensor every 200 ms while sentry mode is active
                time.sleep(0.2)
            # Poll the sentry mode flag once every second
            time.sleep(1)
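The `count` variable in the code above acts as a debounce: the event fires only after more than three consecutive close readings (`count > 3`), which filters out single-sample sensor noise. Isolated as pure functions (hypothetical helpers, not part of `mission-04.py`), the logic is easy to test:

```python
def update_breach_count(count, distance, threshold=10):
    """Extend the streak while readings stay under the threshold; else reset."""
    return count + 1 if distance < threshold else 0

def is_breached(readings, threshold=10, required=4):
    """True once `required` consecutive readings fall under the threshold."""
    count = 0
    for distance in readings:
        count = update_breach_count(count, distance, threshold)
        if count >= required:
            return True
    return False
```

Raising `required` or lowering `threshold` makes detection less sensitive; this is the knob to turn if your EV3RSTORM cries intruder too eagerly.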
Now that EV3RSTORM can detect an intruder, you’re ready to run your Python code and try it with your Alexa skill!
Test EV3RSTORM with the Alexa skill
Now that you’ve created your skill and reviewed the Python code that will put EV3RSTORM into sentry mode and interact with your skill, you’re ready to test the experience. You will need to run the Python code before starting the skill:
1. Make sure VS Code is connected to your EV3 Brick. You should see a green dot next to your EV3 Brick’s name in the EV3DEV DEVICE BROWSER.
2. Copy the missions folder from your computer workspace to your EV3 Brick by clicking the Send workspace to device button, which appears next to the EV3DEV DEVICE BROWSER text when you hover over it.
When you click this button, you will see the status of the files being copied in the bottom-right of VS Code.
3. Once the files have been copied to your EV3 Brick, run the Python code by navigating to the Python file in the EV3DEV DEVICE BROWSER, right-clicking it, and selecting Run.
4. Once the program starts, you should see a prompt in the debug console when your EV3 Brick connects to your Echo device.
You will also see a message on your EV3 Brick's screen.
5. You may be prompted to provide the password for your EV3 Brick, which is `maker`.
Now comes the fun part! Make some room for EV3RSTORM, and try the following commands to see how EV3RSTORM reacts:
“Alexa, open mindstorms”
Alexa should respond, acknowledging that the skill session is active and listening for commands. The skill must be in session for the following commands to work.
“Alexa, activate sentry mode”
Alexa should acknowledge the command, while EV3RSTORM performs a shuffling pose and his LEDs turn yellow.
Intruder detection
While sentry mode is active (LEDs illuminated yellow), place an object in front of the Infrared Sensor for ~1 second. Alexa should indicate that an intruder is detected and prompt for an action.
Issue fire command
While Alexa is listening for your command, say “fire one shot” to make EV3RSTORM fire a single shot. Once the shot has been fired, Alexa should acknowledge that the threat has been eliminated.
You should also see messages printed to the debug console:
You can always stop the program by clicking the Stop button at the top of the screen in Debug mode, or pressing the back button on your EV3 Brick.
Note: If Alexa responds with something along the lines of “I don't understand,” check that the language your skill is using matches the language used on your Echo device (e.g., en-CA versus en-US).
Note: If you run into connectivity issues, you can try unpairing your EV3 Brick from your Echo device by following the Unpairing your EV3 Brick instructions in Mission 1. You may also want to try forgetting your EV3 Brick from the Bluetooth settings of your Echo device before pairing again.
Other things to explore
Congratulations, you have completed Mission 4! In this mission, you updated your voice interaction model and integrated it into an Alexa-hosted skill. Then, you leveraged the Alexa Gadgets Toolkit Custom Interface Controller to send commands that put EV3RSTORM into sentry mode, and to send custom events up to your skill to respond to.
To expand on what you did, you could try changing the sensitivity of the proximity detection, making Alexa respond in different ways, or bringing in the wake word or tempo features from previous missions. Check out the Alexa Gadgets Toolkit documentation, or explore other capabilities of ev3dev by referring to its documentation.
What to do next
You completed all four missions and learned how to make EV3RSTORM work with Alexa and voice commands! From here, you can build MINDSTORMS models of your own, add additional LEGO elements, and get your creations working with Alexa for submission to the LEGO MINDSTORMS Voice Challenge, powered by Alexa.