According to the National Institutes of Health, "Forgetfulness can be a normal part of aging." On top of that, the elderly often face physical decline as well. Together these lead to problems ranging from forgetting where things are to being physically drained by fetching basic items such as medicine or a cane.
Robotic care has already entered the senior market in the form of pet products; with similar hardware and cost, we can build robots that do more for the elderly. In this article we will show how to use a LEGO MINDSTORMS EV3 robot to help the elderly store and fetch items.
Step 1: Building the GRIPP3R
The GRIPP3R from MINDSTORMS EV3 seems to be the most suitable robot for this task, since it is built to grip things. The robot takes about 2-3 hours to build with the LEGO MINDSTORMS set, but only the first 68 pages of the instruction book are strictly necessary; everything after that is cosmetic. The full instructions are at https://www.lego.com/cdn/cs/set/assets/blt3cf20feda579587c/31313_GRIPP3R_2016.pdf
Once the robot is fully built, we first need to test the basics to make sure it functions correctly. Download MINDSTORMS Home from https://www.lego.com/en-us/themes/mindstorms/downloads and play around with the basic functions. Our first test checks whether what we are trying to do is even possible: below is the simple program we wrote to grab an object, turn around, and drive away.
When this runs you should see the image below. The USB cable is only needed for the computer to run the program; we won't need it in the final product. With that done, we can move on to the next step.
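For reference, the same grab, turn, and walk-away test can be sketched in ev3dev2-style Python. This is an approximation of the block program above, not a transcription; the one-second spin for a rough 180-degree turn is a guess that needs calibration on your surface. The sequence is kept in a plain function that only calls `on_for_rotations` and `on_for_seconds`, mirroring ev3dev2's `MediumMotor`/`MoveTank` interface, so it can be exercised off-robot with stand-in objects:

```python
def grab_turn_and_go(drive, grip, speed=50):
    """Close the grip, spin roughly 180 degrees in place, then drive away.

    drive.on_for_seconds(left, right, seconds) and
    grip.on_for_rotations(speed, rotations) are the only calls used, so
    any objects with those methods work (real motors or test doubles).
    """
    actions = []
    grip.on_for_rotations(speed, 1)          # clamp down on the object
    actions.append("grip")
    drive.on_for_seconds(speed, -speed, 1)   # tracks in opposite directions: turn in place
    actions.append("turn")
    drive.on_for_seconds(speed, speed, 3)    # drive off with the object
    actions.append("forward")
    return actions
```

On the brick you would pass in thin wrappers around `MoveTank(OUTPUT_B, OUTPUT_C)` and `MediumMotor(OUTPUT_A)` that convert the raw numbers to `SpeedPercent` values; those port letters match the GRIPP3R wiring used later in this guide.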
Step 2: Setting Up the Environment
Alexa Gadgets has written an entire setup guide on connecting Alexa with MINDSTORMS; visit https://www.hackster.io/alexagadgets for more information. We will only highlight the major parts in this step, so it is highly recommended you go through that guide to get the environment set up.
Setup GRIPP3R side
In this step we set up the LEGO MINDSTORMS environment with Visual Studio Code and the ev3dev environment. There are a few things to understand. The ev3dev image can be downloaded from
https://www.ev3dev.org/downloads/
You can use Etcher to flash the SD card; it can be downloaded via
https://www.techspot.com/downloads/6931-etcher.html
The USB port and SD card slot are located on the left of the brick when it is face down, or on the left side of the robot when it faces you. This took a little time to figure out.
We can now insert the SD card, boot up the EV3, and set up the WiFi network from the ev3dev menu.
This lets us connect Visual Studio Code directly to the EV3 instead of using a USB cable. Once connected, note the IP address, as we may need it later, and make sure your computer is on the same network as the EV3 brick.
Setup on Mac/PC side
We also need Microsoft VS Code, which can be downloaded at https://code.visualstudio.com/. Open the alexa-gadgets-mindstorms
folder to install the ev3dev-browser extension as described in the Environment Setup Guide by Alexa Gadgets.
Once you click Show Recommendations, the extension should show up on the left side as below.
We can start a new workspace called gripp3r helper and create a few empty files, or just download my code from the source. Now for the GRIPP3R side: we can connect over WiFi instead of programming via USB. Upon clicking "Click here to connect to a device", ev3dev should show up; if not, you can enter the IP manually via "I don't see my device".
Click the download button in the browser to test that it works. If it does, everything will be loaded onto the EV3 brick and the environment setup is complete.
Since we are going to use Bluetooth to control the unit, let's enable Bluetooth here as well.
Step 3: Pairing the Brick with Alexa
This step closely follows Alexa Gadgets' Mission 1; I'd recommend going through that guide to get your brick ready to respond to Alexa's calls. We will summarize it here.
We first need an Amazon developer account: https://developer.amazon.com. Sign up for a developer account if you don't already have one, then create a product in the Alexa Voice Service.
To keep things simple, let's create a product with the same settings as Mission 1 from Alexa Gadgets.
- Name: MINDSTORMS EV3
- Product ID: EV3_01
- Product type: Alexa Gadget
- Product category: Animatronic or Figure
- Product description: Whatever you like
- Skip upload image
- Do you intend to distribute this product commercially?: No
- Is this a children’s product or is it otherwise directed to children younger than 13 years old?: No
Once the product is created we will have an Amazon ID and gadget secret.
We will use Mission 1 as a test bed; to do so, just replace the ID and secret with your own. Make sure the values are not wrapped in quotes.
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET
[GadgetCapabilities]
Alexa.Gadget.StateListener = 1.0 - wakeword
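A common tripping point is leaving quotes around the credential values. A quick way to catch that before pairing fails is to parse the .ini with Python's configparser and inspect the values. This is a sketch; the file name in the `__main__` block is illustrative:

```python
import configparser

def check_gadget_ini(path):
    """Return a list of problems found in a gadget .ini file."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    problems = []
    for section in ("GadgetSettings",):
        if not cfg.has_section(section):
            problems.append("missing section: " + section)
            continue
        for key, value in cfg.items(section):
            # configparser keeps surrounding quotes as part of the value,
            # which would then be sent to Amazon verbatim and fail auth.
            if value.startswith(('"', "'")) or value.endswith(('"', "'")):
                problems.append("quoted value for " + key)
            if not value:
                problems.append("empty value for " + key)
    return problems

if __name__ == "__main__":
    print(check_gadget_ini("mission-01.ini"))  # hypothetical file name
```

An empty list means the credentials at least have the right shape; it obviously cannot verify they are the correct ID and secret.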
We can upload the code as in the previous step; a full explanation of the code is in Mission 1 itself.
To execute the code, we will do something a bit different from the Alexa Gadgets guide: launching from VS Code can take a while, and sometimes it hangs with no indication why. Instead, we can run the code manually from a shell on the brick.
After that we can execute our files just like on any other Linux machine.
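For example, assuming the default ev3dev credentials (user robot, password maker) and that the project was downloaded to the brick in the previous step; your IP address and paths will differ:

```shell
# SSH into the brick over the WiFi connection set up earlier
ssh robot@192.168.1.42        # password: maker (ev3dev default)

# On the brick: go to the uploaded project and run it directly,
# so any Python errors print straight to this terminal
cd ~/mission-01
chmod +x mission-01.py
./mission-01.py               # or: python3 mission-01.py
```

Running it this way means a crash shows its traceback immediately instead of leaving you waiting.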
At this point we can activate the Alexa wake word with the GRIPP3R itself.
Step 4: Building the Alexa App
This step closely summarizes Alexa Gadgets' Mission 3 and Mission 4, as we need to build out an Alexa app for the basic functions.
We need to create a new skill in the Alexa console, https://developer.amazon.com/alexa/console/ask; for consistency's sake let's name it mindstorms, just like the missions.
We will use a custom skill with an Alexa-hosted Node.js backend. This keeps things consistent if you are following the missions, and makes it easier to write different programs.
We can upload the JSON file from Mission 4 and build from there; just don't fire any cannon, as the GRIPP3R motor isn't supposed to handle that. Once the JSON is copied and pasted, click Build Model to continue.
This gives us the following building blocks, which we will use later:
- Intent names (MoveIntent, SetSpeedIntent, SetCommandIntent)
- Slot names (Direction, Duration, Speed, Command)
- Slot values ("forward", "backward", "1", "2", "100")
Once these are built, upload all of Mission 4's files into Lambda on the Alexa Code screen. A detailed code explanation is in the mission itself.
After that, go to the Test tab and set skill testing to "Development".
As for code, you can either upload my code base or follow the Mission 4 code below for testing right now.
#!/usr/bin/env python3
# Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# You may not use this file except in compliance with the terms and conditions
# set forth in the accompanying LICENSE.TXT file.
#
# THESE MATERIALS ARE PROVIDED ON AN "AS IS" BASIS. AMAZON SPECIFICALLY DISCLAIMS, WITH
# RESPECT TO THESE MATERIALS, ALL WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, INCLUDING
# THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT.

import os
import sys
import time
import logging
import json
import random
import threading

from enum import Enum
from agt import AlexaGadget

from ev3dev2.led import Leds
from ev3dev2.sound import Sound
from ev3dev2.motor import OUTPUT_A, OUTPUT_B, OUTPUT_C, MoveTank, SpeedPercent, MediumMotor
from ev3dev2.sensor.lego import InfraredSensor

# Set the logging level to INFO to see messages from AlexaGadget
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(message)s')
logging.getLogger().addHandler(logging.StreamHandler(sys.stderr))
logger = logging.getLogger(__name__)


class Direction(Enum):
    """
    The list of directional commands and their variations.
    These variations correspond to the skill slot values.
    """
    FORWARD = ['forward', 'forwards', 'go forward']
    BACKWARD = ['back', 'backward', 'backwards', 'go backward']
    LEFT = ['left', 'go left']
    RIGHT = ['right', 'go right']
    STOP = ['stop', 'brake', 'halt']


class Command(Enum):
    """
    The list of preset commands and their invocation variation.
    These variations correspond to the skill slot values.
    """
    MOVE_CIRCLE = ['circle', 'move around']
    MOVE_SQUARE = ['square']
    SENTRY = ['guard', 'guard mode', 'sentry', 'sentry mode']
    PATROL = ['patrol', 'patrol mode']
    FIRE_ONE = ['cannon', '1 shot', 'one shot']
    FIRE_ALL = ['all shots', 'all shot']


class EventName(Enum):
    """
    The list of custom event names sent from this gadget
    """
    SENTRY = "Sentry"
    PROXIMITY = "Proximity"
    SPEECH = "Speech"


class MindstormsGadget(AlexaGadget):
    """
    A Mindstorms gadget that can perform bi-directional interaction with an Alexa skill.
    """

    def __init__(self):
        """
        Performs Alexa Gadget initialization routines and ev3dev resource allocation.
        """
        super().__init__()

        # Robot state
        self.sentry_mode = False
        self.patrol_mode = False

        # Connect two large motors on output ports B and C
        self.drive = MoveTank(OUTPUT_B, OUTPUT_C)
        self.weapon = MediumMotor(OUTPUT_A)
        self.sound = Sound()
        self.leds = Leds()
        self.ir = InfraredSensor()

        # Start threads
        threading.Thread(target=self._patrol_thread, daemon=True).start()
        threading.Thread(target=self._proximity_thread, daemon=True).start()

    def on_connected(self, device_addr):
        """
        Gadget connected to the paired Echo device.
        :param device_addr: the address of the device we connected to
        """
        self.leds.set_color("LEFT", "GREEN")
        self.leds.set_color("RIGHT", "GREEN")
        logger.info("{} connected to Echo device".format(self.friendly_name))

    def on_disconnected(self, device_addr):
        """
        Gadget disconnected from the paired Echo device.
        :param device_addr: the address of the device we disconnected from
        """
        self.leds.set_color("LEFT", "BLACK")
        self.leds.set_color("RIGHT", "BLACK")
        logger.info("{} disconnected from Echo device".format(self.friendly_name))

    def on_custom_mindstorms_gadget_control(self, directive):
        """
        Handles the Custom.Mindstorms.Gadget control directive.
        :param directive: the custom directive with the matching namespace and name
        """
        try:
            payload = json.loads(directive.payload.decode("utf-8"))
            print("Control payload: {}".format(payload), file=sys.stderr)
            control_type = payload["type"]
            if control_type == "move":
                # Expected params: [direction, duration, speed]
                self._move(payload["direction"], int(payload["duration"]), int(payload["speed"]))

            if control_type == "command":
                # Expected params: [command]
                self._activate(payload["command"])

        except KeyError:
            print("Missing expected parameters: {}".format(directive), file=sys.stderr)

    def _move(self, direction, duration: int, speed: int, is_blocking=False):
        """
        Handles move commands from the directive.
        Right and left movement can under or over turn depending on the surface type.
        :param direction: the move direction
        :param duration: the duration in seconds
        :param speed: the speed percentage as an integer
        :param is_blocking: if set, motor run until duration expired before accepting another command
        """
        print("Move command: ({}, {}, {}, {})".format(direction, speed, duration, is_blocking), file=sys.stderr)
        if direction in Direction.FORWARD.value:
            self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(speed), duration, block=is_blocking)

        if direction in Direction.BACKWARD.value:
            self.drive.on_for_seconds(SpeedPercent(-speed), SpeedPercent(-speed), duration, block=is_blocking)

        if direction in (Direction.RIGHT.value + Direction.LEFT.value):
            self._turn(direction, speed)
            self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(speed), duration, block=is_blocking)

        if direction in Direction.STOP.value:
            self.drive.off()
            self.patrol_mode = False

    def _activate(self, command, speed=50):
        """
        Handles preset commands.
        :param command: the preset command
        :param speed: the speed if applicable
        """
        print("Activate command: ({}, {})".format(command, speed), file=sys.stderr)
        if command in Command.MOVE_CIRCLE.value:
            self.drive.on_for_seconds(SpeedPercent(int(speed)), SpeedPercent(5), 12)

        if command in Command.MOVE_SQUARE.value:
            for i in range(4):
                self._move("right", 2, speed, is_blocking=True)

        if command in Command.PATROL.value:
            # Set patrol mode to resume patrol thread processing
            self.patrol_mode = True

        if command in Command.SENTRY.value:
            self.sentry_mode = True
            self._send_event(EventName.SPEECH, {'speechOut': "Sentry mode activated"})

            # Perform Shuffle posture
            self.drive.on_for_seconds(SpeedPercent(80), SpeedPercent(-80), 0.2)
            time.sleep(0.3)
            self.drive.on_for_seconds(SpeedPercent(-40), SpeedPercent(40), 0.2)

            self.leds.set_color("LEFT", "YELLOW", 1)
            self.leds.set_color("RIGHT", "YELLOW", 1)

        if command in Command.FIRE_ONE.value:
            print("Fire one", file=sys.stderr)
            self.weapon.on_for_rotations(SpeedPercent(100), 3)
            self._send_event(EventName.SENTRY, {'fire': 1})
            self.sentry_mode = False
            print("Sent sentry event - 1 shot, alarm reset", file=sys.stderr)
            self.leds.set_color("LEFT", "GREEN", 1)
            self.leds.set_color("RIGHT", "GREEN", 1)

        if command in Command.FIRE_ALL.value:
            print("Fire all", file=sys.stderr)
            self.weapon.on_for_rotations(SpeedPercent(100), 10)
            self._send_event(EventName.SENTRY, {'fire': 3})
            self.sentry_mode = False
            print("sent sentry event - 3 shots, alarm reset", file=sys.stderr)
            self.leds.set_color("LEFT", "GREEN", 1)
            self.leds.set_color("RIGHT", "GREEN", 1)

    def _turn(self, direction, speed):
        """
        Turns based on the specified direction and speed.
        Calibrated for hard smooth surface.
        :param direction: the turn direction
        :param speed: the turn speed
        """
        if direction in Direction.LEFT.value:
            self.drive.on_for_seconds(SpeedPercent(0), SpeedPercent(speed), 2)

        if direction in Direction.RIGHT.value:
            self.drive.on_for_seconds(SpeedPercent(speed), SpeedPercent(0), 2)

    def _send_event(self, name: EventName, payload):
        """
        Sends a custom event to trigger a sentry action.
        :param name: the name of the custom event
        :param payload: the sentry JSON payload
        """
        self.send_custom_event('Custom.Mindstorms.Gadget', name.value, payload)

    def _proximity_thread(self):
        """
        Monitors the distance between the robot and an obstacle when sentry mode is activated.
        If the minimum distance is breached, send a custom event to trigger action on
        the Alexa skill.
        """
        count = 0
        while True:
            while self.sentry_mode:
                distance = self.ir.proximity
                print("Proximity: {}".format(distance), file=sys.stderr)
                count = count + 1 if distance < 10 else 0
                if count > 3:
                    print("Proximity breached. Sending event to skill", file=sys.stderr)
                    self.leds.set_color("LEFT", "RED", 1)
                    self.leds.set_color("RIGHT", "RED", 1)

                    self._send_event(EventName.PROXIMITY, {'distance': distance})
                    self.sentry_mode = False

                time.sleep(0.2)
            time.sleep(1)

    def _patrol_thread(self):
        """
        Performs random movement when patrol mode is activated.
        """
        while True:
            while self.patrol_mode:
                print("Patrol mode activated randomly picks a path", file=sys.stderr)
                direction = random.choice(list(Direction))
                duration = random.randint(1, 5)
                speed = random.randint(1, 4) * 25

                while direction == Direction.STOP:
                    direction = random.choice(list(Direction))

                # direction: all except stop, duration: 1-5s, speed: 25, 50, 75, 100
                self._move(direction.value[0], duration, speed)
                time.sleep(duration)
            time.sleep(1)


if __name__ == '__main__':
    gadget = MindstormsGadget()

    # Set LCD font and turn off blinking LEDs
    os.system('setfont Lat7-Terminus12x6')
    gadget.leds.set_color("LEFT", "BLACK")
    gadget.leds.set_color("RIGHT", "BLACK")

    # Startup sequence
    gadget.sound.play_song((('C4', 'e'), ('D4', 'e'), ('E5', 'q')))
    gadget.leds.set_color("LEFT", "GREEN")
    gadget.leds.set_color("RIGHT", "GREEN")

    # Gadget main entry point
    gadget.main()

    # Shutdown sequence
    gadget.sound.play_song((('E5', 'e'), ('C4', 'e')))
    gadget.leds.set_color("LEFT", "BLACK")
    gadget.leds.set_color("RIGHT", "BLACK")
And the ini file
[GadgetSettings]
amazonId = amazonid
alexaGadgetSecret = secret
[GadgetCapabilities]
Custom.Mindstorms.Gadget = 1.0
This gives us the ability to control the robot. We run it through the same shell commands as before; when done, you should be able to control the robot by voice.
Step 5: Setting Up the Gripper Bot Alexa Side
Now it's time to write the robot logic. To complete this project we need the robot to do three things: come, take an item, and fetch an item, in order from easiest to hardest. These translate to ComeIntent, TakeIntent, and BringIntent.
In this step we will be changing the entire project, so the previous missions are no longer needed. We can use the JSON file below. One important change is "invocationName": "gripper bot", so we can call up gripper bot in Alexa.
{
    "interactionModel": {
        "languageModel": {
            "invocationName": "gripper bot",
            "intents": [
                {
                    "name": "AMAZON.CancelIntent",
                    "samples": []
                },
                {
                    "name": "AMAZON.HelpIntent",
                    "samples": []
                },
                {
                    "name": "AMAZON.StopIntent",
                    "samples": []
                },
                {
                    "name": "HelloWorldIntent",
                    "slots": [],
                    "samples": [
                        "hello",
                        "how are you",
                        "say hi world",
                        "say hi",
                        "hi",
                        "say hello world",
                        "say hello"
                    ]
                },
                {
                    "name": "ComeIntent",
                    "slots": [],
                    "samples": [
                        "Come",
                        "Come to me",
                        "Come here"
                    ]
                },
                {
                    "name": "TakeIntent",
                    "slots": [
                        {
                            "name": "Item",
                            "type": "ItemType"
                        }
                    ],
                    "samples": [
                        "Take this",
                        "Take it",
                        "Take this item",
                        "Take this {Item}",
                        "Take the {Item}"
                    ]
                },
                {
                    "name": "BringIntent",
                    "slots": [
                        {
                            "name": "Item",
                            "type": "ItemType"
                        }
                    ],
                    "samples": [
                        "Bring me the stuff",
                        "Bring me my item",
                        "Fetch my stuff",
                        "Bring me my {Item}"
                    ]
                },
                {
                    "name": "AMAZON.NavigateHomeIntent",
                    "samples": []
                }
            ],
            "types": [
                {
                    "name": "ItemType",
                    "values": [
                        {
                            "name": {
                                "value": "bottle"
                            }
                        },
                        {
                            "name": {
                                "value": "cane"
                            }
                        },
                        {
                            "name": {
                                "value": "medicine"
                            }
                        },
                        {
                            "name": {
                                "value": "cup"
                            }
                        }
                    ]
                }
            ]
        }
    }
}
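Before pasting a model like this into the console, a quick offline sanity check can save a failed build: Alexa requires the invocation name to be lowercase, and every custom slot type an intent references must be defined under "types". This is a sketch of such a checker (the `__main__` file name is illustrative), not part of the Alexa tooling:

```python
import re

def check_model(model):
    """Return a list of problems with an Alexa interaction-model dict."""
    lm = model["interactionModel"]["languageModel"]
    problems = []
    name = lm["invocationName"]
    # Custom-skill invocation names must be lowercase.
    if name != name.lower():
        problems.append("invocationName must be lowercase")
    defined = {t["name"] for t in lm.get("types", [])}
    for intent in lm["intents"]:
        slot_names = {s["name"] for s in intent.get("slots", [])}
        for slot in intent.get("slots", []):
            # Built-in AMAZON.* types don't need a local definition.
            if not slot["type"].startswith("AMAZON.") and slot["type"] not in defined:
                problems.append("undefined slot type: " + slot["type"])
        for sample in intent.get("samples", []):
            # {Item}-style references must point at a declared slot.
            for ref in re.findall(r"{(\w+)}", sample):
                if ref not in slot_names:
                    problems.append("sample references unknown slot: " + ref)
    return problems

if __name__ == "__main__":
    import json
    with open("model.json") as f:  # hypothetical file name
        print(check_model(json.load(f)))
```

Run against the JSON above it should report no problems; it is not a full validator, just a guard against the most common copy-paste mistakes.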
As for index.js in Lambda, we add three additional intent handlers on top of Mission 4; this is important, as these are what pass the intents on to the gripper bot itself.
// Construct and send a custom directive to the connected gadget with data from
// the ComeIntent.
const ComeIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'ComeIntent';
    },
    handle: function (handlerInput) {
        const attributesManager = handlerInput.attributesManager;
        let endpointId = attributesManager.getSessionAttributes().endpointId || [];
        let speed = "50";

        // Construct the directive with the payload containing the move parameters
        let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
            {
                type: 'come',
                command: 'come',
                speed: speed
            });

        let speechOutput = 'Gripper Bot is coming to you';
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC)
            .addDirective(directive)
            .getResponse();
    }
};

// Construct and send a custom directive to the connected gadget with data from
// the BringIntent.
const BringIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'BringIntent';
    },
    handle: function (handlerInput) {
        let command = Alexa.getSlotValue(handlerInput.requestEnvelope, 'Item');
        if (command === "" || command === null) {
            command = "it to you";
        } else {
            command = command + " to you";
        }

        const attributesManager = handlerInput.attributesManager;
        let endpointId = attributesManager.getSessionAttributes().endpointId || [];
        let speed = attributesManager.getSessionAttributes().speed || "50";

        // Construct the directive with the payload containing the move parameters
        let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
            {
                type: 'bring',
                command: command,
                speed: speed
            });

        let speechOutput = 'bringing ' + command;
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC)
            .addDirective(directive)
            .getResponse();
    }
};

// Construct and send a custom directive to the connected gadget with data from
// the TakeIntent.
const TakeIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'TakeIntent';
    },
    handle: function (handlerInput) {
        let command = Alexa.getSlotValue(handlerInput.requestEnvelope, 'Item');
        if (command === "" || command === null) {
            command = "the item";
        }

        const attributesManager = handlerInput.attributesManager;
        let endpointId = attributesManager.getSessionAttributes().endpointId || [];
        let speed = attributesManager.getSessionAttributes().speed || "50";

        // Construct the directive with the payload containing the move parameters
        let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
            {
                type: 'take',
                command: command,
                speed: speed
            });

        let speechOutput = 'taking ' + command;
        return handlerInput.responseBuilder
            .speak(speechOutput + BG_MUSIC)
            .addDirective(directive)
            .getResponse();
    }
};

// The SkillBuilder acts as the entry point for your skill, routing all request and response
// payloads to the handlers above. Make sure any new handlers or interceptors you've
// defined are included below. The order matters - they're processed top to bottom.
exports.handler = Alexa.SkillBuilders.custom()
    .addRequestHandlers(
        LaunchRequestHandler,
        ComeIntentHandler,
        TakeIntentHandler,
        BringIntentHandler,
        EventsReceivedRequestHandler,
        ExpiredRequestHandler,
        Common.HelpIntentHandler,
        Common.CancelAndStopIntentHandler,
        Common.SessionEndedRequestHandler,
        Common.IntentReflectorHandler, // make sure IntentReflectorHandler is last so it doesn't override your custom intent handlers
    )
    .addRequestInterceptors(Common.RequestInterceptor)
    .addErrorHandlers(
        Common.ErrorHandler,
    )
    .lambda();
The full code can be seen in the code upload section.
Step 6: Writing the GRIPP3R Helper Robot Logic
We need a gyro sensor, which does not come with the retail kit (it is included in the Education Core Set if you have that one). We could do without it, but timing a 180-degree turn without one is fairly inaccurate.
from ev3dev2.sensor.lego import GyroSensor
self.gyro = GyroSensor()
There are four major pieces of code to review here; the first is come, and the code below is quite simple.
When ComeIntent tells the robot to come to us, it launches _come to move forward. When the IR sensor detects anything under 55, the robot stops; once stopped, it opens the grip, waiting for an item.
def on_custom_mindstorms_gadget_control(self, directive):
    """
    Handles the Custom.Mindstorms.Gadget control directive.
    :param directive: the custom directive with the matching namespace and name
    """
    try:
        payload = json.loads(directive.payload.decode("utf-8"))
        print("Control payload: {}".format(payload), file=sys.stderr)
        control_type = payload["type"]
        if control_type == "move":
            # Expected params: [direction, duration, speed]
            self._move(payload["direction"], int(payload["duration"]), int(payload["speed"]))
        elif control_type == "come":
            self._come()
        elif control_type == "take":
            self._take()
        elif control_type == "bring":
            self._bring()
    except KeyError:
        print("Missing expected parameters: {}".format(directive), file=sys.stderr)

def _come(self, duration=10, speed=50):
    self.leds.set_color("LEFT", "GREEN", 1)
    self.leds.set_color("RIGHT", "GREEN", 1)
    self.isComing = True
    self._move(Direction.FORWARD.value[0], duration, speed)

def _proximity_thread(self):
    """
    Monitors the distance between the robot and an obstacle.
    (Excerpt: inside the polling loop, where `distance = self.ir.proximity`.)
    """
    if distance <= 55:
        # When the bot is coming, it stops and opens up its arm
        if self.isComing == True:
            self.isComing = False
            print("Proximity breached, stopping")
            self._move(Direction.STOP.value[0], 0, 0)
            self.leds.set_color("LEFT", "RED", 1)
            self.leds.set_color("RIGHT", "RED", 1)
            while not self.touch.is_pressed:
                self.grip.on_for_degrees(SpeedPercent(10), -90)
            print("Lowered the grip")
The next two parts require turning. In the same infrared-distance thread, we also detect turning: we move the motors extremely slowly so the measured angle stays accurate, and whenever it reaches 180 degrees the bot stops and continues to the next state.
# Kicking off a 180-degree turn: reset the gyro, then spin slowly in place
self.gyro.reset()
self.isTurning = True
self.gyro.mode = 'GYRO-RATE'   # flipping modes forces the gyro to recalibrate
self.gyro.mode = 'GYRO-ANG'
self.drive.on_for_seconds(SpeedPercent(4), SpeedPercent(-4), 40, block=False)

# Inside the proximity thread: watch the gyro until the half-turn completes
angle = self.gyro.angle
if self.isTurning == True:
    print("angle: {}".format(angle), file=sys.stderr)
    self.leds.set_color("LEFT", "YELLOW", 1)
    self.leds.set_color("RIGHT", "YELLOW", 1)
    if abs(angle) >= 179:   # abs() is never negative, so one bound covers both turn directions
        self.isTurning = False
        self._move(Direction.STOP.value[0], 0, 0)
        self.gyro.reset()
        self.gyro.mode = 'GYRO-RATE'
        self.gyro.mode = 'GYRO-ANG'
        self.leds.set_color("LEFT", "GREEN", 1)
        self.leds.set_color("RIGHT", "GREEN", 1)
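The stopping condition deserves a second look, because the original snippet also checked `abs(angle) <= -181`, which can never be true (abs() never returns a negative number). The intended behavior, stop once the gyro is within a degree of a half-turn in either direction, can be pulled out into a tiny helper and tested off-robot; `turn_complete` and `stop_angle` below are illustrative names, not part of the robot code:

```python
def turn_complete(angle, target=180, tolerance=1):
    """True once the gyro reading is within `tolerance` degrees of a
    half-turn in either direction (e.g. 179 or -179 with defaults)."""
    return abs(angle) >= target - tolerance

def stop_angle(readings):
    """Sweep a sequence of simulated gyro readings and return the first
    one at which the robot would stop, or None if it never would."""
    for a in readings:
        if turn_complete(a):
            return a
    return None
```

Testing the predicate this way catches sign mistakes long before you burn time watching the robot overshoot on the carpet.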
When TakeIntent is received from the server, we hit the _take function, which grabs the item, turns 180 degrees, then heads to the base. When the IR sensor triggers, the robot drops the item, backs off for a second, and turns 180 degrees back.
def on_custom_mindstorms_gadget_control(self, directive):
    """
    Handles the Custom.Mindstorms.Gadget control directive.
    :param directive: the custom directive with the matching namespace and name
    """
    try:
        payload = json.loads(directive.payload.decode("utf-8"))
        print("Control payload: {}".format(payload), file=sys.stderr)
        control_type = payload["type"]
        if control_type == "move":
            # Expected params: [direction, duration, speed]
            self._move(payload["direction"], int(payload["duration"]), int(payload["speed"]))
        elif control_type == "come":
            self._come()
        elif control_type == "take":
            self._take()
    except KeyError:
        print("Missing expected parameters: {}".format(directive), file=sys.stderr)

def _take(self, duration=10, speed=50):
    self.grip.on_for_rotations(SpeedPercent(100), 1)   # grab the item
    self.leds.set_color("LEFT", "GREEN", 1)
    self.leds.set_color("RIGHT", "GREEN", 1)
    self.gyro.reset()
    self.gyro.mode = 'GYRO-RATE'
    self.gyro.mode = 'GYRO-ANG'
    self.isTurning = True
    self.drive.on_for_seconds(SpeedPercent(4), SpeedPercent(-4), 40)   # 180-degree turn
    self.isTaking = True
    self.drive.on_for_seconds(SpeedPercent(50), SpeedPercent(50), duration)   # head to the base

# Excerpt: inside _proximity_thread, after the isComing branch:
elif self.isTaking == True:
    self.isTaking = False
    print("Proximity breached, stopping")
    self._move(Direction.STOP.value[0], 0, 0)
    self.leds.set_color("LEFT", "RED", 1)
    self.leds.set_color("RIGHT", "RED", 1)
    while not self.touch.is_pressed:
        self.grip.on_for_degrees(SpeedPercent(10), -90)
    print("Dropping Item")
    self.drive.on_for_seconds(SpeedPercent(-50), SpeedPercent(-50), 1)   # back off
    self.gyro.reset()
    self.isTurning = True
    self.gyro.mode = 'GYRO-RATE'
    self.gyro.mode = 'GYRO-ANG'
    self.drive.on_for_seconds(SpeedPercent(4), SpeedPercent(-4), 40, block=False)   # turn back
    self.leds.set_color("LEFT", "GREEN", 1)
    self.leds.set_color("RIGHT", "GREEN", 1)
The BringIntent is the most complicated one. Once the bring control is passed to the robot, we call the _bring function: it turns 180 degrees, drives for one second, and grips the item. Then it turns 180 degrees again and comes to us; once it detects us through the infrared sensor, it drops the item and drives back for the same amount of time it drove toward us.
def on_custom_mindstorms_gadget_control(self, directive):
    """
    Handles the Custom.Mindstorms.Gadget control directive.
    :param directive: the custom directive with the matching namespace and name
    """
    try:
        payload = json.loads(directive.payload.decode("utf-8"))
        print("Control payload: {}".format(payload), file=sys.stderr)
        control_type = payload["type"]
        if control_type == "move":
            # Expected params: [direction, duration, speed]
            self._move(payload["direction"], int(payload["duration"]), int(payload["speed"]))
        elif control_type == "come":
            self._come()
        elif control_type == "take":
            self._take()
        elif control_type == "bring":
            self._bring()
    except KeyError:
        print("Missing expected parameters: {}".format(directive), file=sys.stderr)

def _bring(self, duration=10):
    self.isTurning = True
    self.gyro.mode = 'GYRO-RATE'
    self.gyro.mode = 'GYRO-ANG'
    self.drive.on_for_seconds(SpeedPercent(4), SpeedPercent(-4), 40)   # turn toward the item
    self.drive.on_for_seconds(SpeedPercent(50), SpeedPercent(50), 1)   # walk up to it
    self.grip.on_for_rotations(SpeedPercent(100), 1)                   # grab it
    self.isTurning = True
    self.gyro.mode = 'GYRO-RATE'
    self.gyro.mode = 'GYRO-ANG'
    self.drive.on_for_seconds(SpeedPercent(4), SpeedPercent(-4), 40)   # turn back toward the user
    self.isBringing = True
    self.now = time.time()   # remember when we started driving toward the user
    self.drive.on_for_seconds(SpeedPercent(50), SpeedPercent(50), duration)
    self.leds.set_color("LEFT", "GREEN", 1)
    self.leds.set_color("RIGHT", "GREEN", 1)

def _proximity_thread(self):
    # Excerpt: inside the polling loop, within `if distance <= 55:`,
    # after the isComing and isTaking branches:
    elif self.isBringing == True:
        self.isBringing = False
        print("Proximity breached, stopping")
        self._move(Direction.STOP.value[0], 0, 0)
        self.later = time.time()
        self.leds.set_color("LEFT", "RED", 1)
        self.leds.set_color("RIGHT", "RED", 1)
        while not self.touch.is_pressed:
            self.grip.on_for_degrees(SpeedPercent(10), -90)
        print("Dropping Item")
        difference = int(self.later - self.now)   # drive back for as long as we drove out
        self.drive.on_for_seconds(SpeedPercent(-50), SpeedPercent(-50), difference)
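The drive-back-as-long-as-we-drove-out trick hinges on that `later - now` timestamp difference, which is easy to unit-test as a pure function. This sketch adds a non-negative clamp the snippet above does not have (a guard against clock hiccups, labeled here as an addition), and `return_duration` is an illustrative name:

```python
def return_duration(start, stop):
    """Seconds to drive back, given the timestamp recorded when the
    bring run started and the one recorded when the proximity sensor
    stopped it. Truncated to whole seconds; clamped so a clock hiccup
    can never produce a negative drive time."""
    return max(0, int(stop - start))
```

With real `time.time()` values this reproduces the robot's behavior: a 3.7-second approach becomes a 3-second return trip.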
Step 7: Demo!
With these three functions we can essentially let the robot store an item for us and fetch it when we need it. The robot can store anything from medicine bottles to a cane. These functions are especially helpful for elderly people with mobility problems.
We built all this for under $300, and you can definitely follow this guide and finish in under 20 hours. This is an open-source project, and we can keep improving these features to help the elderly in the future.