Hide and seek is a popular game in our house but we have limited locations to hide. The goal of this project is to allow a player (or multiple players) to both hide and seek with Heidi the robot.
Alexa is critical to the game play as it allows natural communication with the robot (instead of via a screen) and lets the hider and seeker communicate without knowing each other's location. Playing the game feels intuitive because the implementation concentrates on the flow of the game and adds many features to support natural conversation.
Beyond the basic game play, there are extra features that enhance the game by customising settings and providing commands to help players find each other.
Video

The video is at:
Please turn the sound up to hear some of the Alexa commands, as the audio quality was reduced during compression.
The video shows the robot hiding, the robot seeking and some of the settings. Please see the below details for all the features.
Basic game play

Player: “Alexa play Hide and Seek”
Alexa: “Would you like to hide or seek?”
Player: “Seek”
Alexa: “Close your eyes and count to 10”
Player: "1 2 3 4 5 6 7 8 9 10"
Alexa: "Now say ready or not here I come and then go find Heidi"
As the player is counting, Heidi the robot will go and hide, using its light sensor and distance sensor to move around and find a place to hide. If it finds somewhere dark it will stop; otherwise it will stop when the player gets to “ready or not here I come”. Alexa will validate that the player is counting, but like the normal game of hide and seek, you need to trust that the seeker is closing their eyes.
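The hiding behaviour on the brick can be sketched roughly as below. This is a minimal illustration assuming an ev3dev2 build; the port letters, thresholds and speeds are assumptions for illustration, not the project's exact values.

#!/usr/bin/env python3
# Rough sketch of the hide loop: wander, avoid obstacles, stop somewhere dark.
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor, UltrasonicSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)   # left motor in B, right motor in C
eyes = UltrasonicSensor()
light = ColorSensor()

DARK_ENOUGH = 5      # ambient light % below which Heidi treats herself as hidden (assumed)
TOO_CLOSE_CM = 15    # obstacle distance at which she turns away (assumed)

def hide_step():
    # One pass of the hide loop; returns True once a dark spot is found.
    if light.ambient_light_intensity < DARK_ENOUGH:
        tank.off()
        return True
    if eyes.distance_centimeters < TOO_CLOSE_CM:
        # Spin away from the obstacle.
        tank.on_for_seconds(left_speed=25, right_speed=-25, seconds=0.7)
    else:
        tank.on(left_speed=30, right_speed=30)
    return False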
If the player is having a hard time finding the robot, they can say “Where are you?” and the robot will make a random sound (e.g. a giggle or a sneeze). The robot will also make a random sound every minute of searching, using the Alexa interface to get the time since launch.
If the player is still having a really hard time finding the robot, they can say “Make some noise” and the robot will beep. “Make some noise X times” will make the robot beep that many times. There is validation to ensure this is less than 10 times. Saying “Beep” will also trigger the command.
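A minimal sketch of the brick-side beep handling, assuming ev3dev2 and that the skill sends the already-validated count down in a custom directive payload; the cap here mirrors the validation but is an assumed value:

from ev3dev2.sound import Sound

speaker = Sound()

def make_noise(count=1):
    # The skill validates the number, but guard on the brick as well (assumed cap).
    for _ in range(min(int(count), 10)):
        speaker.beep()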
When the player finds the robot, they place their hand in front of its eyes; Alexa will then say that you have found Heidi and reset the game, asking whether you want to hide or seek again.
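The "found" check can be sketched as below, assuming the ultrasonic sensor acts as the eyes and the gadget class extends AlexaGadget from the agt package as in the missions; the event namespace, name and payload are illustrative rather than the project's exact protocol.

from ev3dev2.sensor.lego import UltrasonicSensor

HAND_DISTANCE_CM = 6   # a hand held over the eyes reads as a very short distance (assumed)

def check_found(gadget, eyes: UltrasonicSensor):
    if eyes.distance_centimeters < HAND_DISTANCE_CM:
        # Tell the skill that the player has found Heidi so it can reset the game.
        gadget.send_custom_event('Custom.Mindstorms.Gadget', 'found', {'who': 'player'})
        return True
    return False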
Player: “Hide”
Alexa: “Stand on a red piece of paper on the ground. 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. Ready or not here Heidi comes”. Alexa will count with pauses between each number.
The robot will use the distance sensor and colour sensor to try to find you. Once found, Alexa will say “Found you” and reset the game and ask if you want to hide or seek again.
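The seeking loop can be sketched in a similar way, again assuming ev3dev2, red paper as the target and illustrative thresholds and speeds:

from ev3dev2.motor import MoveTank
from ev3dev2.sensor.lego import ColorSensor, UltrasonicSensor

def seek_step(tank: MoveTank, colour: ColorSensor, eyes: UltrasonicSensor):
    # One pass of the seek loop; returns True when standing on the target colour.
    if colour.color == ColorSensor.COLOR_RED:
        tank.off()
        return True
    if eyes.distance_centimeters < 20:
        # Steer around whatever is in the way.
        tank.on_for_seconds(left_speed=25, right_speed=-25, seconds=0.5)
    else:
        tank.on(left_speed=30, right_speed=30)
    return False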
If you want to help out the robot, you can say “warm” as it is heading towards you, “cold” as it is heading away from you and “hot” if it is nearly there. If Heidi hasn't found you in 4 minutes, Alexa will prompt you to help Heidi out by giving it warm/cold/hot clues.
See the additional features to change the colour of the paper.
SSML has been used throughout the project to give the conversation a more natural feel. For example, pauses are used for counting to make it seem more like a person deliberately counting. An excited emotion has been used for winning and a disappointed emotion for when things go wrong.
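As an illustration (not the project's exact responses), the counting and the win announcement might use SSML along these lines:

COUNTING_SSML = (
    "<speak>"
    "One <break time='1s'/> two <break time='1s'/> three <break time='1s'/> ..."
    "</speak>"
)

FOUND_YOU_SSML = (
    "<speak>"
    "<amazon:emotion name='excited' intensity='medium'>You found Heidi!</amazon:emotion>"
    "</speak>"
)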
For some of the new features and links to the documentation go here:
Additional features - Changing the name of the robot

“Change the name of the robot to ______”
The robot's name is used in a lot of the conversations to give a more personal touch. Allowing a player to change the name of the robot gives them the chance to use the name they actually call their robot.
This feature uses a slot with the type AMAZON.FirstName to validate the input.
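A minimal sketch of the skill-side handler, assuming the back end uses the ASK SDK for Python; the intent name, slot name and wording are assumptions for illustration:

from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name, get_slot_value

class ChangeRobotNameHandler(AbstractRequestHandler):
    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("ChangeRobotNameIntent")(handler_input)

    def handle(self, handler_input: HandlerInput):
        # AMAZON.FirstName validation happens in the interaction model;
        # the value arrives here as a plain string.
        name = get_slot_value(handler_input=handler_input, slot_name="robotName")
        handler_input.attributes_manager.session_attributes["robot_name"] = name
        speech = "Okay, from now on I'll call the robot {}.".format(name)
        return handler_input.response_builder.speak(speech).response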
Additional features - Setting the seek colour

You can change the colour the robot looks for when finding you. This is done in two ways.
The first way allows you to set a colour directly:
“Change robot seeking colour to _______”
The slot type for this uses AMAZON.Color and only allows the colours that the EV3 can detect.
“Change the seek colour to the robot's current colour”
This will send a message down to the EV3 to get the colour the sensor is currently looking at. The EV3 sends it back and the skill saves this value to use in the game.
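On the brick, handling that request might look like the sketch below, assuming the mission-style callback for the Custom.Mindstorms.Gadget namespace; the payload keys and event name are illustrative, not the project's exact protocol.

import json
from agt import AlexaGadget
from ev3dev2.sensor.lego import ColorSensor

COLOUR_NAMES = {1: 'black', 2: 'blue', 3: 'green', 4: 'yellow', 5: 'red', 6: 'white', 7: 'brown'}

class MindstormsGadget(AlexaGadget):
    def __init__(self):
        super().__init__()
        self.colour_sensor = ColorSensor()

    def on_custom_mindstorms_gadget_control(self, directive):
        payload = json.loads(directive.payload.decode("utf-8"))
        if payload.get("type") == "get_colour":
            reading = self.colour_sensor.color      # integer colour code, 0 means no colour
            # Send the detected colour back up so the skill can save it for seeking.
            self.send_custom_event(
                'Custom.Mindstorms.Gadget', 'colour',
                {'colour': COLOUR_NAMES.get(reading, 'none')})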
Note: there are many utterance variations for activating both of these intents, to make them natural for the player to say.
Additional features - Persisting the information

Saving and resuming even when the session ends.
The game information (robot name, current state, seeking colour) is saved between sessions so that a player can return to their current configuration.
Note: the directives for hiding and seeking try to keep the session alive, but if the session is closed the game will recover so that the player can resume where they left off.
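A minimal sketch of the save step, assuming the skill builder has a persistence adapter configured (for example the S3 adapter on an Alexa-hosted skill); the attribute names are assumptions:

def save_game_state(handler_input, robot_name, seek_colour, state):
    attrs = handler_input.attributes_manager.persistent_attributes
    attrs.update({
        "robot_name": robot_name,
        "seek_colour": seek_colour,
        "game_state": state,          # e.g. hiding, seeking, idle
    })
    handler_input.attributes_manager.save_persistent_attributes()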
All the Intents have variable utterances and use validation where appropriate.
Some examples are
- The counting is validated so that a user actually counts to 10 clearly.
- Setting the colour for the robot to seek is limited to the colours that the EV3 can detect
- The make noise intent allows you to add a number, but if you don't it just sets a default value (see the sketch after this list)
- The temperature has its own slot type with custom slot values
For the full list, take a look at the model.
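Resolving the optional count on the make noise intent might look like this on the skill side; this is an illustrative sketch and the slot name, default and cap are assumptions:

from ask_sdk_core.utils import get_slot_value

DEFAULT_BEEPS = 3
MAX_BEEPS = 10

def resolve_beep_count(handler_input):
    raw = get_slot_value(handler_input=handler_input, slot_name="count")
    if raw is None or not raw.isdigit():
        return DEFAULT_BEEPS           # the player didn't say a number
    return min(int(raw), MAX_BEEPS)    # keep within the validated range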
Time is required for the player to find the robot or the robot to find the player. The gadget needs to be able to talk back to Alexa when it has been 'found' or when it 'finds' another player.
The session is kept alive through a directive to the gadget and renewed on expiry where relevant. Expiry of the request gives Alexa the chance to add to the game by prompting the user to help Heidi find them or by sending an instruction to Heidi to make a noise.
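The keep-alive is based on the Custom Interface Controller's event handler directive. The sketch below shows roughly the documented shape of that directive; the token, duration and expiration payload are illustrative values, not the project's exact ones.

import uuid

def build_start_event_handler(duration_ms=60000):
    # When the handler expires, the expirationPayload comes back to the skill,
    # which is the hook used to prompt the player or tell Heidi to make a noise.
    return {
        "type": "CustomInterfaceController.StartEventHandler",
        "token": str(uuid.uuid4()),
        "eventFilter": {
            "filterExpression": {
                "and": [
                    {"==": [{"var": "header.namespace"}, "Custom.Mindstorms.Gadget"]}
                ]
            },
            "filterMatchAction": "SEND_AND_TERMINATE"
        },
        "expiration": {
            "durationInMilliseconds": duration_ms,
            "expirationPayload": {"reason": "prompt_player"}
        }
    }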
Additional features - EV3 sounds

Sounds have been placed onto the EV3 to give the robot some personality. The robot will sneeze, giggle, etc. if it hasn't been found within a minute. Recorded verbal instructions are also used for feedback, as the robot isn't connected to the computer for debugging.
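Playing those sounds with ev3dev2 is straightforward; this sketch assumes the sound files sit in an assumed sounds/ folder next to the script:

import random
from ev3dev2.sound import Sound

speaker = Sound()
PERSONALITY_SOUNDS = ["sounds/giggle.wav", "sounds/sneeze.wav"]   # assumed file names

def random_noise():
    speaker.play_file(random.choice(PERSONALITY_SOUNDS))

def announce(text):
    speaker.speak(text)   # spoken feedback instead of console debugging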
Additional features - Allowing tether-free movement

The EV3 needs internet access to communicate with the Alexa services. There are only a few compatible wifi dongles. Ignoring the shipping time and cost, the wifi dongles still have the issue that you need to enter the wifi password on the brick.
Thankfully, a 'what about...' thought and a quick trial showed that the EV3 picked up the USB tethering option from my phone without hassle, giving it an internet address and access to everything it needed to run the project.
I haven't seen this connection method mentioned anywhere before and will do some further tests with other Android phones so that the info can be updated on the ev3dev site.
Getting Started - The Lego build

Any two-motor driven device will work as a hide and seek robot with the following configuration:
- Right motor in C
- Left motor in B
- A colour sensor
- An ultrasonic sensor (note: the infrared sensor that comes with the EV3 home set could also be used for this project - see Making your own changes)
This means you can use most of the entry-level EV3 projects from either the core or education sets. The project's Lego design is deliberately simple as our EV3 gets re-purposed often. The construction below shows how you can get started with a very quick Lego build.
You will also need a microSD or microSDHC card (8GB or larger and 32GB or smaller) and a wireless internet connection - either a supported wireless dongle, an Android phone (see Allowing tether-free movement), or you can carry your laptop above the EV3 while testing.
Firstly, the critical part of a very quick and easy vehicle is a wheel connected to each motor and a free-moving back wheel (or wheels). A castor is a very simple way to do this. The castor uses a friction-free connector so it can spin freely, with an axle piece through the wheel sitting in two circular pieces.
Follow the setup instructions at https://www.hackster.io/alexagadgets/lego-mindstorms-voice-challenge-setup-17300f and mission 1 instructions (linked on the page) with the notes for troubleshooting.
Getting started - software for hide and seek

Follow the mission 3 instructions but use the hide and seek code provided in this project.
If you have followed the setup instructions for connecting to the internet, then all you need to do to tether to the phone is connect the phone via USB and turn USB tethering on.
Use the file browser on the EV3 to launch the hideandseek/hideandseek.py file
Troubleshooting - Setting up EV3 dev

I personally had an issue burning the image and it failed multiple times. You may need to use the Windows Disk Management tool to delete the partition that was created and reformat the SD card before trying again. I had more success with a card reader that takes a microSD card directly instead of putting the microSD in an SD adapter.
Etcher is built on Chromium, so pressing SHIFT+CTRL+I will open the debug tools. Going to the console will show the progress and any potential errors.
Troubleshooting - Setting up the Skill

Set up an Amazon developer account - this is best done with the same email your Alexa account is using.
The instructions say to use Product ID: EV3_01. I needed to use ev3dev as the product ID. The product ID needs to match the machine ID that the Alexa Gadgets Toolkit code sends through.
This took a while to find and was only discovered by working through the Alexa Gadgets Toolkit Python code.
Troubleshooting - Connecting to the EV3 brick

ev3dev is a Linux distribution and you can use Linux tools to connect to it.
PuTTY will connect to the EV3 and allow you to log in to the brick.
WinSCP (if you are on Windows) will allow you to connect to the brick and transfer files between Windows and the brick easily.
Default username is: robot
Default password is: maker
Change your username and password if your brick is connected to any network or the internet.
Troubleshooting - Debugging the Alexa Gadgets Toolkit

The gadget code can be found at /usr/lib/python3/dist-packages/agt/ on your EV3. See Connecting to the EV3 brick for information on how to get to this file.
Troubleshooting - No such file or directory error

When running newly created gadget code, I got the error /usr/bin/env: 'python3\r': No such file or directory.
You must switch your editor's line endings setting for the file from "CRLF" to just "LF". This is in the status bar at the bottom of Visual Studio Code.
Troubleshooting - Understanding functionality of EV3 Dev

If you want to know all the ev3dev bindings, you can see them at https://github.com/ev3dev/ev3dev-lang-python
Sensors are at:
https://github.com/ev3dev/ev3dev-lang-python/blob/ev3dev-stretch/ev3dev2/sensor/lego.py
Motors, display, LEDs etc. are at:
https://github.com/ev3dev/ev3dev-lang-python/tree/ev3dev-stretch/ev3dev2
Sometimes your skill will not run. You can see the logs by selecting Logs: Amazon CloudWatch if you are using Alexa-hosted skills. If you are hosting the files yourself, then view your Lambda's CloudWatch logs.
If your Bluetooth has connected in the past and doesn't now, first try turning the EV3 fully off and on again. Sometimes Bluetooth doesn't load when the brick is turned on, and if you look in the settings Bluetooth will say it is unavailable.
The error below indicates Bluetooth did not load on the device and may be fixed with a reboot.
File "mission-01/mission-01.py", line 88, in <module>
gadget = MindstormsGadget()
File "mission-01/mission-01.py", line 36, in __init__
super().__init__()
File "/usr/lib/python3/dist-packages/agt/alexa_gadget.py", line 91, in __init__
self.radio_address = BluetoothAdapter.get_address()
File "/usr/lib/python3/dist-packages/agt/bluetooth.py", line 101, in get_address
bdaddr = p.stdout.decode('utf-8').split('BD Address: ')[1].split(' ')[0]
IndexError: list index out of range
If this doesn't work, check Bluetooth is on in the settings, then:
- Run the unpair script
- Forget the paired device on the Alexa
- Run the mission/hide and seek code
- Re-pair the device on the Alexa
Making your own changes

The below gives ideas for further enhancements and simple changes, as well as where to start for each idea.
Replacing the ultrasonic sensor with an infrared sensor.
The InfraredSensor (found in the home EV3 set) can also measure distance and would be easy to swap out.
Change the reference from UltrasonicSensor to InfraredSensor and use the infrared sensor's proximity call - note that the sensor values are different. The ultrasonic sensor's distance_centimeters will give you a value up to 250 cm away. The infrared sensor's proximity call will give you a value of 0-100, where 100 is about 70 cm away.
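As a quick illustration of the difference in scales (the thresholds here are assumptions and would need re-tuning after the swap):

from ev3dev2.sensor.lego import UltrasonicSensor, InfraredSensor

# Ultrasonic: centimetres, roughly 0-250.
#   UltrasonicSensor().distance_centimeters
# Infrared: unitless proximity 0-100, where 100 is roughly 70 cm.
#   InfraredSensor().proximity

def too_close(reading, using_infrared):
    # ~15 cm on the ultrasonic sensor corresponds to a proximity of roughly 21 (15/70*100).
    return reading < (21 if using_infrared else 15)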
All the EV3's sensor calls can be found at:
https://github.com/ev3dev/ev3dev-lang-python/blob/ev3dev-stretch/ev3dev2/sensor/lego.py
Changing the seeking or hiding movement logic
The seeking and hiding logic is reasonably simplistic. It would be best to expand and separate the code of the game thread so that components can be reused and the logic can get more complex. Tests can then also be created so that the movement code can be executed without running the full simulation.
A further project would be to use patterns to control the logic of the seek/hide. When a game is launched, the skill could send down the input parameters and the result time monitored. These could be plugged into a learning model to understand what logic is the most efficient.
Two robots
Investigate the endpoint management in the skill to store information against the endpoint. This way multiple robots could be used to either all hide or all seek. Playing against each other would need different game rules so the robots could identify each other.