This project creates a metronome that moves to the beat of Amazon Music playing on an Alexa device. Add the picture and it becomes... METRO-GNOME!
It is a simple extension of the missions from the LEGO MINDSTORMS Voice Challenge. Before you embark on this project, you need to complete the Setup and at least Missions 1 and 2. These cover the basics, so I won't repeat those instructions here.
The Alexa interface currently only provides the music tempo, but hopefully it will be extended in the future to provide more information about the track being played. I really wanted my project to react differently to genres or even specific songs, but that information is not available yet.
Note: I believe this only works for Amazon Music at this stage so tempo data will not be provided if you are using another music service.
I helped get my daughter started on her project by going through the sample missions with her. Mission 2 bugged me because the movements didn't seem in time with the beat, so I thought I would create a simple project that improved on this, and this is the result.
How is this different to Mission 2?
If you look at the _dance_loop code in Mission 2, it calculates milli_per_beat and then multiplies it by 0.65 (!). I'm guessing this adjustment is meant to allow for the 150 ms that each motor runs per beat. The result is an approximation, but it has an error: watch closely and you will see the movements drift off the beat as the track plays (there's a worked example after the code below).
def _dance_loop(self, bpm):
    """
    Perform motor movement in sync with the beat per minute value from tempo data.
    :param bpm: beat per minute from AGT
    """
    color_list = ["GREEN", "RED", "AMBER", "YELLOW"]
    led_color = random.choice(color_list)
    motor_speed = 400
    milli_per_beat = min(1000, (round(60000 / bpm)) * 0.65)
    print("Adjusted milli_per_beat: {}".format(milli_per_beat), file=sys.stderr)
    while self.trigger_bpm == "on":
        # Alternate led color and motor direction
        led_color = "BLACK" if led_color != "BLACK" else random.choice(color_list)
        motor_speed = -motor_speed
        self.leds.set_color("LEFT", led_color)
        self.leds.set_color("RIGHT", led_color)
        self.right_motor.run_timed(speed_sp=motor_speed, time_sp=150)
        self.left_motor.run_timed(speed_sp=-motor_speed, time_sp=150)
        time.sleep(milli_per_beat / 1000)
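To put a rough number on that error: at 120 bpm a beat should be 500 ms, but the adjusted value comes out at 325 ms. The two motor commands and the loop overhead would have to take exactly the remaining 175 ms on every pass for the movement to stay on the beat, and any difference accumulates beat after beat.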
For mine, I have created a _beat_loop, which takes a different approach. I calculate the seconds per beat with no adjustment and use a timer to make sure each motor movement lands on the beat. To cope with higher-tempo tracks, the motor speed is also scaled with the beats per minute, which ensures the motor reaches its end point before it is time for the next movement.
def _beat_loop(self, bpm):
    """
    Perform motor movement in sync with the beat per minute value from tempo data.
    :param bpm: beat per minute from AGT
    """
    self.motor.position = 0
    pos = 40
    speed = min(1000, bpm * 2)
    seconds_per_beat = 60 / bpm
    next_time = time() + seconds_per_beat
    print("Adjusted seconds per beat: {}".format(seconds_per_beat), file=sys.stderr)
    while self.trigger_bpm == "on":
        # wait until next movement time
        while time() < next_time:
            pass
        # move motor to next position
        self.motor.run_to_abs_pos(position_sp=pos, speed_sp=speed, stop_action="hold")
        # change so next position is on opposite side
        pos = -pos
        # set time for next movement
        next_time = next_time + seconds_per_beat
    # move to straight position when tempo has stopped
    self.motor.run_to_abs_pos(position_sp=0, speed_sp=speed, stop_action="hold")
    print("Exiting BPM process.", file=sys.stderr)
Why the gnome?
Easy. I'm a dad, so bad puns are good! Besides, gnomes are cool and it makes the creation look more interesting.
Building the LEGO
The LEGO build is trivial, but I've included build instructions in a Bricklink Studio 2.0 file. The file you need is under the Schematics section.
To attach the gnome, you need to unleash your inner crafting skills. For mine, I just attached some cardboard with some split pins and then stuck the gnome picture to the cardboard.
This project doesn't need any Alexa Skill code, as it relies on information made available to the EV3 through a standard interface whenever a song is played. If you have followed the Voice Challenge instructions through to Mission 2, it is simple to add my project: copy the mission-02.ini file and call it metrognome.ini, then download the metrognome.py file from the Code section of this project and upload both files into your ev3dev environment. Running metrognome.py works the same as in Mission 2.
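If you haven't seen the Mission 2 file, metrognome.ini should look something like the following - your own gadget credentials from the Alexa developer console, plus the MusicData tempo capability (the placeholder values below are just that):

[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET

[GadgetCapabilities]
Alexa.Gadget.MusicData = 1.0 - tempo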
Note: ensure that the gnome is in the straight position when you run the program, as this is used as the reset position after each song.
Check it out!
Here is a video of my metro-gnome in all of its awesomeness. Enjoy!
Where to next?
The Alexa MusicData interface is only a beta version and currently just includes tempo. I'm hoping it will include more data about the track being played in the future so you can do other things, e.g. a Christmas robot that starts dancing when Christmas-genre music is playing. Until then, you can see what you can do with tempo.