What would it be like to control musical expression using nothing but the position of your guitar in 3D space? Well, let's prototype something and find out!
The Basic Idea
I wanted to be able to control three effect parameters in real time, and I wanted to do it using how I positioned my guitar. So one thing was clear: I was going to need a few things.
- A sensor that is able to see 3D space
- Servos to turn the knobs
- An LCD display
- An I2C Servo driver
- A Raspberry Pi
- To learn Python
Want to see through walls? Sense objects in 3D space? Sense if you are breathing from across the room?
Well, you're in luck:
The Walabot is a whole new way of sensing the space around you using low-power radar.
This was going to be key to this project: I would be able to take the Cartesian (X-Y-Z) coordinates of objects in 3D space and map those to servo positions, changing how a guitar effect sounds, in real time, without touching the pedal.
Win.
More information about the Walabot can be found here
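To make that idea concrete, here's a minimal sketch of the kind of mapping I mean. The coordinate range and pulse lengths here are placeholder assumptions, not my final settings:

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a Walabot coordinate onto a servo pulse length."""
    value = max(in_min, min(in_max, value))  # clamp to the sensing arena
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

# e.g. an X position between -30 cm and +30 cm becomes a pulse between 150 and 600
x_pos_cm = 12.5
servo_pulse = int(map_range(x_pos_cm, -30, 30, 150, 600))
print(servo_pulse)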
Getting Started
First things first, you will need a computer to drive the Walabot. For this project I'm using a Raspberry Pi 3 (hereafter referred to as the RPi) due to the built-in WiFi and general extra oomph.
I bought a 16GB SD card with NOOBS preinstalled to keep things nice and simple, and opted to install Raspbian as my Linux OS of choice.
OK, once you have Raspbian running on your RPi, there are a few configuration steps to take to get things ready for our project.
1. Firstly, make sure you are running the latest kernel version and check for updates by opening a command shell and typing:
sudo apt-get update
sudo apt-get dist-upgrade
(sudo is added to ensure you've got administrative privileges, i.e. stuff will work)
This may take a while to complete, so go and have a nice cup of tea.
2. You need to install the Walabot SDK for RPi. From your RPi web browser go to
https://walabot.com/getting-started and download the Raspberry Pi Installer Package.
From a command shell:
cd ~/Downloads
sudo dpkg -i walabotSDK_RasbPi.deb
3. We need to configure the RPi to use the I2C bus.
From a command shell:
sudo apt-get install python-smbus
sudo apt-get install i2c-tools
Once this is done, you have to add the following to the modules file.
From a command shell:
sudo nano /etc/modules
add these two strings on separate lines:
i2c-dev
i2c-bcm2708
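After a reboot, you can check the bus is alive with the i2c-tools we just installed (the servo driver should show up at address 0x40 once it's wired in).
From a command shell:
sudo i2cdetect -y 1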
4. The Walabot draws a fair bit of current, and we'll also use GPIOs to control stuff, so we need to set these up.
From a command shell:
sudo nano /boot/config.txt
add the following lines at the end of the file:
safe_mode_gpio=4
max_usb_current=1
The RPi is an excellent tool for makers, but it is limited in the current it can send to the Walabot. That's why we're allowing a 1 A max USB current rather than the more standard 500 mA.
Python
Why Python? Well, it's super easy to code, fast to get running, and there are loads of good Python examples available! I'd never used it before and was up and running in no time.
Now the RPi is configured for what we want, the next step is to configure Python to have access to the Walabot API, LCD and servo interfaces.
For the Walabot
From a command shell
sudo pip install "/usr/share/walabot/python/WalabotAPI-1.0.21.zip"
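A quick way to check the API is installed and talking to the sensor is a few lines of Python. This is only a minimal sketch based on the standard Walabot sensor examples; the arena values are placeholders, not my final settings:

import WalabotAPI as wlbt

wlbt.Init()
wlbt.SetSettingsFolder()
wlbt.ConnectAny()                  # grab the first Walabot found on USB
wlbt.SetProfile(wlbt.PROF_SENSOR)  # 3D tracking profile
wlbt.SetArenaR(10, 60, 2)          # distance range in cm (placeholder values)
wlbt.SetArenaTheta(-20, 20, 10)    # vertical angle range in degrees
wlbt.SetArenaPhi(-45, 45, 2)       # horizontal angle range in degrees
wlbt.Start()
wlbt.Trigger()
for target in wlbt.GetSensorTargets():
    print(target.xPosCm, target.yPosCm, target.zPosCm)
wlbt.Stop()
wlbt.Disconnect()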
For the Servo Interface
From a command shell
sudo apt-get install git build-essential python-dev
cd ~
git clone https://github.com/adafruit/Adafruit_Python_PCA9685.git
cd Adafruit_Python_PCA9685
sudo python setup.py install
Why do we need to use a servo driver? Well, for an RPi, there are a couple of reasons.
1. The current drawn by a servo can be very high, and that number gets larger the more servos you have (of course). If you're driving the servo directly from an RPi, you run the risk of blowing its power supply.
2. The timings of the PWM (Pulse Width Modulation) signal that controls a servo's position are very important. As the RPi doesn't use a real-time OS (there may be interrupts and such), the timings are not accurate and can make the servos twitch nervously. A dedicated driver allows accurate control, and it also allows for up to 16 servos to be added, so this is great for expansion.
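With the library installed, driving a servo from Python only takes a few lines. This is a minimal sketch; the channel number and pulse lengths are assumptions for a typical hobby servo, so tweak them for yours:

import time
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()   # default I2C address is 0x40
pwm.set_pwm_freq(60)               # 60 Hz is typical for hobby servos

SERVO_MIN = 150                    # pulse length (out of 4096), roughly 1 ms
SERVO_MAX = 600                    # pulse length (out of 4096), roughly 2 ms

pwm.set_pwm(0, 0, SERVO_MIN)       # swing channel 0 to one end of its travel
time.sleep(1)
pwm.set_pwm(0, 0, SERVO_MAX)       # and back to the other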
For the LCD
Open your RPi web browser
https://www.sunfounder.com/learn/category/sensor-kit-v2-0-for-raspberry-pi-b-plus.html
and download everything in
https://github.com/daveyclk/SunFounder_SensorKit_for_RPi2/tree/master/Python/
From a command shell:
sudo mkdir /usr/share/sunfounder
Using the graphical explorer, copy the Python folder out of the zip file into your new sunfounder folder.
The LCD is used to prompt the user as to what exactly is going on, showing the process of configuration through to the x, y and z values being mapped onto each servo:
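Driving the LCD from Python then looks something like this. A minimal sketch, assuming the SunFounder LCD1602 module from the kit and the usual 0x27 I2C address; check your copy of the code for the exact folder name and address:

import sys
sys.path.append('/usr/share/sunfounder/Python')  # where we copied the SunFounder code
import LCD1602

LCD1602.init(0x27, 1)                # I2C address, backlight on
LCD1602.write(0, 0, 'Walabot FX')    # column, row, text
LCD1602.write(0, 1, 'X:12 Y:34 Z:56')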
Blynk is a brilliant IoT service that allows you to create a custom app to control your stuff. It seemed like the perfect solution to give me remote control of the Walabot to really dial in the settings...
One problem: Blynk is not currently supported on the Python platform. Bugger. But fear not! I was able to find a nice little workaround that allows remote control and remote parameter input! It is a little hacky, but hey, this is Hackster, right?!
First step is to download the Blynk app from your favourite app store.
Second, sign up for an account. Once that is done, open the app and start a new project, choosing Raspberry Pi 3 as the hardware. The app will allocate you an access token (you will need this to put in your code). Once you have done that, you will need to set up the app as shown in the images. This is how it will interface with the Walabot.
All the Blynk slider and button settings:
OK. Now that the app is all set up, we can configure Python and the RPi to talk to it over the internets. Magic.
Firstly, you need to install the Blynk HTTPS wrapper for Python. From a command shell:
sudo git clone http://github.com/daveyclk/blynkapi.git
sudo pip install blynkapi
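The wrapper talks to Blynk's HTTP API, so each virtual pin gets its own object. This is just a rough sketch of how I read a button and a slider; the virtual pin numbers are assumptions from my app layout, and the exact method names may differ, so check the blynkapi README:

from blynkapi import Blynk

auth_token = "YourAuthToken"
on_button = Blynk(auth_token, pin="V0")   # on/off button in the app
x_slider = Blynk(auth_token, pin="V1")    # slider for the X range

# the HTTP API returns values as a list of strings, e.g. ["1"]
if on_button.get_val() == ["1"]:
    x_max = int(x_slider.get_val()[0])
    print("X range set to", x_max)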
Secondly, you need to install the Blynk Service on the RPi. From a command shell:
git clone https://github.com/blynkkk/blynk-library.git
cd blynk-library/linux
make clean all target=raspberry
To run the Blynk service:
sudo ./blynk --token=YourAuthToken
To ensure the Blynk Service runs on start up you need to modify the /etc/rc.local by doing:
sudo nano /etc/rc.local
Add this at the end:
./blynk-library/linux/blynk --token=YourAuthToken &
(I have included a copy of my /etc/rc.local file in the code section for reference.) To test it's working, simply type:
sudo /etc/rc.local start
The Blynk Service should now be running.
Autorunning the script
Now that this is all set up and configured, and we have the Python code ready, we can set things to auto run so we can ditch the keyboard and monitor. There are a few things to do:
Create a new script file to run our Python program:
sudo nano guitareffect.sh
Add these lines:
#!/bin/sh
python /home/pi/GuitarEffectCLI.py
Make sure to save it. Next, we need to give the script permission to run by typing:
sudo chmod +x /home/pi/guitareffect.sh
And finally, we need to add this script to the /etc/rc.local file that we tinkered with earlier.
sudo nano /etc/rc.local
Add
/home/pi/guitareffect.sh &
Be sure to include the "&"; this allows the Python script to run in the background. Right! That's all the configuration and software sorted. Next, it's time to wire up the hardware.
The Hardware
First breadboard prototype:
The enclosure was designed and rendered in the awesome Fusion360:
Gut shots:
Final Assembly Shots:
Well, this has been a steep learning curve, but it has been so worth it.
My takeaways are:
- I had to learn Python... turns out it's ace
- I interfaced Python on the Raspberry Pi with the Blynk IoT service. This is not officially supported, so there are some limits to its functions. Still works great though!
- It turns out the Walabot is great for musical expression. I used it on a Korg SDD3000, but you can use any effect you like
Have a go yourself. This isn't limited to guitar effects; it can be used with any instrument and any effect.