In the mid-1990s, William Latham amazed the world with his Organic Art PC application and screensavers, introducing the public to bizarre, other-worldly forms rendered using cutting-edge genetic algorithms that continually mutate simple shapes into elaborate organic lifeforms. I've always been fascinated by generative art, and have long dreamt of creating an interactive installation where participants can influence the algorithms with their presence or movement. Walabasquiat uses Processing on Raspberry Pi and Android, with the Walabot sensor as its input, to create a unique, ever-evolving tapestry of pixels in response to the movement of its viewers.
Getting Started

Getting the Walabot working on a Raspberry Pi is extremely straightforward: simply plug it into an available USB port via the included micro-USB cable (be sure you're using a 2.5A+ power supply) and install the Walabot API. I like to use the CLI whenever possible, so from Terminal on the Pi itself, I ran:
cd ~
wget https://s3.eu-central-1.amazonaws.com/walabot/WalabotInstaller/Latest/walabot_maker_1.0.34_raspberry_arm32.deb
sudo dpkg -i walabot_maker_1.0.34_raspberry_arm32.deb
to install the API, and then:

pip install WalabotAPI --no-index --find-links="/usr/share/walabot/python/"

in order to run the included Python examples as well as those available on GitHub. A great first app to run is SensorApp.py, which outputs the raw coordinates of targets that the Walabot detects:
cd /usr/share/doc/walabot/examples/python
python SensorApp.py
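In essence, SensorApp.py loops over trigger-read-print: trigger a scan, fetch the detected targets, and print their coordinates. The sketch below illustrates that loop shape in plain Python with the sensor stubbed out, so it can run without a Walabot attached; the real script uses the WalabotAPI module instead of the fake generator here.

```python
# Hardware-free sketch of the SensorApp-style main loop: trigger the sensor,
# read targets, print their coordinates. The sensor is stubbed with a
# generator so the loop can be shown (and run) without a device attached.

def fake_sensor_frames():
    """Stub standing in for repeated Trigger()/GetSensorTargets() calls."""
    yield [{"xPosCm": 1.0, "yPosCm": 2.0, "zPosCm": 3.0, "amplitude": 0.001}]
    yield []  # a frame with no movement in front of the sensor

def format_target(i, t):
    """Render one target the way a console sensor app might print it."""
    return "Target #{}: x={:.1f}cm y={:.1f}cm z={:.1f}cm".format(
        i, t["xPosCm"], t["yPosCm"], t["zPosCm"])

def run(frames):
    """Consume sensor frames and collect one output line per event."""
    lines = []
    for targets in frames:
        if not targets:
            lines.append("No targets detected")
        for i, t in enumerate(targets, 1):
            lines.append(format_target(i, t))
    return lines

for line in run(fake_sensor_frames()):
    print(line)
```

Swapping the stub for real `WalabotAPI` calls turns this into the shape of the actual example app.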
Now that everything's up and running, it's time to make something cool with it!
Development Process

The first challenge was finding a way for Processing, which I wanted to use to create the generative art, to talk to the Walabot. I initially tried to integrate the Walabot API directly into my sketch using Processing's Python Mode, but after struggling with differing Python versions and other compatibility issues, I realized I should abstract the Walabot's sensors behind a RESTful API, which Processing (and any other network-enabled client!) could consume. I started putting together a Flask-based server, then stumbled upon @TheArtOfPour's walabot-web-api, which was pretty much exactly what I was in the process of creating - although intended for Windows and the Developer version of the Walabot, while I was using Linux and the Creator version. Still, it was quicker to modify it to work with my OS and hardware than to create my own from scratch! With a working RESTful API serving Walabot target data on my Raspberry Pi, I then switched over to the generative art portion of the project using Processing.
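The idea of the abstraction is simple: one HTTP endpoint that returns the latest sensor targets as JSON. The actual project uses the Flask-based walabot-web-api, but the shape of the endpoint can be sketched with nothing beyond the standard library; the sensor read is stubbed here, and all names are illustrative rather than the project's own code.

```python
# Dependency-free sketch of a sensor-targets endpoint. The real server is
# Flask-based (walabot-web-api); this stdlib version only illustrates the
# request/response shape. The Walabot read is stubbed so no hardware is needed.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_targets():
    """Stub standing in for a real Walabot sensor read."""
    return [{"amplitude": 0.0026, "xPosCm": -0.54, "yPosCm": 10.36, "zPosCm": 8.27}]

def make_payload(targets):
    """Wrap targets in the envelope the clients expect."""
    return {"sensortargets": targets}

class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/walabot/api/v1.0/sensortargets":
            body = json.dumps(make_payload(read_targets())).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            # CORS header so browser-based clients can read it too
            self.send_header("Access-Control-Allow-Origin", "*")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To actually serve on the Pi:
# HTTPServer(("0.0.0.0", 5000), SensorHandler).serve_forever()
```

Because the sensor data sits behind plain HTTP and JSON, any client - a Processing sketch, an Android app, or curl - can consume it identically.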
I had been using the book Generative Art by Matt Pearson as a guide to harnessing Processing for generative art, but while searching for examples I happened upon @hype's HYPE Processing Library, which, despite not having been updated for over two years, still worked perfectly and provided exactly the kind of help I needed to create something spectacular! I combined the generative functionality of HYPE with the JSON sensor data provided by the Flask-based RESTful API server to create beautiful representations of Walabot targets:
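At its core, rendering a target means mapping the Walabot's centimeter coordinates onto screen pixels and letting the amplitude drive the drawn size. The actual Walabasquiat sketch does this in Processing; the Python sketch below only illustrates the mapping, and the arena bounds and scale factor are assumptions, not the project's values.

```python
# Illustrative Python version of the coordinate mapping a Processing sketch
# would perform. The -30..30 cm arena bounds and the amplitude scale factor
# are assumptions chosen for the example, not Walabasquiat's actual values.

def cm_to_pixel(value_cm, cm_min, cm_max, pixels):
    """Linearly map a Walabot coordinate (in cm) onto a screen axis (in pixels)."""
    return (value_cm - cm_min) / (cm_max - cm_min) * pixels

def target_to_screen(target, width=1920, height=1080):
    """Place one sensor target on a Full HD canvas.

    Returns (x, y, size): pixel position plus a draw size derived from
    amplitude, which is tiny in the raw data and needs scaling up.
    """
    x = cm_to_pixel(target["xPosCm"], -30.0, 30.0, width)
    y = cm_to_pixel(target["yPosCm"], -30.0, 30.0, height)
    size = target["amplitude"] * 10000  # scale tiny amplitudes to visible pixels
    return x, y, size
```

A target dead-center in the arena lands dead-center on screen, so movement left, right, toward, or away from the sensor translates directly into motion of the rendered shapes.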
Since Walabasquiat is intended as an art installation, with the Processing sketch displayed on a large screen or projected, I thought it would be fun to provide a "souvenir" that would let visitors continue to enjoy the project even after leaving. I created Walabasquiandroid, an Android live wallpaper, again using Processing for the visuals and the same RESTful API to obtain the Walabot sensor values. The Android visualization is simpler, so as not to burn unreasonable amounts of CPU just to provide a pretty background, but it presents an attractive, generative display of the same targets that inform the main piece, and can be enjoyed long after viewing the primary installation:
To recreate this project, connect the Walabot to the Raspberry Pi and install the API as outlined in Getting Started above. Then, on the Raspberry Pi, using Terminal, download and run the server:
cd /usr/share/doc/walabot/examples/python
sudo wget https://raw.githubusercontent.com/ishotjr/walabot-web-api/rpi/app.py
python3 app.py
You can use curl to ensure that everything's working:
curl --include http://192.168.1.69:5000/walabot/api/v1.0/sensortargets
HTTP/1.0 200 OK
Content-Type: application/json
Access-Control-Allow-Origin: *
Content-Length: 527
Server: Werkzeug/0.11.15 Python/3.5.3
Date: Tue, 11 Sep 2018 04:06:12 GMT
{
"sensortargets": [
{
"amplitude": 0.0026219950401443343,
"xPosCm": -0.5432446316758038,
"yPosCm": 10.355883874986088,
"zPosCm": 8.265291199116765
},
{
"amplitude": 0.0018395134981517656,
"xPosCm": 10.506637221750935,
"yPosCm": -3.1108099532967013,
"zPosCm": 10.035551162938308
},
{
"amplitude": 0.0015859160772638584,
"xPosCm": -12.981743742198365,
"yPosCm": -8.162094824811618,
"zPosCm": 10.094844162189423
}
]
}
In this example, the local IP address of the Raspberry Pi on my network is 192.168.1.69 - you can find yours using ip addr show.
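Any network-enabled client can pull apart a response like the one above with a few lines of JSON handling. The sketch below parses a sample payload (values abbreviated from the response shown earlier) and picks out the strongest target; the fetch itself is left as a comment since the URL depends on your Pi's address.

```python
# Parse a sensortargets response and find the most prominent target.
# The sample payload abbreviates the values from the curl output above;
# a real client would fetch it over HTTP as sketched in the comment.
import json

# from urllib.request import urlopen
# raw = urlopen("http://<your-pi-ip>:5000/walabot/api/v1.0/sensortargets").read()

raw = """{"sensortargets": [
  {"amplitude": 0.0026, "xPosCm": -0.54, "yPosCm": 10.36, "zPosCm": 8.27},
  {"amplitude": 0.0018, "xPosCm": 10.51, "yPosCm": -3.11, "zPosCm": 10.04}
]}"""

def parse_targets(payload):
    """Return the list of target dicts from a sensortargets response."""
    return json.loads(payload)["sensortargets"]

def strongest(targets):
    """Pick the target with the highest amplitude - the most prominent mover."""
    return max(targets, key=lambda t: t["amplitude"])
```

This is essentially what both the Processing sketch and the Android wallpaper do with each response before turning the targets into visuals.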
Now for the art! If you don't already have Processing installed on your Raspberry Pi, grab that first (again, I like using the CLI, but there's an easier way if that's not your thing!):
cd ~
curl https://processing.org/download/install-arm.sh | sudo sh
Next, clone the Walabasquiat and HYPE library repos, and install the latter by unzipping it into the libraries folder in your sketchbook:
cd ~/sketchbook
git clone https://github.com/ishotjr/Walabasquiat.git
git clone https://github.com/hype/HYPE_Processing.git
unzip HYPE_Processing/distribution/HYPE.zip -d ~/sketchbook/libraries/HYPE
Open Processing from under Graphics in the Raspberry Pi's application menu, and use File > Open to load the sketch from your sketchbook:
Update the URL on this line to the IP address of your own Raspberry Pi:
String[] json = loadStrings("http://192.168.1.69:5000/walabot/api/v1.0/sensortargets");
Press the Run button, and Walabasquiat will open full screen on the connected HDMI display (or VNC display, if you're connected virtually). Move yourself or other objects around in front of the Walabot and gaze in wonder at the beautiful landscape that evolves from your movements:
Of course, it's not enough to enjoy the art only when you're sitting in front of a Walabot - you should be able to enjoy it everywhere you go - which is where Walabasquiandroid comes in!
To build Walabasquiandroid from source, first add Android Mode to Processing using these instructions, along with the Android SDK if you don't already have it. Next, clone the repo into your sketchbook:
cd ~/sketchbook
git clone https://github.com/ishotjr/Walabasquiandroid.git
and open the sketch in Processing like you did with Walabasquiat above. Make sure the sketch is in Android Mode:
Update the URL on this line to the IP address of your own Raspberry Pi:
String[] json = loadStrings("http://192.168.1.69:5000/walabot/api/v1.0/sensortargets");
Make sure Developer options and USB debugging are enabled on your Android device, then connect it to the host machine and press Run on device:
This will bring up the wallpaper settings on your device and allow you to select Walabasquiandroid, after which your phone background will start to look something like this:
Don't forget, app.py on the Raspberry Pi needs to be running and accessible over the network in order for the Walabot sensor data to be read! If you just want to grab the live wallpaper APK without building it yourself, you can do so from the GitHub repo.
The Walabot Creator is a unique sensor, unlike anything I've used before! I'd like to give a big thanks to Walabot and Hackster for enabling me to use this remarkable bit of kit to realize my dream of creating an interactive generative art installation that responds to viewers' movements! Processing and the HYPE library made it really easy to turn my graphical ideas into reality.
Next Steps/Future Enhancements

I was able to accomplish everything I set out to in this project, although in future it would be cool to add additional visualizations that the installation could cycle through over time. I'd also like to add better handling of network problems, and perhaps optimize the Processing code a bit - with the sketch, the server, and Android clients all running and connecting at the same time, I actually managed to trigger the overheating warning on the Raspberry Pi, which I'd never seen before! Ultimately, I'd like to have my art displayed somewhere the public can see and enjoy it - or perhaps you'll reproduce it and create a performance somewhere near you!