In 2023 I was one of four artists selected by the City of Albuquerque to participate in a ten-week, full-time Internet of Things Bootcamp backed by Central New Mexico Community College. The goal of the program is for the artists to use the skills from the bootcamp to develop temporary public artworks for the city's Rail Trail, a pedestrian and bicycle trail currently being developed along an active rail corridor through downtown Albuquerque.
Going into this project I knew I wanted to create a work that could travel the length of the trail. Albuquerque already has a fantastic network of trails that I frequently bike on, and I wanted my work to move through the city on this network the same way I do.
I also knew that I wanted this work to be interactive, but to focus on interactivity for nonhuman participants. Who are the other beings that inhabit this space, and how do I include them in an artwork that travels the length of the trail?
I've been very interested in recent research on plant communication: the ways plants signal one another via mycorrhizal networks, volatile compounds, and even sound. There has also been a lot of research into measuring the electrical impedance of plants to understand what's happening inside them.
Knowing these two things -- that I wanted to transmit a work along the trail and that I wanted to try communicating with plants -- Epiphyte was born.
The first step was to figure out whether I could get meaningful readings from plants. I ran a series of impedance tests on a houseplant until I was confident I was receiving meaningful information. Here are some early tests sending the plant pulses at frequencies from 500 Hz to 10,000 Hz. In my tests, lower-frequency waves passed through the plant much more easily.
The charts below show two tests run before watering the plant and one immediately after watering. The x-axis is the frequency of the square wave sent to the plant, and the y-axis is the raw analog read value from a probe connected to a leaf on the plant.
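For the curious, here's a minimal sketch of the kind of sweep I ran, assuming a Particle Argon generating the square wave with tone() and reading the probe with analogRead(). The pin choices, step size, and timing are illustrative rather than my exact test code:

```cpp
// Hypothetical sweep test: drive a square wave into the plant and log
// the raw analog response at each frequency. Pins and timing are
// illustrative, not my exact test setup.

const int DRIVE_PIN = D2;   // square wave into the plant stem
const int PROBE_PIN = A0;   // electrode probe on a leaf

void setup() {
    Serial.begin(9600);
    pinMode(DRIVE_PIN, OUTPUT);
}

void loop() {
    // Sweep from 500 Hz to 10,000 Hz in 500 Hz steps.
    for (int freq = 500; freq <= 10000; freq += 500) {
        tone(DRIVE_PIN, freq, 0);             // duration 0 = play until noTone()
        delay(250);                           // let the signal settle
        int reading = analogRead(PROBE_PIN);  // raw 12-bit value, 0-4095
        noTone(DRIVE_PIN);

        Serial.print(freq);
        Serial.print(",");
        Serial.println(reading);              // CSV: frequency, response
        delay(250);
    }
    delay(60000);  // rest a minute between sweeps
}
```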
Knowing that I could collect meaningful electrical information from plants, I moved on to designing a network for them. The network currently functions by having plants send each other data over LoRa (long-range, low-power radio). Each plant in the network uses a Particle Argon to collect data about itself and receive data from other plants in the network. It then assembles all of these pieces of data into a packet and sends it along to the next plant in the network. These packets eventually reach a Particle Boron, which uploads the collected data to the cloud over the cellular network. The project could exist anywhere with cell service and the data would still reach the cloud.
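A sketch of what one relay node might look like, assuming an SX1276-class LoRa module driven by the RadioHead RH_RF95 library (the Argon has no built-in LoRa radio). The node ID, pin wiring, and text-based packet format here are all illustrative:

```cpp
// Hypothetical relay node: append this plant's reading to the incoming
// packet and forward the result toward the gateway.
#include <RH_RF95.h>

const int NODE_ID   = 2;    // this plant's position in the chain
const int PROBE_PIN = A0;   // electrode attached to the plant

RH_RF95 rf95(D6, D2);       // chip-select and interrupt pins (assumed wiring)

void setup() {
    rf95.init();
    rf95.setFrequency(915.0);   // US ISM band
}

void loop() {
    uint8_t buf[RH_RF95_MAX_MESSAGE_LEN];
    uint8_t len = sizeof(buf);

    // Wait for a packet from the previous plant in the chain.
    // (The first node in the chain would skip this and just send.)
    if (rf95.available() && rf95.recv(buf, &len)) {
        // Append "id:reading;" for this plant.
        char entry[16];
        int reading = analogRead(PROBE_PIN);
        snprintf(entry, sizeof(entry), "%d:%d;", NODE_ID, reading);

        if (len + strlen(entry) < sizeof(buf)) {
            memcpy(buf + len, entry, strlen(entry));
            len += strlen(entry);
        }

        // Forward the grown packet to the next node.
        rf95.send(buf, len);
        rf95.waitPacketSent();
    }
}
```

On the Boron gateway the fully assembled packet just needs one more hop: a call like Particle.publish() pushes it to the cloud over cellular.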
Here's a demonstration of the system in action:
Data Visualization: Once the data is in the cloud, a separate Argon connected to my PC pulls that data over Wi-Fi via MQTT. It then parses the data, attaches identifying codes, and sends it to my computer over USB.
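As a rough sketch, the bridge firmware could look something like this, assuming the community MQTT library for Particle; the broker address, topic name, and message prefix are placeholders:

```cpp
// Hypothetical bridge firmware: subscribe to the plant-data topic over
// Wi-Fi and relay each message to the PC over USB serial.
#include "MQTT.h"

void callback(char* topic, byte* payload, unsigned int length);

MQTT client("broker.example.com", 1883, callback);

// Called whenever new plant data arrives from the broker.
void callback(char* topic, byte* payload, unsigned int length) {
    char message[256];
    unsigned int n = (length < sizeof(message) - 1) ? length : sizeof(message) - 1;
    memcpy(message, payload, n);
    message[n] = '\0';

    // Prefix a code so the visualization can route the value.
    Serial.print("PLANT,");
    Serial.println(message);
}

void setup() {
    Serial.begin(9600);
    client.connect("epiphyte-bridge");
    if (client.isConnected()) {
        client.subscribe("epiphyte/plant-data");
    }
}

void loop() {
    client.loop();  // service the MQTT connection
}
```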
At this point anything could be done with this data. For this demonstration I wrote a visualization in Processing.
In this visualization there are three plants in the network. Each plant controls the movement and rotation of one of three cubes in a 3D environment. The position and rotation speed update once a minute, when the MQTT server receives new data from the field; the cubes then begin moving toward their next positions as determined by the trees outside.
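The mapping itself is straightforward. The actual sketch is written in Processing, but the core idea looks something like this (sketched here in C++, with every constant made up for illustration): normalize the raw reading, set a target, and ease toward it each frame.

```cpp
// Illustrative mapping from a plant's raw reading to a cube's target
// position and rotation speed.
struct Cube {
    float x, y, z;        // current position
    float targetX, targetY, targetZ;
    float rotationSpeed;  // radians per frame
};

// Called once a minute when a new reading arrives over MQTT.
void onNewReading(Cube& cube, int rawReading) {
    float t = rawReading / 4095.0f;        // normalize 12-bit value to 0..1
    cube.targetX = -200.0f + 400.0f * t;   // spread across the scene
    cube.targetY = -200.0f + 400.0f * (1.0f - t);
    cube.targetZ = -100.0f + 200.0f * t;
    cube.rotationSpeed = 0.001f + 0.05f * t;
}

// Called every frame: ease toward the target so motion stays smooth
// between updates.
void update(Cube& cube) {
    const float ease = 0.02f;
    cube.x += (cube.targetX - cube.x) * ease;
    cube.y += (cube.targetY - cube.y) * ease;
    cube.z += (cube.targetZ - cube.z) * ease;
}
```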
The first time I had trees remotely controlling 3D assets in a virtual environment it blew my mind, jaw on the floor. Tree Chat is a prototype, but I'm very excited to continue working with these ideas, creating a larger plant-based network and the virtual spaces to view the data sent over that network. I believe this project has a lot of potential to challenge our notions about nonhuman life on this planet.
In the months after completing the original prototype I was able to further develop the visualization component of Tree Chat (the original title for this project) into a finished generative art piece built in Unreal Engine 5, which I've titled Epiphyte. The plant now controls a much more complex, dreamlike visualization. It still gives the viewer insight into the status and health of the plant, but because of its added complexity, the trees continue to surprise me with their choices, creating fascinating and beautiful compositions I would never have considered, even though I'm the one who built the visualization! Epiphyte provides a glimpse into the inner lives, or dreams, of plants based on the live data the system receives.
I recently had the opportunity to display Epiphyte as part of the 2023 New Mexico Tech Summit.
Here's a small sample of some of the visuals that Epiphyte can produce based on the input from the plant it's connected to.
In addition to creating this new visualization in UE5, I also made some hardware improvements. I found that these adhesive electrode pads capture a better signal while doing less damage to the plant tissue than the alligator clips I used in the original prototype.
This remains an ongoing project. Make sure to check back for future updates!