Who doesn't love a good road trip? That cultural institution canonized in the great 1980s documentary, National Lampoon's Vacation.
In a world where gas prices didn't make airline travel look cost-competitive, the Satrom household was known for taking a road trip each spring.
My kids are still young enough that we can force them to load into the minivan (no shame in it) and endure a multi-day trek from our home in Austin, TX to some far-flung part of the US. We listen to audiobooks, and play games like "I Spy," highway bingo, and my personal favorite: "how long will dad make us wait for the next pit stop?" Everyone's a winner in that one.
More often than not, we head out to Colorado and spend a week-plus in the real happiest place on earth: the Rocky Mountains. This last March, however, we decided to head out East to visit family in Georgia, spend some time in the Smoky Mountains, and visit a few historic sites on the way out and back.
And because I work for Blues Wireless and never leave home without some project in tow, I decided a road trip was the perfect excuse to bring along a Notecard-powered tracker.
With a twist!
We (and members of our community) have posted our fair share of tracker projects, starting with our very first, way back in December of 2020. If you're looking for a low-cost, low-power tracker that just works, the Notecard and Notecarrier A are up to the task.
But I wanted a tracker with some extra smarts: a host MCU, and a couple of environmental sensors so that I could capture air quality all along my holiday road.
So I built one, a *still* low-power tracker with a brain and sensors, using a brand-new Notecarrier that we've just released to the public today: the Notecarrier F.
Caption: F as in Feather, or Fantastic. You pick.
In this article, I'll share what I built and show you how to:
- Use the Notecarrier F and Swan to build a tracker that senses
- Use the Notecard to capture GPS/GNSS location data all along the trip
- Create Routes in Notehub.io to send sensor data and location events from the Notecard to Snowflake, a data cloud platform that Notehub supports out of the box
- Create a Python-powered dashboard for viewing road trip data using Streamlit.io
Let's get started!
Hardware Configuration

My hardware for this project was simple. For cloud connectivity, I used the Blues Wireless Notecard. If you've not yet heard of the Notecard, it's a cellular and GPS/GNSS-enabled device-to-cloud data pump that comes with 500 MB of data and 10 years of cellular service for $49.
The Notecard itself is a tiny 30 x 35 mm SoM with an M.2 connector. To make integration into an existing prototype or project easier, Blues Wireless provides host boards called Notecarriers. As mentioned above, I used the new Notecarrier F for this project because it includes a handy set of headers ready for any Feather-compatible device.
For the Feather-based host MCU, I chose the Blues Wireless Swan, and for environmental sensing, I chose the SparkFun Environmental Combo breakout, which includes a BME280 and CCS811 for air quality on the same board.
Finally, I used a 5,000 mAh mega-LiPo for power, a dual LTE and GPS/GNSS antenna from Molex, and a solar panel to keep the battery topped up as we drove across the country, then threw the whole thing into a clear-lid Pelican 1040 case.
For this project, I used PlatformIO and created a new Arduino application with the Swan as a target. Then, I added the note-arduino, SparkFunCCS811, and SparkFunBME280 libraries.
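As a rough sketch, the platformio.ini for that setup looks something like the following. The board identifier and library names here are my best guess at the PlatformIO registry entries, so double-check them against the registry before building:

```ini
; Swan target with the Arduino framework (identifiers assumed; verify
; against the PlatformIO registry)
[env:bw_swan_r5]
platform = ststm32
board = bw_swan_r5
framework = arduino
lib_deps =
    blues/Blues Wireless Notecard
    sparkfun/SparkFun BME280
    sparkfun/SparkFun CCS811 Arduino Library
```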
#include <Arduino.h>
#include <Notecard.h>
#include "SparkFunCCS811.h"
#include "SparkFunBME280.h"
#include <Wire.h>
Next, I created objects for my Notecard and SparkFun sensors.
#define CCS811_ADDR 0x5B
#define PRODUCT_UID "com.blues.bsatrom:road_trip_tracker"
Notecard notecard;
CCS811 airQualitySensor(CCS811_ADDR);
BME280 tempSensor;
Then, in setup(), I initialized the connection to the Notecard and my sensors.
tempSensor.settings.commInterface = I2C_MODE;
tempSensor.settings.I2CAddress = 0x77;
tempSensor.settings.runMode = 3; //Normal mode
tempSensor.settings.tStandby = 0;
tempSensor.settings.filter = 4;
tempSensor.settings.tempOverSample = 5;
tempSensor.settings.pressOverSample = 5;
tempSensor.settings.humidOverSample = 5;
Wire.begin();
delay(250);
notecard.begin();
if (tempSensor.beginI2C() == false) {
Serial.println("BME280 error. Please check wiring.");
} else {
Serial.println("BME280 Connected.");
tempSensorAvailable = true;
}
if (airQualitySensor.begin() == false) {
Serial.println("CCS811 error. Please check wiring.");
} else {
Serial.println("CCS811 Connected.");
airQualitySensorAvailable = true;
}
To configure my Notecard as a tracker for this project, I needed to do three things. You can explore more of the details around each of these at dev.blues.io.
1. Set its Notehub ProductUID using hub.set.
J *req = notecard.newRequest("hub.set");
JAddStringToObject(req, "product", PRODUCT_UID);
JAddStringToObject(req, "sn", "feather-tracker");
JAddStringToObject(req, "mode", "periodic");
notecard.sendRequest(req);
2. Configure GPS/GNSS location sampling using card.location.mode.
req = notecard.newRequest("card.location.mode");
JAddStringToObject(req, "mode", "periodic");
JAddNumberToObject(req, "seconds", 300);
notecard.sendRequest(req);
3. Configure the Notecard to store location information in a Notefile that will be sent to Notehub using card.location.track. The configuration below adds a new Note and synchronizes each time a location change is detected, plus sends a heartbeat every 4 hours if no movement is detected.
req = notecard.newRequest("card.location.track");
JAddBoolToObject(req, "start", true);
JAddBoolToObject(req, "heartbeat", true);
JAddBoolToObject(req, "sync", true);
JAddNumberToObject(req, "hours", 4);
notecard.sendRequest(req);
With the Notecard configured, the last piece of firmware was code to take sensor readings so that I could capture mobile air quality data as we drove across the country. First, from the BME280:
float tempC = tempSensor.readTempC();
float humid = tempSensor.readFloatHumidity();
float tempF = tempSensor.readTempF();
float pressure = tempSensor.readFloatPressure();
float alt = tempSensor.readFloatAltitudeFeet();
airQualitySensor.setEnvironmentalData(humid, tempC);
J *req = notecard.newRequest("note.add");
if (req != NULL) {
JAddBoolToObject(req, "sync", true);
JAddStringToObject(req, "file", "env.qo");
J *body = JCreateObject();
if (body != NULL) {
JAddNumberToObject(body, "temp", tempF);
JAddNumberToObject(body, "humidity", humid);
JAddNumberToObject(body, "pressure", pressure);
JAddNumberToObject(body, "altitude", alt);
JAddItemToObject(req, "body", body);
}
notecard.sendRequest(req);
}
Then from the CCS811:
airQualitySensor.readAlgorithmResults();
J *req = notecard.newRequest("note.add");
if (req != NULL) {
JAddBoolToObject(req, "sync", true);
JAddStringToObject(req, "file", "air.qo");
J *body = JCreateObject();
if (body != NULL) {
JAddNumberToObject(body, "co2", airQualitySensor.getCO2());
JAddNumberToObject(body, "tvoc", airQualitySensor.getTVOC());
JAddItemToObject(req, "body", body);
}
notecard.sendRequest(req);
}
I used two different Notefiles for this application so that I could take and send readings individually, but you could easily combine them into a single file as well.
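For instance, since every Notecard request is just JSON, combining them would mean a single note.add with a merged body. Here's that idea sketched as a Python dict; the readings.qo filename and the sample values are hypothetical:

```python
# One combined Note in place of separate env.qo/air.qo Notes.
# "readings.qo" is a hypothetical Notefile name; values are samples.
combined_note = {
    "req": "note.add",
    "file": "readings.qo",
    "sync": True,
    "body": {
        "temp": 72.5,          # degrees F, from the BME280
        "humidity": 41.0,      # %RH, from the BME280
        "pressure": 101325.0,  # Pa, from the BME280
        "altitude": 551.0,     # feet, from the BME280
        "co2": 412,            # ppm, from the CCS811
        "tvoc": 9,             # ppb, from the CCS811
    },
}
```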
Routing Data from Notehub to Snowflake

And that was it for the firmware! But before I could hit the road, I needed to decide where to send sensor readings so I could analyze the data and build a dashboard later. For this project, I decided to give Snowflake a try.
Snowflake is a popular data cloud platform that provides all manner of options for data storage, ingestion, transformation, and the like. And since it's focused on data, I figured it would be faster to get started with than using a hosted DB service from one of the hyperscaler cloud platforms.
Also, Blues has a built-in integration for Snowflake! I started by creating an account, then followed the Blues guide for creating a Snowflake route. At the time of writing, the Blues integration recommends the public/private key pair authentication approach, so I created an RSA key pair and associated my public key with my Snowflake account using an alter user SQL query in a worksheet.
alter user [your_user_name] set rsa_public_key='DaIIBIjgfANBgkqh...';
Then, I created a new database and table in my Snowflake account. To keep things as simple as possible, I decided not to define a schema or multiple tables up front, and instead added a single column of type variant to my table. The variant type lets me dump the raw JSON from Notehub events into the table regardless of the contents of those events, or the Notefile they came from. In a bit, I'll show you a cool way to add some structure to this data, after the fact!
create or replace database road_trip_tracker;
create or replace table tracker_data(file variant);
Finally, it was time to create the Route. I added my Organization and Account names, and uploaded my Public Key.
I decided to route all of the Notefiles for the project so I could later analyze Session data, and capture sensor and tracking events.
And for the JSONata expression, Notehub provides a default that I tweaked slightly with my database name and table name in the statement field. The $[0] syntax in the expression selects the entire JSON of the event and inserts it into my tracker_data table.
{
"statement": "insert into TRACKER_DATA (payload) select parse_json($$" & $[0] & "$$)",
"timeout": 60,
"database": "ROAD_TRIP_TRACKER",
"schema": "PUBLIC",
"role": "ACCOUNTADMIN"
}
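If it helps to see the mechanics, here's a rough Python simulation of what that expression produces for a single event. The sample event is illustrative only, not a full Notehub payload:

```python
import json

# A trimmed stand-in for a routed Notehub event (illustrative only).
event = {"file": "_track.qo", "device": "dev:1234", "body": {"temp": 71.2}}

# $[0] selects the entire event; JSONata's & operator concatenates it
# into the statement. Snowflake's $$...$$ dollar-quoting lets the raw
# JSON pass through without escaping its embedded double quotes.
statement = (
    "insert into TRACKER_DATA (file) select parse_json($$"
    + json.dumps(event)
    + "$$)"
)
print(statement)
```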
Time To Drive

With the hardware assembled, firmware loaded, and Notehub Routes set up, we were ready for a road trip! I popped the Pelican case on the dash and we were off!
Everything was nice and smooth during my trip. I captured tons of sensor and location readings, and everything made its way into Snowflake flawlessly. Once I was back home, I noted that I had over 11,000 rows in my Snowflake table, ready for me to process and visualize.
The first step was turning 11k rows of raw JSON into something I could make sense of. As I said previously, the nice part about using Snowflake is that I didn't have to worry about the shape of the data or building a dashboard until after I returned from my trip. The reason: if your variant fields contain valid JSON, Snowflake SQL supports traversing that data directly in your queries. That means you can create structured views that look like tables on top of semi-structured JSON data.
I started with a query to create a view for all of the data from the Notecard's built-in tracker Notefile, _track.qo:
create or replace view
tracker_vw
as select
file:value.event::STRING as id,
file:device::STRING as device,
file:combinedwhen::TIMESTAMP as created,
file:where::STRING as loc,
file:value.best_lat::FLOAT as lat,
file:value.best_lon::FLOAT as lon,
file:value.best_location::STRING as location,
file:value.best_location_type::STRING as location_type,
file:value.best_timezone::STRING as timezone,
file:value.best_country::STRING as country,
file:body.temperature::FLOAT as temp,
file:body.voltage::FLOAT as voltage
from tracker_data
where file:file = '_track.qo' and file:value.event is not null and file:where is not null;
The file: prefix corresponds to the variant column in my table and tells Snowflake that I am going to traverse that structure. I can map fields from the JSON object into view fields, and even traverse nested JSON objects, like the body, using dot notation (body.voltage).
After creating my tracker view, I could run queries against it as I would a normal table:
select * from tracker_vw order by created;
And get a structured view of my data.
Next, I created a view for the readings from the BME280:
create or replace view
env_vw
as select
file:device::STRING as device,
file:combinedwhen::DATETIME as created,
file:where::STRING as loc,
file:value.best_lat::FLOAT as lat,
file:value.best_lon::FLOAT as lon,
file:value.best_location::STRING as location,
file:value.best_location_type::STRING as location_type,
file:value.best_timezone::STRING as timezone,
file:value.best_country::STRING as country,
file:body.temp::FLOAT as temp,
file:body.humidity::FLOAT as humidity,
file:body.pressure::FLOAT as pressure,
file:body.altitude::FLOAT as altitude
from tracker_data
where file:file = 'env.qo' and file:combinedwhen is not null;
And finally, CO2 and TVOC data from the CCS811:
create or replace view
air_vw
as select
file:device::STRING as device,
file:combinedwhen::DATETIME as created,
file:where::STRING as loc,
file:value.best_lat::FLOAT as lat,
file:value.best_lon::FLOAT as lon,
file:value.best_location::STRING as location,
file:value.best_location_type::STRING as location_type,
file:value.best_timezone::STRING as timezone,
file:value.best_country::STRING as country,
file:body.co2::FLOAT as co2,
file:body.tvoc::FLOAT as tvoc
from tracker_data
where file:file = 'air.qo' and file:combinedwhen is not null;
With my data organized into views, it was time for a dashboard. For this project, I used Streamlit.io, a powerful tool for building data processing and visualization apps in Python. Best of all, it has a nice Snowflake integration!
I followed the Streamlit installation guide to get things installed locally, and then created a new project, first with some requirements in a requirements.txt file:
altair
pandas
streamlit
snowflake-connector-python==2.7.6
matplotlib
Then, I created a Python virtual environment and used Pip to install the dependencies above:
python3 -m venv env
source env/bin/activate
pip install -r requirements.txt
For the Snowflake integration, I created a secrets.toml file in a .streamlit directory and included my account information for authentication:
[snowflake]
user = "[your_user_name]"
password = "[your_password]"
account = "[your_organization_name]-[your_account_name]"
warehouse = "COMPUTE_WH"
database = "road_trip_tracker"
schema = "PUBLIC"
Then, in a file called streamlit_app.py, I started with some imports, as well as logic to connect to my Snowflake database and handle queries from my app:
import streamlit as st
import pandas as pd
import snowflake.connector
# Initialize connection.
@st.experimental_singleton
def init_connection():
return snowflake.connector.connect(**st.secrets["snowflake"])
conn = init_connection()
# Perform query.
@st.experimental_memo(ttl=600)
def run_query(query):
with conn.cursor() as cur:
cur.execute(query)
return cur.fetchall()
Streamlit has some built-in smarts that render bare multi-line strings as Markdown content on the resulting web page. So I added the following string, which Streamlit renders as a header element and text when the page loads.
"""
# Blues Holiday Road Trip Tracker Demo!
This demo pulls data from Snowflake that was routed from [this Notehub project](https://notehub.io/project/app:6e153550-c690-4a7d-ba00-ed6056138574).
The application in question is a Notecard and Notecarrier-F-based asset tracker. In addition to being configured as a tracker, the Swan-powered host application takes readings from connected environmental sensors that capture temperature, humidity, altitude, pressure, CO2 and TVOCs.
Raw JSON is routed to Snowflake using the Snowflake SQL API and transformed into structured data tables using views, with a view for `_track.qo`, `air.qo`, and `env.qo`
events.
"""
Streamlit also provides a number of form elements that can be used to add easy interactivity to your app. The values of form elements are bound to variables, and as those variables change, dependent data and UI elements change as well. I added the following to my app to add a slider, select box and a few checkboxes.
"""
### Options
"""
num_rows = st.slider('Rows to fetch?', 10, 500, 100)
sort = st.selectbox('Sort?',('asc', 'desc'))
show_map = st.checkbox('Show map?', True)
show_charts = st.checkbox('Show charts?', True)
show_table_data = st.checkbox('Show table data?', False)
You can run a Streamlit app locally with the streamlit run command, which opens your in-progress app in the browser:
streamlit run streamlit_app.py
Next, it was time to query my views and display some charts and graphs! I started with queries that pull data from each of my views and place the results into Pandas DataFrames, which I can then use to process and display the data. Notice that I'm using a Python format string and the sort and num_rows variables in a few of the queries. As mentioned above, these variables are bound to the UI elements on the page, and as they change, the queries re-run.
tracker_rows = run_query('SELECT * from tracker_vw ORDER BY created;')
tracker_data = pd.DataFrame(tracker_rows, columns=("ID", "Device", "When", "Loc", "lat", "lon", "Location", "Location Type", "Timezone", "Country", "Temp", "Voltage"))
air_rows = run_query(f'SELECT * from air_vw ORDER BY created {sort} LIMIT {num_rows};')
air_data = pd.DataFrame(air_rows, columns=("Device", "When", "Loc", "Lat", "Lon", "Location", "Location Type", "Timezone", "Country", "CO2", "TVOC"))
env_rows = run_query(f'SELECT * from env_vw ORDER BY created {sort} LIMIT {num_rows};')
env_data = pd.DataFrame(env_rows, columns=("Device", "When", "Loc", "Lat", "Lon", "Location", "Location Type", "Timezone", "Country", "Temp", "Humidity", "Pressure", "Altitude"))
Finally, it was time for some visuals, starting with a map built from my tracker_data DataFrame and the built-in map() function of the Streamlit API.
if show_map:
"""
### Tracker Map
"""
tracker_locations = tracker_data[["lat", "lon"]]
st.map(tracker_locations)
Then, charts showing air quality and environmental data over time.
if show_charts:
"""
### Environment Charts
"""
air_group = air_data[["CO2","TVOC"]]
st.line_chart(air_group)
env_group1 = env_data[["Pressure", "Altitude"]]
st.line_chart(env_group1)
env_group2 = env_data[["Temp", "Humidity"]]
st.line_chart(env_group2)
And finally, the raw table data.
if show_table_data:
"""
## Notecard `air.qo` Events
"""
air_data[air_data.columns[::-1]]
"""
## Notecard `env.qo` Events
"""
env_data[env_data.columns[::-1]]
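The columns[::-1] trick above isn't obvious at a glance: it reverses the DataFrame's column order, which here puts the sensor readings ahead of the device metadata. A quick standalone illustration with a made-up frame:

```python
import pandas as pd

# Stand-in for env_data; the real frame comes from the Snowflake query.
df = pd.DataFrame({"Device": ["dev:1"], "Temp": [72.5], "Humidity": [41.0]})

# Indexing a DataFrame with its reversed column Index flips column order.
flipped = df[df.columns[::-1]]
print(list(flipped.columns))  # → ['Humidity', 'Temp', 'Device']
```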
And that's it! In less than 100 lines of code, I had an interactive dashboard for my road trip!
This was a fun project to build, and I was able to complete it end-to-end in just a few days. If you haven't yet, I encourage you to check out the Notecard and see what it's like to prototype without fear using no-fees cellular IoT. Oh, and if you're already a Notecard fan, check out the new Notecarrier F!
Have fun, and I can't wait to see what you deploy!