I signed up to try out Structure's new platform. After following the tutorial to build a dashboard from Forecast.io data, I was really interested in making it work with a Particle Photon and SparkFun Weather Shield I have. Ultimately, I'd like to build a climate control system for my son's tortoise habitat. For now, we are going to set up the Photon to push data to a workflow built with Structure, and ultimately, create a dashboard displaying temperature and humidity statistics.
Prerequisites

1. Create a Structure account.
2. Complete the tutorial for creating an application and dashboard using Forecast.io's API. This project is predicated on many of the concepts you will learn in the tutorial.
3. A Particle Photon that is set up and ready to go.
4. The Particle CLI installed.
5. A SparkFun Photon Weather Shield connected to the Photon. You can do this project with any temperature and humidity sensors; I just happen to have the Weather Shield, so I used it.
1. Start a new project in Particle Build.
2. Add the library for the SparkFun Weather Shield to the project.
3. Copy and paste the code attached below into the editor.
4. Save and verify the code, and then flash the device.
At this point, you should see the Photon sending temperature and humidity data to the Particle Cloud:
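The attached firmware isn't reproduced here, but the shape of what it publishes matters for the rest of the project. The field names "t" and "h" are what the workflow reads later; the readings below are made up for illustration. A sketch of the event data the Photon sends:

```javascript
// Sketch of the event data the firmware publishes (values are illustrative).
// Particle.publish sends the event data as a string, which is why the
// Structure workflow later has to JSON.parse it.
var reading = { t: 75.2, h: 38.5 }; // temperature (°F) and humidity (%)
var eventData = JSON.stringify(reading);

console.log(eventData); // prints {"t":75.2,"h":38.5}
```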
We will need to create a virtual device in Structure. This device will receive our final output from the Structure workflow. Structure can then use the device as the basis for our dashboard.
1. Create a new Structure application.
2. Click the "Add Device" button.
3. Click the "Create Blank Device" button.
4. Name your device.
5. Select "Virtual Device".
6. Under "Device Attributes", add fields for Temperature and Humidity. These fields store the data the device receives, and they will later tell our dashboard what to monitor.
7. Click "Save Device".
The webhook will be our trigger mechanism and starting point for the workflow we will build.
1. Go to Application Settings (link on right-side).
2. Select the "Webhooks" tab:
3. Click the "Add Webhook" button.
4. Enter a name for the webhook, such as "climate-data".
5. Click "Save Webhook".
Structure will create a special URL for your new webhook:
We can use the URL given to us by Structure to create a Particle webhook. The code you flashed the Photon with contained a call to Particle.publish. This call creates an event on the Particle Cloud with the JSON payload as its data. If we create a webhook using the Structure URL, that data will be pushed to Structure by the Particle Cloud.
1. Use the "structure-webhook.json" file attached below.
2. Modify the "URL" field to match the URL created above in "Structure: Webhook".
3. From the command line, use the Particle CLI to create the webhook. The command is
particle webhook create structure-webhook.json
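The attached structure-webhook.json file is not reproduced here, but a minimal Particle webhook definition generally looks something like the following. The event name and URL are placeholders; use your own event name and the URL Structure generated for you:

```json
{
  "event": "climate-data",
  "url": "https://<your-structure-webhook-url>",
  "requestType": "POST"
}
```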
Now every time the Photon takes a reading and publishes it, Particle Cloud pushes it to Structure. Let's build a workflow that can accept it.
1. Create a new workflow.
2. Add a Webhook trigger.
3. In the Webhook trigger's properties, select the webhook created earlier:
The data coming from the Particle Cloud is actually a string and not a JSON document, so we will need to massage the data a bit before Structure can process it. This can be done by routing the Webhook into a Function block.
4. Add a Function logic block
5. Connect the Webhook block to the Function.
6. In the Function block's properties, your data comes in via a variable called "payload". In this case, the Photon's JSON payload is stored in "payload.data.body.data". We are going to use the following JavaScript to convert that value to JSON and store it back where we found it:
payload.data.body.data = JSON.parse(payload.data.body.data);
7. Add a "Virtual Device" output block.
8. In the Virtual Device properties, select the virtual device we set up earlier.
9. The last part of the setup is to link the Virtual Device to our inbound data. Set the Virtual Device's "temperature" value to the payload's "t" value (the value output by the Photon). Do the same for "humidity":
temperature = {{ data.body.data.t }}
humidity = {{ data.body.data.h }}
10. Click "Deploy Workflow" to save and deploy your changes.
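The Function block's parse step can be sketched end to end. This is a minimal, self-contained version: the sample payload values are made up for illustration, and the defensive try/catch is an addition of mine, not part of the original one-liner:

```javascript
// Simulate the payload a Particle Cloud webhook delivers to Structure.
// Particle publishes the event data as a string, so body.data is a string here.
var payload = {
  data: {
    body: {
      data: '{"t": 72.5, "h": 41.0}' // JSON encoded as a string
    }
  }
};

// The Function block's job: parse the string back into an object
// and store it where we found it.
try {
  payload.data.body.data = JSON.parse(payload.data.body.data);
} catch (e) {
  // If the Photon ever sends malformed data, leave the payload
  // untouched rather than crashing the workflow.
}

console.log(payload.data.body.data.t); // temperature reading
console.log(payload.data.body.data.h); // humidity reading
```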
At this point you can go back to the "Device" menu, go into the "climate-monitor" device and scroll to the bottom. You should see data flowing in from the Photon every 15 seconds:
Data from the environment around you is being read, pushed into the cloud, and used to trigger our workflow. As awesome as that is, making pretty pictures with that data is even better. Let's make the dashboard.
1. In the top menu, select Dashboards > Create Dashboard.
2. Give it a name and click "Save".
3. We are going to set up a "Gauge" block, so click "Customize" for the Gauge block:
4. Give the block a header label. In this case, "Temperature".
5. Select the Application from the drop-down.
6. Select "Dial Gauge" and enter a Min and Max value. Since this temperature gauge will eventually monitor a reptile's habitat, I selected 50 and 120, respectively.
7. Select the frequency to aggregate historical data. This is a great feature that lets you report on minimum, maximum, sum, and mean values over a period of time. Since our monitor is new, select a small value (e.g., "5 minutes"). You can always increase it later when you have more data. You can also select "Last received data point" to display data as it arrives.
8. Select the Virtual Device ("climate-monitor") and give it a label ("°F"). If you selected a time value in the previous step (e.g., "5 minutes"), you need to select an aggregation method. I used "Mean" to give us the average temperature over the last 5 minutes.
9. Your block setup should look something like this:
10. Click "Add Block" to save.
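To make the aggregation choice concrete, here's a sketch of what "Mean" over a window does to the readings. The numbers are made up; with a reading every 15 seconds, a 5-minute window holds up to 20 data points:

```javascript
// Sketch: "Mean" aggregation over a 5-minute window of temperature readings.
var readings = [74, 76, 75, 73, 77]; // illustrative temperatures (°F)

var mean = readings.reduce(function (sum, r) { return sum + r; }, 0) / readings.length;

console.log(mean); // prints 75
```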
Congratulations. If everything went right, you should see your temperature gauge:
While this project may seem rather long and involved, once you have set up one Structure workflow, you will see how easy it is and how quickly you can create new ones.
Play around with the dashboard a bit. Try adding a "Time Series Graph". This will let you monitor multiple data points over time, graphing them against each other.
Try adding blocks to the workflow. A "Debug" block can be used anywhere to output data and see what the workflow is doing at any point. This is extremely useful; it is how I figured out that the data coming from the Photon was stored as a string rather than JSON.
I really enjoyed using the Structure platform. The workflow tool is intuitive to use and has a great deal of flexibility. Combined with its dashboard capabilities, Structure gives you a really great platform for your IoT projects.