Keeping tabs on soil moisture levels can be useful when trying to optimize watering schedules, but WiFi might not always be the best connectivity option for large-scale deployments. We can keep the setup simple by building a soil moisture monitor equipped with LTE on the Particle platform.
Hardware
We need some hardware to get started:
Set up your Boron device at setup.particle.io and follow the prompted instructions. Let’s set up some of Particle’s features once your device is assigned to a product in your account.
Particle Ledgers
We’ll start by configuring a few device-level Ledgers. Ledgers allow data to be synced between your device and the cloud regardless of connectivity status. This is useful for all kinds of applications, but in this case we’ll use Ledgers to configure settings for the sensor.
Choose “Cloud to Device” Ledger. You can read more about what this means here.
Name your Ledger and give it a description. Then make sure to choose the “device” scope as we want this Ledger to be 1 to 1 with a single device. Each device will have its own unique parameters.
On the Ledger detail page, navigate to the “Instances” tab and choose “Create Instance”.
Fill in your Boron’s device ID and apply the following key-value pairs. I named mine device-config. These values will be used in our device’s firmware in a later step. They’ll start as arbitrary values for now, but will get updated when we implement a calibration procedure.
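The exact values are up to you; the keys below are the ones the firmware will read in a later step (publish_interval is in seconds), and the calibration values are just placeholders until we run the calibration procedure:

{
  "publish_interval": 300,
  "num_readings_to_avg": 10,
  "calibration_min": 0,
  "calibration_max": 10000
}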
We’ll create one more Ledger. This one will be “Device to Cloud” (different from the previous one). This Ledger will be used to view samples from the sensor when a cloud function is called.
Once again, create a new Ledger but this time choose “Device to Cloud”.
Give it a name and a description. Remember what you named your Ledger as it will be used later in the device firmware. I’ll call mine device-calibration.
Now that we have our Ledgers set up, let’s set up an integration so that our sensor’s readings can be stored persistently.
We’ll start in the AWS Console. Sign up for an account if you haven’t already and navigate to the DynamoDB page. Choose “Create a table”.
Give your table a name and enter event_id for the “Partition key”. Leave “Sort key” blank. You can learn more about what these fields mean here. Leave the rest of the settings in their default state and click “Create table” at the bottom of the page.
Click into the table details after the table has been created (this may take a few seconds). Expand the “Additional info” dropdown in the table overview. Look for the “Amazon Resource Name” (ARN) and note it for later use.
Now head over to the IAM dashboard in your AWS console. Choose “Users” and click “Create user”
Give your user a name and click “Next”. Make sure to keep “Provide user access to the AWS Management Console” unchecked.
Select “Attach policies directly” and click “Create policy” to open a new tab.
On the “Create policy” page, select the “JSON” tab and paste the following. Replace <DYNAMO TABLE ARN> with the ARN you copied from the table creation step and click “Next”.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem"
      ],
      "Resource": "<DYNAMO TABLE ARN>"
    }
  ]
}
On the next page, name your policy and give it a description. Then click “Create policy”.
Navigate back to your “Create user” tab and click the refresh button next to the “Create policy” button. Select the policy you just created and click “Next”.
On the next page click “Create user”.
Click into your newly created user back in the IAM > Users page. Select the “Security credentials” tab and scroll down to the “Access keys” subsection. Click “Create access key”.
Select “Third-party service” and dismiss the warning on the next page.
Fill in the description if you choose and select “Create access key”.
Take note of your “Access key” and “Secret access key”. You can choose to download the .csv file if desired.
Particle Integration Configuration
Back in your Particle Console, navigate to the “Integrations” page. Click “Add new integration”.
Scroll down to find the “AWS DynamoDB” integration under “Data analytics”. Click “Start now”. Name your integration; I’m choosing SoilMonitorDynamoIntegration. Under “Event Name”, choose something that makes sense for your application. This is the event your device will publish to when sending data to your database. I’m choosing soil-monitor/data.
Fill in your “Dynamo Table Name”, “AWS Region”, “AWS Access Key”, and “AWS Secret Access Key”.
In the “JSON DATA” block, we’ll need to make one small adjustment. Replace the id key with event_id, as that is what we defined as the “Partition key” in our table’s settings.
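As a rough sketch (purely illustrative — keep whatever default template variables your console provides and only rename the key), the relevant part of the JSON template ends up looking something like this:

{
  "event_id": "{{{PARTICLE_EVENT_ID}}}",
  "event_value": "{{{PARTICLE_EVENT_VALUE}}}"
}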
Click “Enable integration” to save the integration settings.
Device Firmware
Now that all of our cloud configuration has been taken care of, we can start on the device firmware. Make sure to configure Particle Workbench if you haven’t already.
Program your Boron device with the following code:
#include "Particle.h"
SYSTEM_MODE(AUTOMATIC);
SYSTEM_THREAD(ENABLED);
SerialLogHandler logHandler(LOG_LEVEL_INFO);
FuelGauge battery;
Ledger calibrationLedger;
long unsigned int publishInterval = 60 * 1000 * 5; // 5 minutes
double calibrationMin = 0.0;
double calibrationMax = 10000.0;
int numReadingsToAvg = 10;
double readSensor()
{
double sum = 0.0;
for (int i = 0; i < numReadingsToAvg; i++)
{
sum += analogRead(A0);
}
return sum / numReadingsToAvg;
}
int sampleRawSensor(String command)
{
Variant data;
double reading = readSensor();
data.set("reading", reading);
Log.info("Sample result: %f", reading);
return calibrationLedger.set(data);
}
void onLedgerSync(Ledger ledger, void *)
{
Log.info("Ledger %s synchronized at %llu", ledger.name(), ledger.lastSynced());
LedgerData config = ledger.get();
if (config.has("publish_interval"))
{
publishInterval = config["publish_interval"].asUInt() * 1000;
}
if (config.has("calibration_max"))
{
calibrationMax = config["calibration_max"].asDouble();
}
if (config.has("calibration_min"))
{
calibrationMax = config["calibration_max"].asDouble();
}
if (config.has("num_readings_to_avg"))
{
numReadingsToAvg = config["num_readings_to_avg"].asUInt();
}
}
void setup()
{
Ledger deviceLedger = Particle.ledger("device-config");
deviceLedger.onSync(onLedgerSync);
calibrationLedger = Particle.ledger("device-calibration");
Particle.function("sampleSensor", sampleRawSensor);
}
void loop()
{
if (Particle.connected())
{
static unsigned long lastPublishTime = 0;
unsigned long now = millis();
if (lastPublishTime == 0 || (now - lastPublishTime) > publishInterval)
{
double raw = readSensor();
Log.info("%f", raw);
double percent = map(raw, calibrationMin, calibrationMax, 0.0, 100.0);
percent = percent < 0.0 ? 0.0 : percent;
percent = percent > 100.0 ? 100.0 : percent;
String data = String::format(
"{\\"time\\":%d,\\"battery\\":%.2f,\\"reading\\":%f}",
Time.now(),
battery.getSoC(),
percent);
Log.info(data);
Particle.publish("soil-monitor/data", data);
lastPublishTime = now;
}
}
}
Some things to point out about the firmware:
Notice that we provide a callback to the device-config Ledger for the onSync action. This implements the “Cloud to Device” functionality. When the cloud updates the Ledger, the device handles the new values in onLedgerSync by parsing the data and updating state. This state includes calibrationMin, calibrationMax, numReadingsToAvg, and publishInterval. You can see how convenient it is to configure various parameters for your device this way.
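For example, editing the device-config instance in the Console to something like the following (placeholder values) will trigger onLedgerSync on the device, and the new settings take effect without a firmware update:

{
  "publish_interval": 600,
  "num_readings_to_avg": 20,
  "calibration_min": 0,
  "calibration_max": 10000
}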
That is different from the device-calibration Ledger, which is configured as “Device to Cloud”. When the sampleSensor cloud function is invoked, the sampleRawSensor function samples the moisture sensor numReadingsToAvg times and updates the device-calibration Ledger with the resulting average.
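For example, after a single sampleSensor call the device-calibration instance will hold just the key written by sampleRawSensor, with a value that depends on your sensor (the number here is only a placeholder):

{
  "reading": 2843.7
}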
The main loop simply waits for the publishInterval to expire and then posts the sensor sample to the soil-monitor/data event. Remember that we configured our DynamoDB integration to be invoked on that specific event. The data from our event payload will be stored in the event_value key in our database.
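The exact attributes depend on the integration’s JSON template, but a stored item should look roughly like this (values are placeholders):

{
  "event_id": "0a1b2c3d-4e5f-6789-abcd-ef0123456789",
  "event_value": "{\"time\":1718123456,\"battery\":93.50,\"reading\":47.200000}"
}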
Now that we have everything wired together, let’s test our configuration. Flash the firmware to your device and wait a few minutes for samples to start rolling in.
You can confirm that your sensor is reading correctly in the device details page in the Particle Console. Scroll down to the “events” tab and filter for your data event, soil-monitor/data in my case. Confirm that events have been logged.
Remaining in the device detail page, scroll down to the “Functions” subsection and trigger the sampleSensor function. You do not need to provide an argument to this function.
In a new tab, head over to the “Ledger” page of the Particle Console.
Click into the device-calibration Ledger and select the “Instances” tab. You should see a newly created instance from your device. Select “Get Instance” and confirm that your sensor has returned a reasonable reading. We’ll use this to calibrate our device in a later step.
Note that every time you trigger the sampleSensor cloud function, the device-calibration Ledger will be updated.
Next, let’s make sure the DynamoDB integration is working properly. By now a few samples have been logged from your device. Go to the “Integrations” page in the Particle Console and check for any errors reported under the “History” subsection.
You can check for entries back in your AWS Console. Navigate to your DynamoDB table and choose “Explore table items”. Ensure that items are being returned as expected.
Head back to your Particle Console to calibrate your sensor. Open two tabs:
- Device detail
- Ledgers page
Scroll down to the function section of the device detail page and call the sampleSensor function. No argument is required.
Switch back to the device-calibration Ledger detail page and click “Get Instance”.
This will give you the reading for a calibration point. Make note of the value. Repeat this process with both a dry sensor and a sensor completely submerged in water to get your calibration_min and calibration_max readings.
Once you’ve taken your two calibration points, head back to the device-config Ledger detail and update the device configuration based on the readings you’ve just taken. Note that the calibration_max value should come from the sensor fully submerged and the calibration_min value should come from a dry sensor.
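For example, if the dry sample came back around 1650 and the submerged sample around 3100 (purely illustrative numbers — use your own readings), the updated device-config instance would look like:

{
  "publish_interval": 300,
  "num_readings_to_avg": 10,
  "calibration_min": 1650,
  "calibration_max": 3100
}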
That’s it! Now your device should be reporting soil moisture as a percentage into a DynamoDB database. You could wrap a front end web app around the database to have a fully featured IoT application with very little investment.