Grafana ultimately was not the best tool for this exercise because it cannot store images, which would have let me better visualize the orientation of each sensor in the table. You can provide HTML links to a picture for each row in Grafana, but that was outside the scope of this experiment.
This time around I wanted to focus on the accelerometer. It reports data whenever the sensor moves, but it also includes its static readings whenever the other information is captured. This let me try a sensor orientation experiment focused on telling the user how the sensor was installed in the field.
For the purposes of this article, I will only cover the 3 major axis options with binary values for each. In other words, I am assuming that one of the part's 3 axes is perpendicular to the ground while the other 2 are parallel to it. Let’s get started!
Platform
I’m utilizing our Everactive Environmental+ Evaluation Kit to show the types of features you can leverage with our always-on, batteryless monitoring capabilities. Simply purchase a kit, follow the instructions to get the sensors connected, and start streaming data from our sensors under minimal harvesting conditions.
Setup
I used one Everactive Eval Kit and only a single sensor, since the setup can easily be duplicated later on. I leveraged the Grafana dashboard environment to develop a tailored dashboard with the panels I required.
Activity
I created two dashboard variables within Grafana. The first gives me access to customer-specific data sets; in my case, it holds all of the available sensors targeting “Ashleys Eval Kit”. The second lets me select which sensor I want for a deeper-dive view. These are the same variables and queries used in the first article.
Here is a snapshot of the variable queries and settings.
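In text form, the two variable queries were likely along these lines. This is a hypothetical reconstruction based on the tables used later in this article (api_v2.customers, api_v2.packet_sensor_readings_last, and the gateway/association joins); your exact variable queries may differ.

-- $customer variable: list the customer names available to the account (assumed query)
SELECT name FROM api_v2.customers ORDER BY name;

-- $mac_address variable: list the sensors belonging to the selected customer (assumed query)
SELECT DISTINCT readings.sensor_mac_address
FROM api_v2.packet_sensor_readings_last readings
JOIN api_v2.gateways gateways ON readings.gateway_serial = gateways.serial_number
JOIN api_v2.associations_current associations ON gateways.gateway_id = associations.entity_id
JOIN api_v2.customers customers ON associations.customer_id = customers.customer_id
WHERE customers.name IN ($customer);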
With the variables set up, I then moved on to setting up the panels.
Orientation Panel
The main view I wanted was the current orientation of the targeted sensor. I wasn’t sure of the best way to do that, so I first set out to capture what the values would be for each orientation.
To obtain this data, I added a panel with the timeseries data for a single sensor’s x, y, and z results.
SELECT
  $__time(ts),
  -- extract each acceleration axis (in mg) from the JSON reading
  (reading->'movementMeasurement'->'initialAcceleration'->'x')::jsonb::text::numeric AS "x",
  (reading->'movementMeasurement'->'initialAcceleration'->'y')::jsonb::text::numeric AS "y",
  (reading->'movementMeasurement'->'initialAcceleration'->'z')::jsonb::text::numeric AS "z"
FROM api_v2.packet_sensor_readings
WHERE sensor_mac_address = '$mac_address'::macaddr8 AND $__timeFilter(ts)
ORDER BY 1 DESC
Figure 4. Single sensor acceleration results query
Once I had the chart, I needed to map the data sets. I went through each of the desired orientations and, as the data appeared in the chart, added an annotation noting which orientation the sensor was in. Now my data was saved and I could refer back to it while building the rest of this experiment. For simplicity, I’ve made a table here with the numbers rounded to clearly show what I’ll be utilizing.
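Roughly, the values map out like this (the axis aligned with gravity reads about +/-1000 mg while the other two sit near 0, matching the thresholds in the final query below):

Orientation                      x (mg)   y (mg)   z (mg)
On Edge, USB port down            -1000        0        0
On Edge, USB port up              +1000        0        0
On Edge, Antenna port down            0    -1000        0
On Edge, Antenna port up              0    +1000        0
Flat, PV cell up                      0        0    -1000
Flat, PV cell down                    0        0    +1000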
Now, I had to map this data to useful information for an end user.
I used a simple case statement in my query and started with a text table to display my desired results.
-- grab the most recent reading per environmental sensor
WITH sensor_last_readings AS (
  SELECT DISTINCT ON (sensor_last.sensor_mac_address)
    sensor_last.sensor_mac_address,
    sensor_last.ts,
    (reading->'movementMeasurement'->'initialAcceleration'->'x')::jsonb::text::numeric AS x,
    (reading->'movementMeasurement'->'initialAcceleration'->'y')::jsonb::text::numeric AS y,
    (reading->'movementMeasurement'->'initialAcceleration'->'z')::jsonb::text::numeric AS z,
    sensor_last.gateway_serial,
    sensor_last.reading
  FROM api_v2.packet_sensor_readings_last sensor_last
  WHERE sensor_last.gateway_serial != ''
    AND (sensor_last.reading->'schema')::jsonb::text LIKE '%environmental%'
    AND $__timeFilter(ts)
  ORDER BY sensor_mac_address, ts DESC, gateway_serial, reading DESC
)
SELECT
  sensor_last_readings.sensor_mac_address,
  -- map the dominant gravity axis to a human-readable orientation
  CASE
    WHEN (sensor_last_readings.x < -500) THEN 'On Edge, USB port down'
    WHEN (sensor_last_readings.x > 500) THEN 'On Edge, USB port up'
    WHEN (sensor_last_readings.y < -500) THEN 'On Edge, Antenna port down'
    WHEN (sensor_last_readings.y > 500) THEN 'On Edge, Antenna port up'
    WHEN (sensor_last_readings.z < -500) THEN 'Flat, PV cell up'
    ELSE 'Flat, PV cell down'
  END AS orientation,
  sensor_last_readings.ts
FROM sensor_last_readings
  LEFT JOIN api_v2.sensors sensors ON sensor_last_readings.sensor_mac_address::macaddr8 = sensors.mac_address::macaddr8
  JOIN api_v2.gateways gateways ON sensor_last_readings.gateway_serial = gateways.serial_number
  JOIN api_v2.associations_current associations ON gateways.gateway_id = associations.entity_id
  JOIN api_v2.customers customers ON associations.customer_id = customers.customer_id
WHERE customers.name IN ($customer)
Figure 7. Grafana query for obtaining the final table.
You’ll notice that I dropped the comparison down to +/-500 mg. I found that my original target of +/-900 mg was too aggressive and failed when the device was slightly off axis. The lower threshold gave me more wiggle room. Again, to get a true understanding of orientation, one would want to expand beyond these 6 options, and a rendering showing a more variable axis result would be far more informative in the field.
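As a sketch of what such an expansion could look like, the tilt of each axis can be computed directly in SQL from the same readings. This is a hypothetical extension rather than part of my dashboard; it assumes readings in mg and reuses the sensor_last_readings CTE from the query above.

-- angle (in degrees) between each axis and the gravity vector;
-- least/greatest clamp the ratio so sensor noise cannot push acos out of range
SELECT
  sensor_mac_address,
  degrees(acos(least(greatest(x / nullif(sqrt(x*x + y*y + z*z), 0), -1), 1))) AS x_tilt_deg,
  degrees(acos(least(greatest(y / nullif(sqrt(x*x + y*y + z*z), 0), -1), 1))) AS y_tilt_deg,
  degrees(acos(least(greatest(z / nullif(sqrt(x*x + y*y + z*z), 0), -1), 1))) AS z_tilt_deg
FROM sensor_last_readings

This also shows why the 900 mg threshold was fragile: the dominant axis reading falls off with the cosine of the tilt, so a 25-degree tilt already drops it to about 1000 x cos(25°) ≈ 906 mg, while 500 mg is not crossed until a 60-degree tilt.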
Findings
I was pleased with the final table. It clearly explains the orientation of each sensor and how it was deployed in a simple view. This could be expanded fairly easily to cover more off-axis orientation combinations, but at that point I’d like to figure out how to map the results to images of the device in the proper orientation; I believe the text would get too confusing beyond these simple values.