In early October last year, a company I work for was investigating how some sensors (which were out of 3G/4G/LoRaWAN etc. coverage) could be connected to Microsoft Azure IoT Services (Azure IoT Hub and Azure IoT Central).
This is a long post so there is an "Executive Summary" at the end.
I had already written Cloud Identity Translation Gateways for LoRa and LoRaWAN networks, so building one for Swarm didn't appear that bad (in the end it took several weeks spread over a couple of months).
The first step was to purchase a Swarm M138 modem which provides 2-way data via Swarm’s LEO Satellite network. The Swarm Hive has a REST API for sending messages, getting lists of devices etc. Uplink messages are accessible via a REST API or delivered to an application with an HTTP webhook endpoint.
The Swarm M138 modem is a Mini-PCI Express (mPCIe) format package designed to be integrated into other vendors' products.
A Swarm M138 modem costs USD 89 (March 2023), which includes 50 free messages. Once the 50 messages have been used, a data plan is required, which costs USD 60 (March 2023) per year (USD 5 per month). Each plan includes 750 data packets (roughly one per hour) per month. You can "stack" up to four data plans if your application sends more messages. Each message can be up to 192 bytes.
I initially purchased a SparkFun Satellite Transceiver Kit - Swarm M138 (Swarm sells the modems directly, but the minimum purchase is 25 units) to explore how to connect a .NET nanoFramework device to a Swarm M138 modem using their Swarm Tile library.
I have "violated the warranty" by cutting a couple of tracks on the breakout board so I can connect to the device with an FTDI module (make sure it is a 3V3 one, otherwise you will let the smoke out) and power it with a USB-C power bank.
The Sparkfun modem and antenna are not weather-proof so I also purchased a Magnetic Mount antenna which I could put on the garage roof.
I then purchased a Swarm Evaluation Kit, which cost USD 449 (March 2023). The Evaluation Kit has an IP67 rated enclosure for the electronics and an integrated solar panel for charging the battery.
I'm going to "violate the warranty" by drilling a couple of holes (with cable glands) for additional sensors and replacing the Adafruit Feather S2 with one running the .NET nanoFramework.
I also purchased a Swarm Asset tracker to collect some more "interesting" (than my backyard) location information.
The Swarm constellation is not yet complete, so there may be periods where there is no coverage. The Swarm Satellite Pass tracker gives a pretty good indication of when you should be able to send/receive messages.
A good ground plane, a clear view of the sky and positioning your device to minimise the background noise really helps with coverage.
Swarm Azure IoT Connector Architecture
My "Swarm Azure IoT Connector" is a Cloud Identity Translation Gateway, which maps Swarm devices to Azure IoT Hub devices.
The Swarm devices can connect with an Azure Device Connection string or the Azure IoT Hub Device Provisioning Service(DPS). The Swarm Azure IoT Connector has an Azure Functions StartUp Service which calls the Swarm Space Hive Devices API to get a list of devices to provision.
I use LazyCache by Alastair Crabtree to cache the DeviceClient connections. It is thread safe and the implementation guarantees to only execute your cacheable delegates once. This prevents issues such as an uplink message being received from a device before it has been provisioned by the startup service.
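A minimal sketch of this caching pattern, assuming a hypothetical `CreateDeviceClientAsync` helper; LazyCache's `GetOrAddAsync` runs the factory delegate at most once per key, even under concurrent calls:

```csharp
using System.Threading.Tasks;
using LazyCache;
using Microsoft.Azure.Devices.Client;

// Shared, thread-safe cache of DeviceClient instances keyed by device ID.
private static readonly IAppCache DeviceClients = new CachingService();

private Task<DeviceClient> GetDeviceClientAsync(string deviceId)
{
    // The factory delegate executes at most once per key, so two uplinks for
    // the same device arriving concurrently share a single connection.
    return DeviceClients.GetOrAddAsync(deviceId, () => CreateDeviceClientAsync(deviceId));
}

private async Task<DeviceClient> CreateDeviceClientAsync(string deviceId)
{
    // Hypothetical: provision via DPS or look up a connection string, then open.
    DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(GetConnectionString(deviceId));
    await deviceClient.OpenAsync();
    return deviceClient;
}
```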
The Azure IoT Connector also has basic Azure Digital Twin Definition Language(DTDL) support so that devices can be "automagically" provisioned in Azure IoT Central.
Message Ordering
The messages from devices will often not arrive in the order they were sent, because messages are queued by the modem, satellite, base station, and Swarm Hive infrastructure.
I use Azure Storage explorer to view the queued messages, create test messages and return messages from the poison queue to the uplink queue for processing.
In Azure IoT Central the out-of-order arrival of messages is visible because they are sorted by "iothub-creation-time-utc" rather than the time the message was enqueued.
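A sketch of how the original send time can be surfaced, assuming the uplink DTO carries a hypothetical `HiveRxTimeUtc` field; setting `Message.CreationTimeUtc` is what populates "iothub-creation-time-utc":

```csharp
using System.Text;
using Microsoft.Azure.Devices.Client;

Message message = new Message(Encoding.UTF8.GetBytes(telemetryJson))
{
    ContentType = "application/json",
    ContentEncoding = "utf-8",
    // Hypothetical field: the time Swarm Hive received the uplink, so Azure
    // IoT Central sorts on when the message was sent, not when it arrived.
    CreationTimeUtc = payload.HiveRxTimeUtc
};

await deviceClient.SendEventAsync(message);
```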
The Azure IoT connector has a lightweight WebAPI endpoint which receives uplink messages and puts them into an Azure Storage Queue.
The WebAPI uses two Data Transfer Objects(DTOs). The uplink DTO is fairly "lax" about the payload format as I have assumed that the Swarm Hive webhook payload is always correct.
The second DTO has additional properties and is used to deserialise messages for the Queue Trigger Azure Function which processes the uplink queue. If a message is "invalid" or cannot be processed by the Azure Function, it will be moved to the poison queue after a number of retries (this makes the handling of errors and transient conditions less complex).
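The split between a minimal webhook and a queue-triggered processor could be sketched like this (function and queue names are illustrative, not the connector's actual identifiers):

```csharp
using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

// Illustrative: the webhook endpoint does the bare minimum - accept the Swarm
// Hive payload and enqueue it - so the queue trigger (with its stricter DTO)
// does the real work and gets retries and a poison queue for free.
[FunctionName("SwarmUplinkWebhook")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest request,
    [Queue("uplink")] out string uplinkMessage)
{
    using StreamReader reader = new StreamReader(request.Body);
    uplinkMessage = reader.ReadToEnd(); // validated later, by the queue trigger

    return new OkResult();
}
```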
I use CS-Script by Oleg Shilo to compile the C# code which transforms the uplink and downlink messages into a suitable format. The uplink and downlink message formatters have a standardised interface.
The payload formatters are stored as files in an Azure Blob Storage container.
The process of loading, then compiling, the C# payload formatters can take several seconds, so LazyCache is used to store the compiled binaries. The payload formatters can "use" libraries like Newtonsoft JSON.Net, System.Collections.Generic etc., so I have assumed that the developer of a payload formatter knows what they are doing.
The Swarm Azure IoT Connector solution includes a console application for debugging and testing payload formatters.
The UserApplicationID in the uplink and downlink messages is used to identify the payload formatter to use. If there is no matching payload formatter the default uplink or downlink formatter is used.
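The standardised formatter interface might look something like the following (the names and signatures here are illustrative, not the connector's actual contract):

```csharp
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

// Illustrative interfaces only: an uplink formatter turns the raw Swarm
// payload bytes into telemetry JSON, a downlink formatter does the reverse.
public interface IFormatterUplink
{
    JObject Evaluate(IDictionary<string, string> properties, string deviceId,
                     ushort userApplicationId, byte[] payloadBytes);
}

public interface IFormatterDownlink
{
    byte[] Evaluate(IDictionary<string, string> properties, string deviceId,
                    ushort userApplicationId, JObject payloadJson);
}
```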
Azure IoT Central Command Processing
The connector supports three different types of commands: parameterless (just an "@" character payload), single parameter, and JSON. Commands have to be configured as "queue if offline", be a "Request", and have no "Response".
The simplest type of command is one without a parameter. In this example I use two commands, "lights go on" and "lights go off".
The downlink message "method-name" property is mapped to a UserApplicationId, an optional payload, and an optional DTDL V2 identifier.
The payload formatter converts the two different JSON payloads into a single byte.
When a command is sent from Azure IoT Central the payload is "transformed" to an array of bytes.
Then the Swarm Hive messages API is used to send the downlink message.
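A downlink formatter for this pair of commands could be as simple as emitting a single byte (a sketch only; the command names, byte values, and `Evaluate` signature are assumptions):

```csharp
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

// Sketch: maps the two parameterless commands to a one-byte downlink payload.
// 0x01 = lights on, 0x00 = lights off (values chosen for illustration).
public byte[] Evaluate(IDictionary<string, string> properties, string deviceId,
                       ushort userApplicationId, JObject payloadJson)
{
    string methodName = properties["method-name"];

    return methodName switch
    {
        "LightsGoOn"  => new byte[] { 0x01 },
        "LightsGoOff" => new byte[] { 0x00 },
        _ => null // unknown command: nothing to send
    };
}
```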
The second type of command has a single parameter. This example illustrates controlling the status of a fan.
The sample single parameter command has an enumeration for the fan status.
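In DTDL V2 a single parameter command with an enumeration could be modelled roughly like this (the command and enum names are illustrative):

```json
{
  "@type": "Command",
  "name": "FanStatus",
  "request": {
    "name": "status",
    "schema": {
      "@type": "Enum",
      "valueSchema": "integer",
      "enumValues": [
        { "name": "Off", "enumValue": 0 },
        { "name": "Low", "enumValue": 1 },
        { "name": "High", "enumValue": 2 }
      ]
    }
  }
}
```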
The third type of command has a JSON parameter. This example illustrates the setting of a minimum and maximum temperature range.
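The JSON payload for such a command might look like this (the field names are illustrative), with the downlink payload formatter packing the two values into a compact binary representation for the 192-byte Swarm message:

```json
{
  "TemperatureMinimum": 5.0,
  "TemperatureMaximum": 30.0
}
```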
The use of the Azure IoT Hub Devices Provisioning Service (DPS) is pretty much required. It is possible to configure individual devices with a device connection string, but this should be avoided.
The Azure IoT Hub implementation also supports the transformation of uplink and downlink message with payload formatters. The JSON can be "enriched" and message properties added to make the routing and processing of messages easier.
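Enrichment could be as simple as adding application properties to the uplink message so Azure IoT Hub message routing can filter on them without parsing the body (a sketch; the property names and values are assumptions):

```csharp
// Sketch: "enrich" the uplink message with routable application properties.
message.Properties.Add("UserApplicationId", userApplicationId.ToString());
message.Properties.Add("DeviceType", "SwarmM138"); // illustrative value

await deviceClient.SendEventAsync(message);
```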
The Swarm Space Azure IoT Connector Azure IoT Hub implementation supports Device Connection strings.
The Swarm Space Azure IoT Connector Azure IoT Hub implementation also supports the Azure IoT Hub Device Provisioning Service (DPS).
This long post is an overview of my Swarm Space Azure IoT Services connector. I have a series of blog posts covering my "learning journey" and multiple iterations of the architecture etc.
Executive Summary
The Swarm Space Azure IoT Connector supports:
- Uplink and downlink messaging
- Transformation of uplink and downlink messages with compiled C# code
- Azure IoT Hub connectivity with device connection strings
- Azure IoT Hub(s) connectivity with the Device Provisioning Service (DPS).
- Azure IoT Central parameterless, single parameter, and JSON commands
In addition, the solution includes a command line application for creating and testing payload formatters.
The application has been "soak" tested for a fortnight (now a couple of months) and stress tested with tens of devices (I wrote a test-rig which is configured with JSON files).
The Swarm Space Azure IoT Connector should be treated as late beta.