As the world becomes increasingly AI-centric, maintaining the volume and quality of the data collected to train those models is vital. Imagine a distributed set of simple, interconnected microcontrollers that need to capture data at the same time and then send it back to a central location for processing. This normally involves complicated signaling/timing mechanisms, as well as extra bandwidth to accommodate inefficient data transmission or onboard compression.
What are Particle Logic and Ledger?
Logic is Particle's new offering that allows simple JavaScript programs to be written directly in the Console and then triggered by external events or by a cron definition, akin to a serverless app. Logic functions support a limited subset of built-in JavaScript method calls, Cloud event publishing, and a variety of encoding and decoding functions.
Logic works hand-in-hand with Particle Ledger, a Cloud-based object store that will eventually be able to seamlessly synchronize data between the Cloud and devices. For now, values can be read, set, and merged within Logic scripts, letting devices send data for later storage without the need for another service and webhook.
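As a rough sketch (not taken from this project's code), a Logic function is a small JavaScript module with a single exported entry point. The ledger name 'example-ledger', the destructured trigger payload, and the shape of the value returned by get() below are assumptions for illustration only:

```javascript
import Particle from 'particle:core';

// Minimal sketch of a Logic function that reads and merges Ledger values.
// The ledger name ('example-ledger'), the trigger payload fields, and the
// shape of the object returned by get() are assumptions for illustration.
export default function main({ event }) {
  const ledger = Particle.ledger('example-ledger');

  // Read back whatever was stored previously (assumed to live under a
  // 'data' property on the returned object).
  const stored = (ledger.get() || {}).data || {};
  const count = stored.count || 0;

  // Merge a new value into the store without overwriting other keys.
  ledger.set({ count: count + 1 }, Particle.MERGE);
}
```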
In this project, there are two Particle Photon 2 microcontroller boards: one has an ADT7410 high-accuracy temperature sensor, while the other is connected to an LSM9DS1 9-DoF IMU. Each board reads its sensor and converts the readings into a JSON object only after receiving the SEND-DATA event. The event subscription triggers each device to publish its own event containing the JSON representation of its data: either temperature or IMU readings.
When it comes to repeatedly executing an action at set intervals, cron is the go-to utility for most, as its highly expressive syntax can account for almost any scenario. Logic supports scheduled functions, where a function is deployed with an accompanying cron-style string that denotes how often it runs. Functions can run at most every minute, and this project relies on a single function running once every five minutes to emit the SEND-DATA event to the listening Products.
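A scheduled function for this purpose might look roughly like the sketch below; the cron string, the placeholder product ID, and the exact Particle.publish options are assumptions rather than the project's actual code:

```javascript
import Particle from 'particle:core';

// Sketch of the scheduled Logic function, deployed with a cron string
// such as "*/5 * * * *" so it runs every five minutes. The publish
// options (targeting a product by a placeholder ID of 12345) are an
// assumption and should be checked against the Logic docs for your setup.
export default function main() {
  // Emit the SEND-DATA event that both Photon 2 devices subscribe to.
  Particle.publish('SEND-DATA', '', { productId: 12345 });
}
```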
After each Photon 2 sends back its respective payload in the form of an event, the data must be transformed and stored somewhere. Rather than using a webhook and an external server, the latest data is set in the Ledger store for future retrieval. To start, the event data string is parsed into a JSON object, which helps ensure its validity and structure. Next, the current owner-scoped Ledger instance is retrieved via Particle.ledger('sensor-data') and then updated using sensorLedger.set(data, Particle.MERGE). Using the MERGE enumeration allows only certain values in the store to be updated rather than overwriting the entire object.
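Put together, the event-triggered function could look something like this sketch; the event.eventData field name, the try/catch handling, and the main wrapper are assumptions, while the Particle.ledger and set calls mirror the ones described above:

```javascript
import Particle from 'particle:core';

// Sketch of the event-triggered Logic function that stores the latest
// sensor payload. The 'eventData' field name and the try/catch handling
// are assumptions; the ledger calls mirror the ones described in the text.
export default function main({ event }) {
  let data;
  try {
    // Parse the device's published payload, which is expected to be JSON.
    data = JSON.parse(event.eventData);
  } catch (err) {
    // Skip malformed payloads rather than writing bad data to the Ledger.
    return;
  }

  // Retrieve the owner-scoped Ledger and merge in only the new values.
  const sensorLedger = Particle.ledger('sensor-data');
  sensorLedger.set(data, Particle.MERGE);
}
```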
If you wish to get started with your own Logic Functions, you can easily use an existing Particle device in conjunction with Logic/Ledger to read, process, store, and even request new data via Cloud Events. More information can be found in Particle's Logic docs.