A couple of years ago I had this crazy idea: what if you could record what you wanted your program to do? If you want a light to illuminate, touch the light (not the switch). If you want a servo to move, twist the horn to where you want it to go. If you want an LCD to say something, click on it and write the message.
I thought that was all pretty cool but what would programming this way actually look like? Thus began the experimental project that has become Touch Logic Control or TLC.
Give Me an Example?
Let's kick off with the 'Rock Paper Scissors' example above to illustrate.
The first thing to understand is that TLC is a parallel program: much like an FPGA or a PLC, all rows are executed in parallel and together they produce the program's behavior.
The RED blocks are Conditions and the GREEN blocks are Commands, which are executed sequentially when their condition is true. More than one command sequence can run at a time, so sequences execute asynchronously with respect to each other.
The first line says that if the SHAKE GESTURE is true, then print a bitmap that looks like a question mark to the micro:bit display, wait 1 second, and randomly assign a value between 1 and 3 to the variable var. (You can name your own variable; var is just the default name.)
So imagine now you shake the micro:bit, creating a SHAKE GESTURE event which executes the sequence and assigns a random value, say 1, to the variable var. What happens?
Well, because the value of var changed, the TLC interpreter then tests all conditions that depend on var to see whether they are true. Of the three conditions only var == 1 is true, so only that sequence is executed, drawing the Rock bitmap to the micro:bit!
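To make the flow concrete, here is a minimal sketch of the Rock Paper Scissors program as condition/command-sequence rows. The names, data structures, and helper functions are my own assumptions for illustration, not the actual TLC format:

```python
import random

# Shared program state: events and variables the conditions depend on.
state = {"shake": False, "var": 0, "display": None}

def show(bitmap):
    # Stand-in for drawing a bitmap on the micro:bit display.
    def command():
        state["display"] = bitmap
    return command

def set_random_var():
    # The 'var = random 1..3' GREEN block.
    state["var"] = random.randint(1, 3)

# Each row pairs a condition (RED block) with a command sequence (GREEN blocks).
rows = [
    (lambda: state["shake"],    [show("QUESTION"), set_random_var]),
    (lambda: state["var"] == 1, [show("ROCK")]),
    (lambda: state["var"] == 2, [show("PAPER")]),
    (lambda: state["var"] == 3, [show("SCISSORS")]),
]

def evaluate():
    # Re-test every condition; run the command sequence of each true row.
    for condition, commands in rows:
        if condition():
            for command in commands:
                command()

# Simulate a SHAKE GESTURE event, then let the var change propagate.
state["shake"] = True
evaluate()
state["shake"] = False
evaluate()
print(state["display"])  # one of ROCK, PAPER, SCISSORS
```

A real interpreter evaluates rows in parallel and re-tests only the conditions that depend on a changed value; this sketch just loops over all rows to show the dependency-driven behavior.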
Here is a worked tutorial starring my 12-year-old son, who is going to be building a 'Kids Teach Kids Tech' series in which we cover the essentials of TLC and logic in general.
If you want to support our work, we have created a Kickstarter to help us kick it along.
Under the Hood
TLC is an ultra-high-level language, more like an authoring system than a programming language in the strict sense.
You could describe TLC as a sort of mashup of the following:
- Visual Story Board mind maps
- Excel macro recording of programs
- Frame Based Computer Animation
- Programmable Logic Control (PLC)
- FPGA asynchronous logic
- Inference based logic
- Fuzzy Logic
TLC is envisioned as a decoupled, asynchronous, scalable system which can be deployed anywhere from a cloud service to a microcontroller.
At the core of TLC is a straightforward logic flow.
Events are generated which trigger dependent conditions, spawning asynchronous command sequences which can in turn generate new Events.
TLC Program
The resulting TLC program is a collection of Condition : Command Sequence pairs which are processed by a TLC Interpreter:
{ Condition ==> { Command } }
TLC Interpreter
The TLC Interpreter consumes Events, computes the dependent conditions, and manages the asynchronous execution of the command sequences.
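The interpreter core could be sketched as an event loop like the one below. This is a hypothetical structure of my own, assumed for illustration: events arrive on a queue, update state, and each row whose condition becomes true gets its command sequence spawned concurrently (commands within a sequence still run in order):

```python
import queue
import threading

class Interpreter:
    def __init__(self, rows):
        self.rows = rows            # list of (condition, command_sequence)
        self.state = {}
        self.events = queue.Queue()

    def post(self, name, value):
        # Events may originate anywhere: a tethered micro:bit, a cloud
        # service. They are serialized into this queue.
        self.events.put((name, value))

    def run_once(self):
        # Consume one event, update state, and spawn every true row's
        # command sequence on its own thread (sequences run concurrently,
        # commands within a sequence run in order).
        name, value = self.events.get()
        self.state[name] = value
        threads = []
        for condition, commands in self.rows:
            if condition(self.state):
                t = threading.Thread(
                    target=lambda cmds=commands: [c(self.state) for c in cmds])
                t.start()
                threads.append(t)
        for t in threads:
            t.join()

log = []
interp = Interpreter([
    (lambda s: s.get("button") == "A", [lambda s: log.append("A pressed")]),
    (lambda s: s.get("button") == "B", [lambda s: log.append("B pressed")]),
])
interp.post("button", "A")
interp.run_once()
print(log)  # ['A pressed']
```

The queue is what makes the decoupling described below natural: anything that can put an event on the queue can drive the program.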
TLC is intentionally designed to be decoupled, so events and commands can be executed in different locations from the interpreter. This decoupling supports several important use cases, including tethered microcontrollers as event generators/actuators and cloud-hosted interpreters.
The reference implementation of the TLC interpreter is the interpreter bound to the Virtual Breadboard UWP (Universal Windows Platform) App.
TLC for Micro:Bit
I have been very attracted to the idea of working with the micro:bit: its widespread availability and rich peripherals make it a great platform for learning.
TLC Tether Firmware
The first step to supporting the micro:bit is to view it as a tethered event generator which serializes events such as button presses, gestures, compass headings, and temperature readings and forwards them to the Virtual Breadboard-hosted TLC interpreter. The tether firmware can also receive and decode commands such as bitmaps to display and Bluetooth messages to forward.
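As a rough illustration of what serializing events over the tether could look like, here is a hypothetical newline-delimited "NAME:VALUE" wire format. This is my assumption for the sketch; the real tether protocol may be quite different:

```python
def encode_event(name, value):
    # micro:bit side: frame an event as one ASCII line over the serial link.
    return f"{name}:{value}\n".encode("ascii")

def decode_event(raw):
    # Host side: split a received line back into (name, value).
    name, _, value = raw.decode("ascii").strip().partition(":")
    return name, value

frame = encode_event("GESTURE", "SHAKE")
print(frame)                # b'GESTURE:SHAKE\n'
print(decode_event(frame))  # ('GESTURE', 'SHAKE')
```

Commands in the other direction (bitmaps, Bluetooth messages) could use the same framing with the roles reversed.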
TLC Interpreter Firmware
A more advanced firmware can embed the TLC interpreter in the micro:bit itself and execute the TLC program standalone.
I had been unsure which was the best way to implement the TLC interpreter for the micro:bit, but the availability of Ada for the micro:bit seems a really great fit.
Ada has features very well suited to implementing the TLC interpreter. In particular, its support for task-based parallel processing seems a perfect match for the TLC paradigm.
The atomic nature of the command and condition blocks also seems well suited to Ada's pre/postcondition contracts for formal verification of the implementation.
Work In Progress
Although this project is a work in progress, there are already results you can work with today, so I have published it immediately and will update it progressively.
Current Status
Version 0.1 of the Tether Firmware is available in the code section of this project. You can build it, upload it to a micro:bit, use it with the current version of the Virtual Breadboard App, and follow the first few TLC examples today!
Next Steps
Tether firmware work items:
- The drivers example includes an accelerometer class, but it only exposes raw data, which still needs to be converted into gestures.
- I couldn't immediately get Ada.Text_IO.Get_Line to receive serial data for printing bitmaps to the display, so I need to figure that out.
More soon...