When I first read about the NXP "Revolutionize Your IoT Prototyping" competition, I had just started playing a bit with my Wemos D1 mini, an ESP8266-based development board, as a software audio synthesizer.
Therefore, I quickly started thinking about what kind of audio and music I could make with the NXP Rapid IoT kit instead.
The K64, the main microcontroller core of the kit, is a respectably powerful machine for audio signal processing (it is even used by the Teensy, which also offers a powerful audio library), but the kit itself is not well suited for audio. As a matter of fact, the kit only includes a poor buzzer and no easy way to attach an audio input...
Instead, the strengths of the kit are a small form factor, a rechargeable battery, and a plethora of sensors: those make it a powerful candidate for a wireless music controller.
In particular, the Bluetooth Low Energy (BLE) connectivity provided by the KW41 could turn it into a now-standardized BLE-MIDI controller (courtesy of Apple Inc., which first introduced this feature in iOS 9).
Inspirational for this endeavor were the old videos of Imogen Heap's "MiMu Gloves" that I had seen a few years ago and that had made me dream: I now wanted to implement my poor man's version of the gloves.
EDIT: GripBeats is a newer example of what is possible along those lines.
I then described my idea on the NXP internal company site (I am an NXP employee) and, quite surprisingly, received positive feedback and a kit: youpeee!
After that, I felt morally obliged to produce "something" with it.
I was initially thinking of programming the kit "offline" with NXP MCUXpresso, starting from the downloadable SDKs and possibly some example (yes, I am THAT kind of NERD that likes to do everything from scratch). However, from the very first steps that looked like a daunting task: there are many layers of software abstraction involved (FreeRTOS, the Atmosphere framework, FSCI Bluetooth, sensor drivers, etc.) that I could not easily figure out. I was hitting hard against the very wall the kit was supposed to break down, and I did not have enough spare time to solve that by myself.
I then resigned myself NOT to reinvent the wheel and to re-use the framework provided by the Rapid IoT Studio online IDE. I must confess that looking first at the "Rapid IoT Kit Out Of Box Demo" confused me more than it helped: once again there were too many things configured for me to understand anything...
Instead, I finally started to understand how the whole programming model worked when I looked at a much simpler example, the "Embedded Mathematical Operation Demo".
Similarly, from the "Embedded Variable Demo" I learnt how to implement a simple timer routine to poll a temperature sensor, and how to implement a GATT service to stream temperature notifications over BLE: this was roughly similar to what I needed for my project.
I ended up favoring the Ambient Light sensor over the other sensors for testing, because I could quickly change its reading by covering it with my finger.
I was then able to go back to the "Out Of the Box Demo" and discover the other key element I needed: the function block, which allows performing custom signal processing on sensor data (see ATMO_GetUnsignedInt) and returning "something else" (see ATMO_CreateValueBinary). I would later use the function block to convert the sensor readings into standard MIDI messages.
At this point I desperately wanted to start testing the BLE connection and the streaming of data, because I thought that would be the most difficult bit to get right. Unfortunately, I found the NXP Rapid IoT Android App, and even the NXP IoT Toolbox App, of little use in debugging a custom data stream.
Instead, I finally found the apps "Bluetooth LE GATT list" and "nRF Connect for Mobile", which allow connecting your mobile to a custom GATT server, exploring GATT services and characteristics, and reading, writing and notifying data at low level. It is here that I finally discovered that the kit appears with the name "PAFB3", whatever that means. For the first time I could read the values of the sensors over the wireless connection: cool!
My original idea was to create a virtual COM port on the PC, redirect it through Hairless MIDI to a virtual MIDI port created with loopMIDI, and then use this as the input of a VST host in which to instantiate a virtual instrument to be controlled.
I thought some kind of virtual-COM-port-to-BLE software would be easily available, because I knew Bluetooth "was originally conceived as a wireless alternative to RS-232 data cables" (at least according to Wikipedia).
Unfortunately, I quickly realized the original Serial Port Profile (SPP) is only available for Bluetooth "Classic". The closest BLE alternative is the nRF UART, a "de facto" standard for which I was unfortunately not able to find a working Windows client (at least not using the Bluetooth chip integrated in my PC).
I therefore needed to implement a custom GATT client, which on the other hand I hoped could give me the potential to implement a truly standard MIDI-BLE controller.
I honestly did not want to go into low-level programming on a PC: my purpose was mainly to learn about Bluetooth stack implementation on embedded microcontrollers and to have fun with music and MIDI, possibly getting familiar with the MIDI-BLE implementation. Therefore, instead of looking into the examples from Microsoft, I looked for high-level Python implementations. I found two that gave me hope: pygatt and bleak. I tried both on Windows and Linux, and failed, so I gave up on this after a couple of days of tinkering.
The only option I could see left was "biting the bullet" and implementing the GATT client on an NXP QN9080DK board that I had lying around from an older project. I would stream the data over the debug virtual COM port, and everything else would proceed as per the original plan.
I then downloaded the QN9080 SDK (using the NXP SDK builder) and switched to IAR (a 30-day free license is available to anybody) as a development environment, which I favor over NXP's MCUXpresso (NB: depending on the options selected during SDK creation, support for Arm Keil might be available as well). I was finally able to configure a black background... big plus, IAR!
I had my first success using the ble_shell demo.
After launching the application, I used TeraTerm to connect to the virtual COM port of the debugger. One only needs to set the UART speed to 115200 baud to get the shell running.
Below is the list of debug shell commands that allow connecting to the kit and reading some data.
gap scanstart
gap scanstop
gap connect 4
gatt discovery -all
gatt read 34
where 4 is the handle associated with the kit during the scanning phase (this may change from one scanning run to another), and 34 is the characteristic handle (this never seems to change in my case).
Perfect! I now just needed to automate this sequence.
It took me a while to figure that out.
At first I took a deep look at the ble_shell demo, but I was not able to find the right point to start modifying the code. In particular, I could not find the right "boundary" between the ble_shell and the shell_gatt services...
After discussing this with a colleague (thanks Antonio C.!), he suggested looking at an example closer to my final target and pointed me towards the temperature_collector demo. In this example, the collector automatically connects to a sensor, activates notifications and starts reading values: exactly what I needed. Except things are never as easy as they seem!
First, I must admit I have cheated... Initially I clearly did not get the structure of a GATT server right, so I ended up modifying the code to let the collector (client) connect based on the advertised name of my kit. Of course this is a much less generic solution, but it just works!
See the code snippet taken from my temperature_collector.c below:
uint8_t riot_mac[]   = {0xB3, 0xAF, 0x0A, 0x37, 0x60, 0x00};
char    riot_dname[] = "PAFB3";
...
/* Search for Temperature Custom Service */
if ((adElement.adType == gAdIncomplete128bitServiceList_c) ||
    (adElement.adType == gAdComplete128bitServiceList_c))
{
    //foundMatch = MatchDataInAdvElementList(&adElement, &uuid_service_temperature, 16);
}
if ((adElement.adType == gAdShortenedLocalName_c) ||
    (adElement.adType == gAdCompleteLocalName_c))
{
    nameLength = MIN(adElement.length, 10);
    FLib_MemCpy(name, adElement.aData, nameLength);
    // mmerlin: match the advertised device name ("PAFB3")
    foundMatch = FLib_MemCmp((uint8_t*) riot_dname, name, 5);
}
And finally, you can pair the kit with the QN9080DK just by clicking Button1 on the development board.
And now comes the hard part...
The GATT database establishes a hierarchy to organize attributes: Profile, Service, Characteristic and Descriptor. Profiles are high-level definitions specifying how services can be combined to enable an application; services are collections of characteristics; descriptors are attributes that describe a characteristic value.
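To make the nesting concrete, here is a minimal sketch of that hierarchy as C structures. These are NOT the NXP SDK types (the SDK has its own, richer definitions); the names and fields below are mine, purely for illustration:

```c
#include <stdint.h>

/* Illustration only -- not SDK types. A service groups characteristics,
 * and each characteristic value may carry descriptors (such as the CCCD). */
typedef struct {
    uint16_t handle;   /* attribute handle, assigned by the server */
    uint16_t uuid16;   /* e.g. 0x2902 for the CCCD */
    uint8_t  value[2];
} gatt_descriptor_t;

typedef struct {
    uint16_t handle;
    uint8_t  uuid128[16];     /* custom characteristics use a 128-bit UUID */
    gatt_descriptor_t cccd;   /* descriptor describing the value */
} gatt_characteristic_t;

typedef struct {
    uint16_t handle;
    uint8_t  uuid128[16];
    gatt_characteristic_t *chars; /* a service is a collection of characteristics */
    uint8_t  char_count;
} gatt_service_t;
```

A GATT client like the one described below walks exactly this tree: it discovers services, then the characteristics inside them, then the descriptors attached to each characteristic.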
In order to create a custom GATT service, one typically has to choose a "Universally Unique IDentifier" (UUID) for the service and for each characteristic that needs to be implemented.
Now, MIDI-BLE has its own Service and Characteristic UUIDs:
- MIDI-BLE Service: 03B80E5A-EDE8-4B33-A751-6CE34EC4C700
- MIDI-BLE Characteristic: 7772E5DB-3868-4112-A1A9-F2669D106BF3
I tried configuring those from the right panel of the NXP Rapid IoT online IDE, which is available when clicking on the GATT box, but curiously enough no service appeared when exploring the device with nRF Connect.
So... I assume there is something in the online IDE that prevents creating a characteristic whose most significant bytes differ from those of the service it belongs to.
I then tried the other UUIDs I had discovered, the nRF UART ones, and those worked fine. Unfortunately this means my application will be a custom one, not directly compatible with the MIDI-BLE standard. Still, I believe that by manually modifying and compiling the Rapid IoT code with MCUXpresso, one might be able to bypass the UUID configuration and maybe go back to a full MIDI-BLE implementation.
For the record:
- NRF UART Service UUID128: 6E400001-B5A3-F393-E0A9-E50E24DCCA9E
- NRF UART Rx characteristic UUID128: 6E400002-B5A3-F393-E0A9-E50E24DCCA9E
- NRF UART Tx characteristic UUID128: 6E400003-B5A3-F393-E0A9-E50E24DCCA9E
Here is the related configuration from the online IDE.
Similarly, I had to declare the UUIDs in the QN9080 project by adding the following lines to the file gatt_uuid128.h:
// NRF UART Service UUID: 6E400001-B5A3-F393-E0A9-E50E24DCCA9E
UUID128(uuid_service_nrfuart, 0x9E, 0xCA, 0xDC, 0x24, 0x0E, 0xE5, 0xA9, 0xE0, 0x93, 0xF3, 0xA3, 0xB5, 0x01, 0x00, 0x40, 0x6E)
// NRF UART Tx characteristic UUID: 6E400003-B5A3-F393-E0A9-E50E24DCCA9E
UUID128(uuid_characteristic_nrfuart_tx, 0x9E, 0xCA, 0xDC, 0x24, 0x0E, 0xE5, 0xA9, 0xE0, 0x93, 0xF3, 0xA3, 0xB5, 0x03, 0x00, 0x40, 0x6E)
// NRF UART Rx characteristic UUID: 6E400002-B5A3-F393-E0A9-E50E24DCCA9E
UUID128(uuid_characteristic_nrfuart_rx, 0x9E, 0xCA, 0xDC, 0x24, 0x0E, 0xE5, 0xA9, 0xE0, 0x93, 0xF3, 0xA3, 0xB5, 0x02, 0x00, 0x40, 0x6E)
NB: interestingly enough, the byte order of the UUID128 macro is the opposite of the conventional ASCII form... It took me some debugging to figure that out too.
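A small helper makes the relationship explicit (the function below is my own, not part of the SDK): the macro wants the 16 bytes of the conventional ASCII form in reverse order, i.e. little-endian.

```c
#include <stdint.h>

/* Reverse the 16 UUID bytes from the conventional (big-endian, as printed
 * in ASCII) order into the little-endian order the UUID128() macro expects.
 * Helper name is mine, for illustration only. */
static void uuid_ascii_order_to_le(const uint8_t be[16], uint8_t le[16])
{
    for (int i = 0; i < 16; i++)
        le[i] = be[15 - i];
}
```

Feeding it the bytes of 6E400001-B5A3-F393-E0A9-E50E24DCCA9E yields exactly the 0x9E, 0xCA, 0xDC, ... sequence used in the UUID128 declarations above.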
Of course that is not the whole story: the temperature collector example actually reads and receives notifications from a characteristic that is NOT specified as a UUID128. There is a set of standard GATT characteristics identified by a shorter UUID16, and temperature is one of those.
This has an impact on the modifications that need to be applied to the function BleApp_StoreServiceHandles in temperature_collector.c, which explores the device services and characteristics.
First, I had an issue with the service uuidType, which was not 16-bit. I had to substitute these lines
if ((pService->uuidType == gBleUuidType16_c) &&
    FLib_MemCmp(pService->uuid.uuid128, uuid_service_temperature, 16))
with this
if ( FLib_MemCmp(pService->uuid.uuid128, uuid_service_nrfuart, 16))
Also, a UUID16 can be compared with the standard '==' operator, because its value fits in a standard integer variable; the same is not true for a UUID128, which is instead stored in an array of 8-bit values.
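The distinction can be sketched in a few lines of plain C, using the standard memcmp in place of the SDK's FLib_MemCmp (the helper names here are mine, for illustration):

```c
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

/* A 16-bit UUID fits in an integer, so '==' works directly. */
static bool uuid16_equal(uint16_t a, uint16_t b)
{
    return a == b;
}

/* A 128-bit UUID is an array of 16 bytes, so a byte-wise comparison
 * is required (memcmp here; FLib_MemCmp in the NXP SDK). */
static bool uuid128_equal(const uint8_t a[16], const uint8_t b[16])
{
    return memcmp(a, b, 16) == 0;
}
```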
Fortunately, the SDK provides the function FLib_MemCmp to compare two data buffers. Therefore, I had to change the line that identifies the interesting characteristic from this:
if ((pService->aCharacteristics[i].value.uuidType == gBleUuidType16_c) &&
    (pService->aCharacteristics[i].value.uuid.uuid16 == gBleSig_Temperature_d))
to this
if (FLib_MemCmp(pService->aCharacteristics[i].value.uuid.uuid128, uuid_characteristic_nrfuart_tx,16))
I am not sure it was actually required, but in order to be able to revert to the original temperature example, I ended up creating a custom data type to describe the nRF UART service, as below:
typedef struct nrfUartConfig_tag
{
    uint16_t               hService;
    gattDbCharPresFormat_t dataFormat;
    uint16_t               hNrfuart;
} nrfUartConfig_t;

typedef struct appCustomInfo_tag
{
    // tmcConfig_t tempClientConfig;
    /* Add persistent information here */
    nrfUartConfig_t nrfuartClientConfig;
} appCustomInfo_t;
Of course this also meant substituting all the occurrences of customInfo.tempClientConfig with customInfo.nrfuartClientConfig.
At this point, I was able to explore the GATT server services and characteristics, but I still needed to transfer real data from the sensors. Looking at how that was implemented in the gatt_shell, I copied the read command into the function MidiLeGatt_Read:
static int8_t MidiLeGatt_Read(uint8_t CharHandle)
{
    uint8_t  *pValue = NULL;
    uint16_t OutActualReadBytes = 0; /* the read completes asynchronously */
    //bleUuid_t pUuid16 = (bleUuid_t)gBleSig_nrfuart_tx_d;

    //if (gPeerDeviceId == gInvalidDeviceId_c)
    if (mPeerInformation.deviceId == gInvalidDeviceId_c)
    {
        debug_shell_write("\n\r--> Please connect the node first...");
        return CMD_RET_FAILURE;
    }

    // mmerlin: allocate buffers only once
    if (mpCharBuffer == NULL)
        mpCharBuffer = MEM_BufferAlloc(sizeof(gattCharacteristic_t));
    if (pValue == NULL)
        pValue = MEM_BufferAlloc(mMaxCharValueLength_d);
    if (!mpCharBuffer || !pValue)
    {
        debug_shell_write("\n\rMidiLeGatt_Read: Error allocating buffer!\n");
        return CMD_RET_FAILURE;
    }

    mpCharBuffer->value.handle  = CharHandle;
    mpCharBuffer->value.paValue = pValue;
    GattClient_ReadCharacteristicValue(mPeerInformation.deviceId, mpCharBuffer,
                                       mMaxCharValueLength_d);
    debug_shell_write("\n\r--> Bytes Read! ");
    debug_shell_writeDec(OutActualReadBytes);
    return CMD_RET_ASYNC;
}
At this point I was able to trigger a single value read by calling the function MidiLeGatt_Read. For instance, in the function BleApp_StateMachineHandler, I added the read under the case mAppRunning_c:
#define hNrfUartTxHandle 34
...
case mAppRunning_c:
{
    debug_shell_write("\n\rState mAppRunning_c, event: ");
    debug_shell_writeDec(event);
    if (event == mAppEvt_GattProcComplete_c)
    {
        debug_shell_write("\n\rState mAppRunning_c && mAppEvt_GattProcComplete_c\n");
        MidiLeGatt_Read(hNrfUartTxHandle);
Actually, in order to see the value printed in the shell, I had to create another function, MidiLeGatt_Print:
static uint8_t MidiLeGatt_Print(deviceId_t serverDeviceId)
{
    debug_shell_write("\n\r--> GATT Event: Characteristic Value Read ");
    debug_shell_write("\n\r    Value: ");
    debug_shell_writeHexLe(mpCharBuffer->value.paValue, mpCharBuffer->value.valueLength);
    return CMD_RET_ASYNC;
}
This function is triggered by BleApp_GattClientCallback when the procedure type is gGattProcReadCharacteristicValue_c, as shown in the code I added below.
static void BleApp_GattClientCallback(...)
{
    if (procedureResult == gGattProcError_c)
    {
        ...
    }
    else if (procedureResult == gGattProcSuccess_c)
    {
        switch (procedureType)
        {
            ...
            case gGattProcReadCharacteristicValue_c:
            {
                MidiLeGatt_Print(serverDeviceId);
                break;
            }
        }
    }
}
You can see from my code snippets that I use a bunch of functions named debug_shell_write and similar, which are not actually available in the SDK. This is because I needed the shell to debug things at this stage, but I knew that later I would need it to stream exclusively MIDI messages.
Therefore I created some customized macros that allow switching from one mode to the other by simply modifying one define. See the code below:
/************************************************************************************
*************************************************************************************
* Shell Configuration
*************************************************************************************
************************************************************************************/
#define DEBUG_SHELL 0
#define debug_shell_write(fmt)          do { if (DEBUG_SHELL) shell_write(fmt); } while (0)
#define debug_shell_writeDec(fmt)       do { if (DEBUG_SHELL) shell_writeDec(fmt); } while (0)
#define debug_shell_writeLe(fmt)        do { if (DEBUG_SHELL) shell_writeLe(fmt); } while (0)
#define debug_shell_writeN(fmt,len)     do { if (DEBUG_SHELL) shell_writeN(fmt,len); } while (0)
#define debug_shell_writeHexLe(fmt,len) do { if (DEBUG_SHELL) shell_writeHexLe(fmt,len); } while (0)

#define MIDI_SHELL 1
#define midi_write(fmt,len)             do { if (MIDI_SHELL) shell_writeN(fmt,len); } while (0)
I could now successfully perform a single read.
I then tried several methods to chain the read commands, but they all failed. This is where I started understanding that I actually needed to use notifications from the server rather than reads triggered by the client: this is exactly what the temperature_collector demo does, but for some reason notifications were still not active...
So here is what I learnt the hard way: in order to activate notifications, the client needs to write a specific standard descriptor (the Client Characteristic Configuration Descriptor) on the server. It is identified by the UUID16 gBleSig_CCCD_d, and the value that needs to be written is also already defined: gCccdNotification_c.
Writing this descriptor is done by the function BleApp_ConfigureNotifications, which I report below. Note I also made the green LED flash (Led2Flashing) to help me verify that notifications were active when not using the shell.
static bleResult_t BleApp_ConfigureNotifications(void)
{
    bleResult_t result = gBleSuccess_c;
    uint16_t    value  = gCccdNotification_c;

    if (mpCharProcBuffer == NULL)
    {
        mpCharProcBuffer = MEM_BufferAlloc(sizeof(gattAttribute_t) + mMaxCharValueLength_d);
    }
    if (mpCharProcBuffer != NULL)
    {
        mpCharProcBuffer->handle      = mPeerInformation.customInfo.nrfuartClientConfig.hNrfuart;
        mpCharProcBuffer->uuid.uuid16 = gBleSig_CCCD_d;
        // turn on notifications
        GattClient_WriteCharacteristicDescriptor(mPeerInformation.deviceId,
                                                 mpCharProcBuffer,
                                                 sizeof(value), (void*)&value);
        debug_shell_write("\n\rNotifications Enabled!\n");
        // change LED color, for shell-less debug
        LED_StopFlashingAllLeds();
        Led2Flashing();
    }
    else
    {
        debug_shell_write("\n\rConfigure Notifications, BLE Out of Memory!\n");
        result = gBleOutOfMemory_c;
    }
    return result;
}
NB: this only enables notifications! The actual output of each notification is taken care of by BleApp_GattNotificationCallback.
static void BleApp_GattNotificationCallback
(
    deviceId_t serverDeviceId,
    uint16_t   characteristicValueHandle,
    uint8_t*   aValue,
    uint16_t   valueLength
)
{
    debug_shell_write("\r\nNotification Callback!\n\r");
    //if (characteristicValueHandle == mPeerInformation.customInfo.tempClientConfig.hTemperature)
    if (characteristicValueHandle == hNrfUartTxHandle)
    {
        // BleApp_PrintTemperature(*(uint16_t*)aValue);
        BleApp_PrintMidile(aValue, valueLength);
#if (cPWR_UsePowerDownMode)
        /* Restart Wait For Data timer */
        TMR_StartLowPowerTimer(mAppTimerId,
                               gTmrLowPowerSecondTimer_c,
                               TmrSeconds(gWaitForDataTime_c),
                               DisconnectTimerCallback, NULL);
#endif
    }
}
The only change I applied in my implementation is that I substituted the original function used to display the temperature, BleApp_PrintTemperature, with a new function, BleApp_PrintMidile, defined as you see below.
static void BleApp_PrintMidile
(
    uint8_t  *msg,
    uint16_t msgLen
)
{
    debug_shell_write("MIDI Message: ");
    debug_shell_writeHexLe(msg, msgLen);
    debug_shell_write("\n\rRAW MIDI Message: ");
    midi_write(msg, msgLen);
}
With all this in place, I finally had the QN9080DK operating as a "pipe", streaming into the virtual COM port anything it received through the BLE connection, which at this stage was only a bunch of raw data from the sensor reads.
Therefore I now needed to go back to the online IDE and add code to transform the raw data into actual MIDI messages.
Time to go back to the MIDI-BLE specification, from which I gathered that the simplest possible message format is the following:
The message includes a timestamp, which the specification describes as follows: "Timestamps are 13-bit values in milliseconds, and therefore the maximum value is 8,191 ms. Timestamps must be issued by the sender in a monotonically increasing fashion."
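To make that concrete, here is a small sketch (the helper is mine, not from any SDK) of how the spec's simplest packet is laid out: a header byte carrying the high 6 bits of the 13-bit millisecond timestamp, a timestamp byte carrying the low 7 bits, then the plain MIDI status and data bytes:

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch of the simplest BLE-MIDI packet (helper name is mine).
 * Header byte:    1 0 h h h h h h  -> high 6 bits of the 13-bit timestamp
 * Timestamp byte: 1 l l l l l l l  -> low 7 bits of the timestamp
 * ...followed by a standard 3-byte MIDI message. */
static size_t blemidi_pack(uint16_t ts_ms, uint8_t status,
                           uint8_t data1, uint8_t data2, uint8_t out[5])
{
    ts_ms &= 0x1FFF;                         /* timestamp wraps at 8,191 ms */
    out[0] = 0x80 | (uint8_t)(ts_ms >> 7);   /* header */
    out[1] = 0x80 | (uint8_t)(ts_ms & 0x7F); /* timestamp */
    out[2] = status;                         /* e.g. 0x90 = note on, ch. 0 */
    out[3] = data1 & 0x7F;
    out[4] = data2 & 0x7F;
    return 5;
}
```

A compliant receiver reconstructs the 13-bit timestamp from those first two bytes; as explained next, it is exactly these extra bytes that my plain-MIDI pipeline could not digest.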
Fast-forward for a second: I am currently not planning to connect to a compliant MIDI-BLE client (i.e. an iPad), but only to my QN9080DK board, which then redirects the values with no further elaboration to Hairless MIDI. It turns out the original MIDI message format only includes the MIDI status and data bytes, so the header and timestamp bytes were misinterpreted.
Long story short, I then just went back to the original MIDI message format, reported below.
This means that the BLE connection will introduce some pseudo-random latency that cannot be compensated for in the client ( :-( ). That would be annoying for a real music performer, but at this point I have to accept that this project can only be a demonstrator of what COULD be done; some more work would be required to transform it into a real controller.
So, back in the online IDE, I started simple by triggering some "note on" messages when pushing the switch buttons.
And here, as an example, I report the simple code to be added to the function FunctionBRbutton2MIdi_trigger.
ATMO_Status_t FunctionBRbutton2MIdi_trigger(ATMO_Value_t *in, ATMO_Value_t *out) {
    char midi_msg[3];
    midi_msg[0] = MIDI_STATUS_NOTE_ON | (MIDI_CHANNEL & 0b1111); // 1001nnnn, n = channel. Note On
    midi_msg[1] = 0x7F & MIDI_F3; // 0kkkkkkk, k = key
    midi_msg[2] = 0x7F;           // 0vvvvvvv, v = velocity
    ATMO_CreateValueBinary(out, (void*)midi_msg, 3 * sizeof(char));
    return ATMO_Status_Success;
}
Note that the function ATMO_CreateValueBinary is used to create the output buffer from which the output data will later be read by the GATT service.
You can also note I have used some constants that need to be declared in a specific section of the file, as you can see below.
Unlike the push buttons, some sensors do not generate interrupts that can trigger actions. In that case, a timer that cyclically forces a read of the value can be used instead. This is what I have done for the Ambient Light sensor.
Below is the example code to transform the ambient light value read from the sensor into a MIDI pitch bend message.
ATMO_Status_t FunctionLight2Midi_trigger(ATMO_Value_t *in, ATMO_Value_t *out) {
    char midi_msg[3];
    unsigned int light = 0;
    ATMO_GetUnsignedInt(in, &light);
    midi_msg[0] = MIDI_STATUS_PITCH_BEND | (MIDI_CHANNEL & 0b1111); // 1110nnnn, n = channel. Pitch Bend
    midi_msg[1] = 0x0;          // 0fffffff, f = fine (LSB)
    midi_msg[2] = light & 0x7F; // 0ccccccc, c = coarse (MSB)
    ATMO_CreateValueBinary(out, (void*)midi_msg, 3 * sizeof(char));
    return ATMO_Status_Success;
}
Note that the function ATMO_GetUnsignedInt is used to read the value from the sensor and move it into the light variable.
The ambient light sensor is slightly easier to handle than most of the other sensors, because the reading already comes in integer format, which fits the MIDI format perfectly.
Other sensors, like the magnetometer, have a floating-point output instead, so a floating-point to integer conversion is required. It is not exactly clear to me how that is done in the online IDE context, but I have assumed the DataTypeConvert boxes were tuned correctly for this task, and used them before the MIDI transformation function that maps the sensor reads into MIDI Control Changes.
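Conceptually, whatever the DataTypeConvert box does, the conversion has to map a floating-point reading in some known range onto the 7-bit (0..127) MIDI controller range. Here is a hypothetical sketch of such a mapping; the function name and the idea of passing the range limits explicitly are my assumptions, not how the IDE actually implements it:

```c
#include <stdint.h>

/* Hypothetical float-to-7-bit conversion: clamp the reading to [min, max]
 * and scale it linearly onto the MIDI controller range 0..127. */
static uint8_t float_to_midi7(float value, float min, float max)
{
    if (value <= min) return 0;
    if (value >= max) return 127;
    return (uint8_t)((value - min) * 127.0f / (max - min));
}
```

For instance, a magnetometer axis reading in an assumed -100..+100 uT range would map its midpoint near the center of the controller range.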
Below is the example of the function FunctionMagX2Midi_trigger.
ATMO_Status_t FunctionMagX2Midi_trigger(ATMO_Value_t *in, ATMO_Value_t *out) {
    char midi_msg[3];
    unsigned int speed = 0;
    ATMO_GetUnsignedInt(in, &speed);
    midi_msg[0] = MIDI_STATUS_CC | (MIDI_CHANNEL_FX & 0b1111); // 1011nnnn, n = channel. Control Change
    midi_msg[1] = 0x7F & CONTROL_MAGX; // controller number
    midi_msg[2] = 0x7F & speed;        // controller value
    ATMO_CreateValueBinary(out, (void*)midi_msg, 3 * sizeof(char));
    return ATMO_Status_Success;
}
I then mapped the touch sensor to "note on" events too, just like the push buttons.
ATMO_Status_t FunctionTouchUp2Midi_trigger(ATMO_Value_t *in, ATMO_Value_t *out) {
    char midi_msg[3];
    midi_msg[0] = MIDI_STATUS_NOTE_ON | (MIDI_CHANNEL & 0b1111); // 1001nnnn, n = channel. Note On
    midi_msg[1] = 0x7F & MIDI_A2; // 0kkkkkkk, k = key
    midi_msg[2] = 0x7F;           // 0vvvvvvv, v = velocity
    ATMO_CreateValueBinary(out, (void*)midi_msg, 3 * sizeof(char));
    return ATMO_Status_Success;
}
Then I used the motion detector to generate a drum kick event on the percussion channel.
ATMO_Status_t FunctionMotion2Midi_trigger(ATMO_Value_t *in, ATMO_Value_t *out) {
    char midi_msg[3];
    unsigned int speed = 0;
    ATMO_GetUnsignedInt(in, &speed);
    //if ((0x7F & speed) > 10) // add threshold for percussive sound
    //{
    midi_msg[0] = MIDI_STATUS_NOTE_ON | (MIDI_CHANNEL_PERCUSSION & 0b1111); // 1001nnnn, n = channel. Note On
    midi_msg[1] = 0x7F & MIDI_C1; // 0kkkkkkk, k = key
    midi_msg[2] = 0x7F & speed;   // 0vvvvvvv, v = velocity
    ATMO_CreateValueBinary(out, (void*)midi_msg, 3 * sizeof(char));
    //}
    return ATMO_Status_Success;
}
Note that motion detection is pretty sensitive, so I tried to use the motion event to trigger an accelerometer read, in order to be able to "silence" movements below a certain acceleration threshold. Unfortunately, I was not able to get this to work.
I added a final touch by putting a couple of sensor readouts into the graphical interface, as shown below.
Here is how my Graphical User Interface looks:
Finally, most VST hosts allow configuring a mapping of the CC messages to specific controls. Below is how I mapped the Rapid IoT kit signals generated above to the TAL Bassline synthesizer (available for free).
And here is an example of how it all sounds, when connected to a musical synthesizer.
At last, here is a similar example based on the Nordic SDK.
I hope you enjoyed this, and that it will allow more people to create some wonderful new music applications!
EDIT:
One might ask oneself: this is all great fun, but is there a "useful" application of this in real life? My answer to this half-criticism is "most certainly yes".
Anybody who has tried playing a "classical" instrument knows there is a steep learning curve in dealing with a particular instrument: that is a classic human-machine interfacing problem, which is (in my belief) one of the bottlenecks of modern computing. For instance, my capacity to communicate effectively with you is limited by the speed at which I can type on my keyboard. Finding new human-machine interface methods might help break this barrier, which is why the topic is so relevant even apart from the "fun" and musical aspects.
Here is an example of industrial automation:
It shows interaction methods that can save countless hours of painstaking coding.
In a similar vein, the "Myo Armband" is a good example of what I have in mind and of what could be feasible with the Rapid IoT prototyping kit. Of course this would require developing some gesture recognition, and likely much more, which probably goes far beyond the scope of a Hackster project…
The relevance of the topic is recognized by big names like E. Musk and the likes, who even bothered to found a company for brain-machine interfacing (Neuralink).
More on that in an interesting internet article.
Going back to MIDI: even though it was born for controlling musical instruments, it has become a de facto standard for light effects as well.
On top of that, there are probably countless examples of custom applications of MIDI control in different contexts. I have just found this video on how to control Apple Motion with a controller.