It's been almost a year since I started tinkering with the NXP Rapid IoT Prototyping Kit, which I was given for participating in the "Revolutionize Your IoT Prototyping" contest, and which I nicknamed "rIoT" for short. In the spirit of rapid prototyping, until now I have developed mostly with the suggested online graphical tools, NXP Rapid IoT Studio (or the Atmosphere IoT Platform), with which I wrote MIDI rIoT! and reproduced the Talking Weather Station.
I love the hardware of the Rapid IoT Prototyping Kit, which includes a number of sensors (buttons, touch controller, temperature, humidity, accelerometer, gyroscope, magnetometer, air quality, battery level), a few actuators (LEDs, LCD, buzzer), and wired (USB, UART, I2C, SPI) and wireless (BLE/Zigbee/Thread and NFC) interfaces, all in a compact form factor and at a fairly reasonable price point of about $50.
However, as a hardware guy I feel a bit far from the transistors, and limited by the functionality exposed by Rapid IoT Studio. For instance: control of the LCD graphics is limited to the defined presets, DSP functions are limited to a few sub-functions on the graphical datapath, and RTC functions are not exposed even though they are actually present in the SDK. It seems close to impossible to add drivers for new devices that are not supported by Atmosphere, and I could not even configure the BLE GATT service and characteristic UUIDs needed to implement a standard BLE-MIDI controller. I soon started to feel too constrained by the limitations of such graphical development tools: I needed deeper hardware control!
The natural tool suggested for development is the NXP MCUXpresso IDE, together with the Rapid IoT SDK.
However, developing with MCUXpresso also almost inevitably requires buying the Hexiwear Docking Station, which, while being a good development tool, is a bit expensive ($39) compared to the cost of the bare kit ($50).
On top of that, when you start trying to develop with the SDK, you basically have to deal with the FreeRTOS abstraction layer, the NXP middleware, and the different implementations of the low-level drivers. In short: development immediately becomes much less "rapid", as there is a learning curve to climb.
Moreover, when you start projecting yourself toward moonshot projects (for some time I have entertained the idea of porting ARM's Keyword Spotting on MCU demo, which runs on the FRDM-K64 board based on the same MCU), you realize that the rIoT is not supported by the SDK Builder. This means no updates to the NXP SDK (including the ARM CMSIS library), and no support for toolchains and IDEs other than MCUXpresso (NXP SDKs usually support IAR, Keil, and GCC Make).
Furthermore, the LCD library provided is Segger's emWin, which is shipped as a statically compiled binary. On top of that, when I tried to include C++ sources from other projects, my compilation attempts failed, probably due to compiler settings that I was not able to figure out.
So I was feeling the need for a simpler development environment, inspired by my recent experiences with Arduino boards like the Wemos D1 mini and the Arduino Micro.
Moreover, I had recently read that Arduino's latest official boards were finally based on ARM MCUs, with ARM's Mbed development platform underneath the Arduino library. Even though some ARM boards had already been available unofficially, this made me think that all Mbed-supported boards would become (more or less) immediately usable with Arduino.
I then started to look for other boards based on the same K64 MCU that powers the rIoT, and discovered the FRDM-K64 and the Hexiwear.
The Hexiwear in particular bears a striking similarity to the rIoT: they use the same docking station, are based on the same MCU, and share some of the same or similar sensors.
So I started to poke around with Mbed... and figured out that the Hexiwear codebase can be used to program the Rapid IoT. I have dubbed this effort "rIoTwear mbed".
Compiling and Debugging with online MBED Studio
Unfortunately, using the online IDE results in binary files that are not accepted by the default bootloader of the rIoT, so I really needed to get my hands on a Hexiwear docking station to go forward.
Compiling and Debugging with offline MBED Studio
When I finally had it, I found that the offline Mbed Studio IDE did not seem to support the OpenOCD debugger. Disappointment. Sadness.
Compiling and Debugging with IAR Embedded Workbench
But then I figured out that Mbed fortunately supports a whole breed of other IDEs/toolchains. I finally had my first success exporting the blinky demo (well, actually an older version of it, based on the blocking wait_ms() rather than the non-blocking thread_sleep_for()), and finding that it worked right away with IAR Embedded Workbench on my Windows work PC.
Here is an unofficial version that works out of the box.
Potentially, this is the right thread implementation for the K64?
Compiling and Debugging with Visual Studio Code and GCC (Ubuntu)
Unfortunately, my copy of IAR is tied to a network license. When I tried it at home, I found that for some reason my VPN connection was not working and I could not access the license server, so I turned toward open source again.
In particular, I noticed the support for Visual Studio Code, which I had already tried and appreciated for some other Python projects. Also, this being a spare-time project, I wanted to be able to develop on my personal Ubuntu 18.04 machine. I found that only two things needed to be modified that are not mentioned in the tutorial.
First, the gdb server: arm-none-eabi-gdb now seems deprecated in favor of gdb-multiarch. To reflect this change, I had to modify the corresponding line of the launch.json file in the .vscode folder, as below.
"linux": {
"MIMode": "gdb",
//"MIDebuggerPath": "arm-none-eabi-gdb",
"MIDebuggerPath": "gdb-multiarch",
"debugServerPath": "pyocd-gdbserver"
},
The other issue I had was the compiler not finding the file mbed_config.h.
cc1: fatal error: /filer/workspace_data/exports/d/d8e6c0dafc780e4e27535649b9338717/mbed-os-example-blinky/mbed_config.h: No such file or directory
compilation terminated.
To fix this, I had to modify the Makefile to provide the actual path of the file in ASM_FLAGS.
# ASM_FLAGS += /filer/workspace_data/exports/d/d8e6c0dafc780e4e27535649b9338717/mbed-os-example-blinky/mbed_config.h
ASM_FLAGS += /media/marco/DATA/programming/mbed/mbed-os-example-blinky_vscode_gcc_arm_hexiwear/mbed-os-example-blinky/mbed_config.h
And there goes the blinky!
Compiling and Debugging with Visual Studio Code and GCC (Windows 10)
A note on the same procedure on Windows 10, which is currently not working on my PC: compilation fails, probably due to a conflict between different MinGW versions that I have indirectly installed alongside other open-source software on my machine.
I found some "cures" on a few forums, but I am afraid they would mess up my configuration, so I will let others debug this issue for the time being.
LED blinking demo
See my demo that shows how to make the red, green, and blue LEDs blink.
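For reference, a minimal Mbed OS 5 sketch along these lines looks as follows. The LED1/LED2/LED3 aliases are those of the Hexiwear target; I am assuming they map to the rIoT's RGB LED as well, so verify against your board's pin map.

```cpp
#include "mbed.h"

// RGB LED outputs; LED1/LED2/LED3 are the Hexiwear target aliases
// (assumed to match the rIoT wiring -- check your pin map).
DigitalOut red(LED1);
DigitalOut green(LED2);
DigitalOut blue(LED3);

int main() {
    while (true) {
        red = !red;             // toggle red
        thread_sleep_for(500);  // non-blocking sleep (Mbed OS 5.10+)
        green = !green;
        thread_sleep_for(500);
        blue = !blue;
        thread_sleep_for(500);
    }
}
```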
Printf
See the tutorial on how to use printf() statements:
LCD
I have taken an original demo and adapted it in my repository.
LCD: LPM013M126A – JDI Color Memory LCD
https://os.mbed.com/users/KURETA90/code/ColorMemLCD/
https://os.mbed.com/users/batman52/code/rIoTwear_LCD/
Accelerometer, Magnetometer, and Altimeter demo
Fortunately, the same sensors included in the rIoT are also on the supported FRDM-FXS-MULTI2-B board. Looking at the provided libraries, I found the SensorStream example.
Gyroscope: FXAS21002
Accelerometer/Magnetometer: FXOS8700
Altimeter/Pressure: MPL3115
I have therefore created my own example, mapping the correct pins for the rIoT.
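When reading the raw registers directly, the samples still have to be scaled. As an illustration (my own sketch, not taken from the example above): the FXOS8700 accelerometer outputs 14-bit, left-justified, two's-complement samples, and at the default ±2 g range the sensitivity is 4096 counts/g, so a conversion helper might look like this:

```cpp
#include <cstdint>

// Convert one FXOS8700 accelerometer axis (MSB/LSB register pair) to g.
// The sample is 14-bit, left-justified in 16 bits, two's complement.
// 4096 counts/g assumes the default +/-2 g full-scale range.
static float fxos8700_raw_to_g(uint8_t msb, uint8_t lsb) {
    int16_t left_justified = static_cast<int16_t>((msb << 8) | lsb);
    int16_t counts = left_justified >> 2;  // drop the two padding bits
    return static_cast<float>(counts) / 4096.0f;
}
```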
Based on an available driver, I have implemented an example.
Temperature/Humidity: AMS ENS210
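For reference, the ENS210 datasheet specifies temperature in units of 1/64 K and relative humidity in units of 1/512 %RH, so the raw readings decode as below. This helper is my own illustration, not part of the linked driver.

```cpp
#include <cstdint>

// ENS210 T_VAL: temperature in 1/64 K -> degrees Celsius.
static float ens210_temp_celsius(uint16_t t_raw) {
    return static_cast<float>(t_raw) / 64.0f - 273.15f;
}

// ENS210 H_VAL: relative humidity in 1/512 %RH -> percent.
static float ens210_humidity_percent(uint16_t h_raw) {
    return static_cast<float>(h_raw) / 512.0f;
}
```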
Based on an available driver, I have implemented an example.
Light: TSL2572
Based on the SX9500 driver, I have created an example.
Touch: SX9500
Mbed natively supports SPI flash block devices.
https://os.mbed.com/docs/mbed-os/v5.14/apis/spi-flash-block-device.html
However, the feature needs to be enabled in the compilation configuration.
https://os.mbed.com/docs/mbed-os/v5.14/reference/storage.html#blockdevice-default-configuration
Based on the provided documentation, this seems not to be possible with either the online compiler or IAR.
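For an offline build that honors mbed_app.json, enabling the SPI flash block device would look roughly like the fragment below. The SPIF component name comes from the Mbed storage documentation, while the pin names are placeholders that must be replaced with the actual rIoT SPI flash pins.

```json
{
    "target_overrides": {
        "HEXIWEAR": {
            "target.components_add": ["SPIF"],
            "spif-driver.SPI_MOSI": "PTC6",
            "spif-driver.SPI_MISO": "PTC7",
            "spif-driver.SPI_CLK":  "PTC5",
            "spif-driver.SPI_CS":   "PTC4"
        }
    }
}
```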
On top of that, the SPI flash device is shared at boot time between the K64 and the KW41, and a handshake protocol has been implemented to avoid conflicts when accessing the peripheral.
This protocol needs to be implemented if you plan to use the flash on the Rapid IoT.
I have attached to this project the code from the NXP SDK for sharing the SPI bus between the K64 and the KW41.
SPI Flash: MT25QL128ABA
Air Quality (gas) demo
I found a couple of implementations of the CCS811 driver, but neither of them seems to work very well with the Rapid IoT and Mbed 5.
https://os.mbed.com/users/andcor02/code/CCS811-TEST//file/13e51613b175/main.cpp/
https://os.mbed.com/users/stevew817/code/AMS_CCS811_gas_sensor/
It might be a better idea to re-implement the driver starting from either the driver in the Rapid IoT SDK or the original SparkFun driver for Arduino.
https://github.com/sparkfun/SparkFun_CCS811_Arduino_Library/blob/master/src/SparkFunCCS811.cpp
AIR quality: AMS CCS811
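Whichever driver ends up being used, decoding the measurement is simple: per the CCS811 datasheet, the ALG_RESULT_DATA register returns eCO2 (ppm) in the first two bytes and TVOC (ppb) in the next two, big-endian. A small helper (my own sketch, not from the drivers above):

```cpp
#include <cstdint>

struct Ccs811Reading {
    uint16_t eco2_ppm;  // equivalent CO2, 400..8192 ppm
    uint16_t tvoc_ppb;  // total volatile organic compounds, 0..1187 ppb
};

// Decode the first 4 bytes of the CCS811 ALG_RESULT_DATA register (big-endian).
static Ccs811Reading ccs811_decode(const uint8_t buf[4]) {
    Ccs811Reading r;
    r.eco2_ppm = static_cast<uint16_t>((buf[0] << 8) | buf[1]);
    r.tvoc_ppb = static_cast<uint16_t>((buf[2] << 8) | buf[3]);
    return r;
}
```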
RTC
I could not find an available Mbed driver for this device, so I have attached the SDK driver from MCUXpresso, from which one can be implemented.
Similarly, I found some Arduino libraries.
https://github.com/jvsalo/pcf2123
https://github.com/Megunolink/PCF2123-RTC-Arduino
RTC: PCF2123BS
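Like most RTCs, the PCF2123 stores its time and date registers in BCD, so any port of the drivers above will need the usual conversion helpers (shown here as a generic illustration):

```cpp
#include <cstdint>

// Binary-coded decimal <-> binary, as used by the PCF2123 time/date registers.
static uint8_t bcd_to_bin(uint8_t bcd) {
    return static_cast<uint8_t>((bcd >> 4) * 10 + (bcd & 0x0F));
}

static uint8_t bin_to_bcd(uint8_t bin) {
    return static_cast<uint8_t>(((bin / 10) << 4) | (bin % 10));
}
```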
NFC
NFC is supported by Mbed ( https://os.mbed.com/docs/mbed-os/v5.14/reference/nfc-technology.html ), but not the specific chipset in the Rapid IoT. Some work is required to get it working.
An Arduino library is available.
NFC: NT3H2211
Next steps: Neural Networks
An excellent research paper implemented a neural network based on the CMSIS-NN library from ARM. CMSIS-NN is part of CMSIS5, alongside CMSIS-DSP.
There is a ready-made repository of CMSIS5 for Mbed, even though its documentation mentions that "some work is needed".
The project implements "keyword spotting", i.e., recognizing when one of a few keywords has been pronounced, as in a voice-command function.
However, a GitHub repository is available that includes the original code, which can also run on the Freedom K64 board.
In theory this should run on the Rapid IoT, even though the device is missing a microphone input. A Mic Click board could be used for that purpose.
Next steps: Arduino
One of the interesting developments around Mbed is that the newest Arduino Nano 33 boards support the Arduino API through an Mbed compatibility layer. This means the compatibility layer can (at least in theory) be applied to any Mbed-supported board.
https://github.com/arduino/ArduinoCore-nRF528x-mbedos
Based on the description in the tutorial, it should be feasible to port the Arduino API to an Mbed project, and this would give direct access to the Arduino library of peripheral drivers.
EDIT 2021:
An intermediate step that should be easier to get working is running Arduino sketches on top of Mbed.
http://blog.janjongboom.com/2019/08/01/arduino-mbed.html
A new repository that supports mbed 6 is available.
https://github.com/arduino/ArduinoCore-mbed
Next steps: bootloader (for Arduino)
One of the key points of Arduino is that it includes a bootloader, which removes the need for a somewhat expensive programmer such as a J-Link or a CMSIS-DAP probe (both available in the Hexiwear docking station).
There are examples of how to implement a bootloader with mbed.
https://os.mbed.com/docs/mbed-os/v5.14/tutorials/bootloader.html
https://github.com/ARMmbed/mbed-os-example-bootloader
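Following the tutorial above, the bootloader reserves its own flash region via mbed_app.json; a fragment along these lines should apply to the Hexiwear/rIoT target (the size value is an assumption to be tuned for the K64's flash layout):

```json
{
    "target_overrides": {
        "HEXIWEAR": {
            "target.restrict_size": "0x20000"
        }
    }
}
```

The bootloader binary then jumps to the application with mbed_start_application(POST_APPLICATION_ADDR), as shown in the example repository.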
Next steps: BLE
Mbed already supports BLE through a dedicated class.
https://os.mbed.com/docs/mbed-os/v5.14/apis/ble.html
For this to be usable, the Host Controller Interface (called FSCI on NXP products), which on the Rapid IoT runs over the UART port, needs to be implemented.
Some documentation on the FSCI interface is available on the NXP community forums.
FSCI on KW40
https://community.nxp.com/thread/390776
Develop FSCI for KW36
https://community.nxp.com/docs/DOC-343043
ArduinoBLE
Arduino has its own BLE API.
https://github.com/arduino-libraries/ArduinoBLE
Next Steps: KW41
The Freedom board FRDM-KW41Z is supported by Mbed.
https://os.mbed.com/platforms/FRDM-KW41Z/
Therefore, in principle, it should be possible to use the same process explained here for the KW41 chip too. However, to get the wireless protocols working, the connectivity stack would have to be rewritten within Mbed. That does not sound like an easy task, and for the purposes of the Rapid IoT Kit it probably adds little value. What I personally find more interesting is the possibility of using the FRDM-KW41 board as a BLE-enabled Mbed/Arduino board.