In case you are unable to view the video (or its annotations, which aren't available on mobile), it demonstrates Hexy's core functionality: the Hexy animation starts off happy, but quickly falls asleep due to lack of movement. After 60 seconds (a threshold that would be increased for real-world use), the animation turns red and the device starts vibrating to show Hexy's displeasure at the user's inactivity. When the device is shaken, the vibration stops and the animation turns happy and green.
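The demo behaviour reduces to a small inactivity state machine. Below is a minimal, host-testable sketch of that logic; the `Hexy` class, the 10-second doze-off delay, and all names are illustrative assumptions rather than the project's actual code (only the 60-second limit comes from the demo):

```cpp
#include <cstdint>

// Hexy's three moods, as shown in the video: happy, asleep, angry.
enum class Mood { Happy, Asleep, Angry };

// Assumed thresholds: Hexy dozes off after a short lull and turns red
// (vibrating) after 60 s of inactivity, per the demo.
constexpr uint32_t kSleepAfterMs = 10000;   // doze-off delay (guessed)
constexpr uint32_t kAngryAfterMs = 60000;   // inactivity limit from the demo

class Hexy {
public:
    // Called whenever the motion sensor reports a shake.
    void onActivity(uint32_t nowMs) { lastActivityMs_ = nowMs; }

    // Called periodically; returns the mood to animate.
    Mood update(uint32_t nowMs) const {
        const uint32_t idle = nowMs - lastActivityMs_;
        if (idle >= kAngryAfterMs) return Mood::Angry;   // red + vibration
        if (idle >= kSleepAfterMs) return Mood::Asleep;  // dozed off
        return Mood::Happy;                              // green + happy
    }

private:
    uint32_t lastActivityMs_ = 0;
};
```

A shake simply refreshes `lastActivityMs_`, which immediately drops Hexy back to the happy state on the next update.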
Download hexy-mvp_HEXIWEAR-09954ec1e80e.bin, drag and drop it onto your DAPLINK Hexiwear USB drive, and press reset to experience Hexy right away!
The inspiration for Hexy came from a desire to leverage the Hexiwear's delightful array of onboard sensors in the context of a wearable device, while presenting the resulting data in an interesting and novel way. I've always been a big fan of virtual pets, and the anthropomorphism of machines and computers in general, so it became my goal to enhance "boring" health data via a wearable pal that's fun to be with!
Development Environment

Upon receipt of the Hexiwear Power User Pack, and after a thorough review of the default firmware apps and WolkSense integration, I started looking into what it would take to start hacking on this thing! I had a modicum of experience with Kinetis Design Studio from an earlier contest, but the additional setup and configuration required beyond the basic KDS installation appeared somewhat daunting. I also recalled seeing mention of mbed, which I've used in the past and found extremely efficient for initial "blinking an LED" type projects while getting to know new hardware.
After a brief period of confusion (I hadn't realized that my build target was still set to K64F instead of Hexiwear), I worked through a number of the example repos, developing an understanding of the OLED API, serial debugging techniques, and several of the onboard sensors. However, after a few examples, the workflow of building and downloading a binary from the cloud compiler, moving it to the virtual USB drive, and rebooting (often causing issues with my laptop due to the frequent USB disconnections), which had felt so powerful and enabling at first, started to become tedious. At this point, I decided to take another look at KDS, expecting a more productive workflow.
After many hours of downloading, setup, and configuration (and that was starting with KDS already installed!), I finally had everything ready to build and debug my "first program". After all of that, I was disappointed to discover that my "first program" would consist of trivial changes to the default firmware. I had literally no clue where to go from there: I'm not familiar with FreeRTOS or the Hexiwear's particular firmware implementation, and I prefer the more typical abstracted main() function used by mbed, or Arduino's setup() and loop().
Feeling out of my depth with KDS/FreeRTOS, I reverted to mbed. I was delighted to find that during the intervening weeks spent banging my head against KDS, the available mbed examples had proliferated (some were even updated while I was working!), covering most of what I'd need, and leading me to discover that it was possible to leverage libraries for components shared with K64F platforms.
Back on mbed, I started with a clean mbed OS Blinky LED HelloWorld template, added the freshly updated Hexiwear OLED Display Driver, and was greeted with a screen full of random pixels. I tried reverting to earlier driver versions, to no avail. At that point, I started over with a fresh clone of Hexi_OLED_Text_Example and was able to achieve the desired font size and effects by analyzing the well-commented Hexi_OLED_SSD1351 driver and font source, as well as the built-in class reference. The omission of many basic graphics primitives led me to use a text-based kaomoji, < ^.^ >, to represent Hexy, which was actually what I'd used in my original idea submission.
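On the device, the face is just a string drawn via the Hexi_OLED_SSD1351 text API. A trivial host-side sketch of a mood-to-kaomoji mapping might look like the following; only the happy face < ^.^ > comes from this write-up, and the other two faces (plus all the names) are invented stand-ins:

```cpp
#include <string>

// Mood states mirroring the animation: happy, asleep, angry.
enum class Mood { Happy, Asleep, Angry };

// Return the kaomoji string for a given mood. The happy face is the one
// actually used in the project; the others are illustrative guesses.
std::string faceFor(Mood mood) {
    switch (mood) {
        case Mood::Happy:  return "< ^.^ >";  // green, content
        case Mood::Asleep: return "< -.- >";  // dozed off (assumed)
        case Mood::Angry:  return "< >.< >";  // red, vibrating (assumed)
    }
    return "< o.o >";  // unreachable fallback
}
```

The returned string would then be handed to the OLED driver's text-drawing routine each time the mood changes.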
With the basic UI and animation in place, it was just a process of iteration and refactoring, incorporating new features such as the FXAS21002 gyro for activity measurement and built-in vibration for haptic feedback. Developing a working PoC based on my intended minimum viable functionality ended up being quite easy after all of the experience I'd gained so far, and with a few tweaks to initially-guessed values and thresholds, I had what I found during testing to be far more than a PoC, and instead a robust wearable experience!
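As a rough illustration of gyro-based activity measurement, one common approach is to compare the rotation-rate magnitude against a hand-tuned threshold. The function below and the 50 dps figure are assumptions for the sake of the sketch, not the project's actual values:

```cpp
#include <cmath>

// Placeholder threshold in degrees per second; the real project used
// hand-tweaked values arrived at during testing.
constexpr float kActivityThresholdDps = 50.0f;

// Classify a single gyro sample (x/y/z rotation rates in dps, as the
// FXAS21002 reports) as "activity" if its magnitude exceeds the threshold.
bool isActive(float xDps, float yDps, float zDps) {
    const float magnitude =
        std::sqrt(xDps * xDps + yDps * yDps + zDps * zDps);
    return magnitude > kActivityThresholdDps;
}
```

Each sample that classifies as active would reset the inactivity timer, keeping Hexy happy.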
Results and Conclusions

I experienced a fair number of challenges getting started with Hexiwear development, but have emerged with significant new experience that I look forward to applying to this and other platforms. Thanks to the mbed examples, I was able to quickly prototype and refine the functionality that I'd envisioned for Hexy, and submit a project that may not contain everything I'd originally dreamt of, yet is remarkably useful and functional. While the submitted code uses a 60-second inactivity interval, during testing I had a larger, more "realistic" threshold set, and it really was incredibly effective at its intended purpose of making me more active instead of sitting gormlessly in front of screens all day. I'm really impressed with the Hexiwear platform, and with mbed's full-featured OS and cloud IDE, which allowed me to quickly realize an extremely effective prototype of the functionality that I'd envisioned.
Next Steps/Future Enhancements

One of the most limiting aspects of Hexiwear development that I found was the lack of available libraries. A small number of libraries are included or alluded to in the mbed examples, but it is beyond my current capabilities to turn a complicated datasheet describing a sophisticated component with elaborate data formats into an mbed library. I was really looking forward to incorporating the MAX30101 optical heart-rate sensor and TSL2561 ambient light sensor into my project, but I simply lacked the time and expertise to do so without existing examples/libraries. I had similarly hoped to leverage BLE to stream data to a web dashboard, but mbed OS seemed to clobber the built-in SensorTag functionality. I did notice some new BLE examples popping up shortly before the (original) deadline, but lacked sufficient time to investigate them. I'm used to working on platforms like Arduino, Raspberry Pi, and Pebble, which have extensive documentation, limitless examples and libraries, and amazing communities. Hexiwear, being a new platform, has not had time to build this kind of momentum, so I expect the lonely fumbling in the dark that I experienced will become a less common experience as the platform matures.
I discovered after the bulk of my development was complete that a CLI version of mbed exists too, which I think would address my workflow concerns, as well as allow direct use of Git and GitHub vs. the Mercurial-based mbed cloud repos.
In terms of the project itself, in addition to the functionality described above (once libraries become available), there are a few issues with the existing code, such as the timer overflowing after half an hour (again, this being a PoC, the expectation is that it can be evaluated in a few minutes), which I'd love to find time to address (perhaps by switching to the RTC?). I'd also love to add configurability, i.e. the ability to set the inactivity duration, disable vibration, and select standard vs. debug mode (hiding the numbers for general users), using the capacitive touch buttons and persistent storage.
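If the overflow stems from a signed 32-bit microsecond counter (mbed's classic Timer wraps around 2^31 µs, roughly 35 minutes, which would match the observed half hour), one lightweight fix short of moving to the RTC is to compute elapsed time with unsigned, wraparound-safe subtraction. This is a sketch of that idea under the stated assumption, not the project's code:

```cpp
#include <cstdint>

// Wraparound-safe elapsed time between two 32-bit microsecond timestamps.
// Unsigned subtraction stays correct across a single counter wrap, because
// arithmetic on uint32_t is defined modulo 2^32.
uint32_t elapsedUs(uint32_t startUs, uint32_t nowUs) {
    return nowUs - startUs;  // wraps correctly even when nowUs < startUs
}
```

Comparing `elapsedUs(start, now)` against the inactivity threshold then keeps working past the counter's rollover, as long as no single idle interval exceeds a full wrap period.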