Gesture Control in the Heat of the Moment

One size does not fit all when it comes to UIs, but this thermal imaging-based gesture recognition system fits some important use cases.

Nick Bild
Gesture recognition with thermal imaging and a Photon 2 (📷: Naveen Kumar)

No matter how good it is, no particular user interface is right for every application. The touchscreen, for instance, is excellent for a small portable device like a smartphone, but would not be a good choice for applications on a laptop computer where rapid data entry is necessary. Voice assistants, on the other hand, are a great choice for controlling a smart home, but in a noisy public space they offer their users more frustration than assistance.

In scenarios where touch-free interfaces are needed but voice control falls short, whether due to background noise or a need for more precision, gesture-based interfaces are gaining popularity. But these systems have their issues as well. They often rely on computer vision techniques, which struggle (or fail entirely) under low-light conditions. As such, alternative gesture recognition systems are needed for a wide range of applications, from healthcare to industry.

Hardware hacker Naveen Kumar recently demonstrated a prototype that might be the ideal solution for these use cases. Kumar's system uses thermal imaging, so it can operate under any lighting conditions (including total darkness). To protect privacy and ensure real-time operation, it runs directly on a powerful, energy-efficient hardware platform, and a machine learning development platform greatly simplified creation of the gesture recognition algorithm.

At the heart of the device is a Particle Photon 2 development board, which is tiny and inexpensive, yet has enough computational horsepower to run a gesture classification algorithm locally. This was paired with a Pimoroni MLX90640 thermal camera breakout board to capture infrared images of the hand. An optional Adafruit 2.8-inch TFT touch shield display was included to make it easy to view a false-color representation of the captured thermal images.
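To give a feel for what the display side involves, here is a minimal sketch of turning a 32x24 MLX90640 temperature frame into a false-color image. This is an illustrative Python stand-in, not the project's firmware (which would read frames over I2C in C++ on the Photon 2); the synthetic frame values and the blue-to-red color ramp are assumptions.

```python
# Hypothetical sketch: mapping an MLX90640 thermal frame (32x24 pixels,
# temperatures in degrees C) onto a blue-to-red false-color ramp.

WIDTH, HEIGHT = 32, 24  # MLX90640 sensor resolution

def false_color(frame):
    """Map each temperature to an RGB tuple: coldest -> blue, hottest -> red."""
    lo, hi = min(frame), max(frame)
    span = (hi - lo) or 1.0
    pixels = []
    for t in frame:
        x = (t - lo) / span                  # normalize to 0..1
        r = int(255 * x)                     # hot pixels trend red
        b = int(255 * (1 - x))               # cold pixels trend blue
        g = int(64 * (1 - abs(2 * x - 1)))   # slight green in the mid-range
        pixels.append((r, g, b))
    return pixels

# Synthetic frame: ~22 C ambient with a warmer "hand" region near 33 C
frame = [22.0] * (WIDTH * HEIGHT)
for row in range(8, 16):
    for col in range(12, 20):
        frame[row * WIDTH + col] = 33.0

img = false_color(frame)
print(img[0], img[8 * WIDTH + 12])  # a cold pixel vs. a hot pixel
```

A per-frame normalization like this keeps the hand visible regardless of ambient temperature, since only the relative contrast matters for viewing.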

To prove the concept, Kumar built a device that recognizes the hand poses used in a game of rock paper scissors. Toward that goal, he set up the hardware and collected a series of thermal camera images of each hand pose. These images were uploaded to the Edge Impulse machine learning development platform, where a classifier was designed and then trained on the collected data. On the first attempt, it achieved a classification accuracy of nearly 99%. Not bad at all!
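The core idea of classifying a flattened thermal frame can be sketched with a toy nearest-centroid model. To be clear, this is not the model Kumar trained (Edge Impulse builds a neural network from the uploaded data); the tiny four-pixel "frames" and labels below are made up purely to illustrate the classify-by-frame concept.

```python
# Illustrative stand-in for the gesture classifier: nearest-centroid
# classification over flattened thermal frames.

def centroid(samples):
    """Element-wise mean of a list of equal-length frames."""
    n = len(samples)
    return [sum(vals) / n for vals in zip(*samples)]

def classify(frame, centroids):
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(frame, centroids[label]))

# Hypothetical 4-pixel training "frames" per gesture (warm = hand present)
train = {
    "rock":     [[30, 30, 22, 22], [31, 29, 21, 23]],
    "paper":    [[30, 30, 30, 30], [29, 31, 30, 28]],
    "scissors": [[30, 22, 30, 22], [31, 23, 29, 21]],
}
centroids = {label: centroid(frames) for label, frames in train.items()}

print(classify([30, 29, 22, 22], centroids))  # → rock
```

A real thermal frame has 768 pixels rather than four, but the workflow is the same: collect labeled frames, fit a model, then compare each new frame against what was learned.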

For speed and privacy, this trained model was deployed from Edge Impulse to the Particle Photon 2 board. Kumar then demonstrated how the device could accurately recognize the hand gestures it was trained on — even without an internet connection. These predictions could be used to trigger any action an application requires, like controlling another device. Kumar also showed how detected gestures can be recorded in the Particle Cloud, making it possible to log actions or even control remote systems.
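Acting on a prediction typically means mapping the detected label to an action and, optionally, publishing an event. The sketch below is a hedged Python stand-in: on the device this dispatch would live in C++ firmware and the cloud step would use Particle's publish mechanism, so the `publish` helper, the confidence threshold, and the in-memory log here are all assumptions for illustration.

```python
# Sketch: dispatch a detected gesture to an action and queue a log record.
# An in-memory list stands in for the Particle Cloud event stream.

event_log = []

def publish(event, data):
    """Stand-in for a cloud publish call; records the event locally."""
    event_log.append({"event": event, "data": data})

ACTIONS = {
    "rock":     lambda: publish("gesture", "rock"),
    "paper":    lambda: publish("gesture", "paper"),
    "scissors": lambda: publish("gesture", "scissors"),
}

def on_prediction(label, confidence, threshold=0.8):
    """Trigger the mapped action only for confident predictions."""
    if confidence >= threshold and label in ACTIONS:
        ACTIONS[label]()

on_prediction("paper", 0.97)
on_prediction("rock", 0.42)   # below threshold: ignored
print(event_log)
```

Gating on a confidence threshold like this is a common way to keep spurious frames (an empty scene, a half-formed gesture) from triggering actions.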

Kumar has extensively documented the hardware build and software development process, so feel free to borrow whatever you need when building your own gesture recognition device. Modifying it for different use cases is as simple as collecting a different training dataset.
