Wouldn’t it be nice to control a robotic arm on the other side of the world and have it follow your gestures in the most natural way?
With our Tactigon Skin and our recently developed SDK, this can be achieved with very few lines of code. In this article we show how to use the T-Skin to recognize gestures, combining our AI-based SDK with the MQTT protocol so that messages can be sent wherever we want. The Python code illustrated below imports modules from our SDK that handle BLE data collection from one or two T-Skins (Tactigon_BLE) and gesture recognition (Tactigon_Gesture).
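As a minimal sketch of that setup, the snippet below wires the two SDK processes together with a standard Python pipe. The class names and constructor arguments are assumptions inferred from the module names mentioned above, not the SDK's documented API; check the SDK reference for the exact signatures.

```python
from multiprocessing import Pipe

# Assumed class names: the article only names the modules,
# so BLE and Gesture here are illustrative placeholders.
from Tactigon_BLE import BLE            # BLE data collection from a T-Skin
from Tactigon_Gesture import Gesture    # AI-based gesture recognition

# Read end (rx) goes to the application process, write end (tx)
# to the gesture-recognition process.
rx, tx = Pipe(duplex=False)

ble = BLE("RIGHT", "C0:83:XX:XX:XX:XX")  # assumed args: hand label, T-Skin MAC address
gesture = Gesture("RIGHT", tx)           # assumed: pushes recognized gestures into the pipe

ble.start()
gesture.start()
```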
Thanks to these modules, recognized gestures are pushed into a pipe, from which the application process (Application_MQTT) pulls them and publishes them to an MQTT broker. The gestures to be recognized, the MQTT broker details, and the topic names can all be customized through our SDK configuration files.
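A rough sketch of what the Application_MQTT loop might look like is shown below, using the widely available paho-mqtt client. The broker host and topic name are placeholders standing in for the values that would normally come from the SDK configuration files, and `rx` is the read end of the pipe from the previous sketch.

```python
import json
import paho.mqtt.client as mqtt

MQTT_HOST = "broker.example.com"   # placeholder; read from the SDK config in practice
MQTT_TOPIC = "tskin/gestures"      # placeholder topic name

# paho-mqtt 1.x style; version 2.x requires a CallbackAPIVersion argument.
client = mqtt.Client()
client.connect(MQTT_HOST, 1883)
client.loop_start()

while True:
    gesture = rx.recv()  # block until the recognition process pushes a gesture
    client.publish(MQTT_TOPIC, json.dumps({"gesture": gesture}))
```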
The T-Gear SDK is used to collect new raw data, send the data to the server, ask the server to train a model on that data, download the trained model, and finally use the model to test real-time gesture recognition.
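To make that cycle concrete, here is a hypothetical outline of the collect-train-download steps as plain HTTP calls. The server URL, endpoints, and file names are invented purely for illustration; the actual SDK wraps these steps behind its own functions.

```python
import requests

SERVER = "https://train.example.com"   # placeholder training-server URL

# 1. Upload locally recorded raw sensor data for a new gesture.
with open("wave_raw.csv", "rb") as f:
    requests.post(f"{SERVER}/data", files={"file": f})

# 2. Ask the server to train a model on the uploaded data.
job = requests.post(f"{SERVER}/train").json()

# 3. Download the trained model for local real-time recognition.
model_bytes = requests.get(f"{SERVER}/models/{job['id']}").content
with open("gesture_model.bin", "wb") as f:
    f.write(model_bytes)
```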
The T-Gear SDK also includes several end applications that can be customized through config files or used as a reference for development.