
Drone + Tai Chi = Drone Chi

A new kind of close-range human-drone interaction experience.

Cabe Atwell
5 years ago · Robotics / Drones / Sensors

Human-computer interaction (HCI) is increasingly influenced by somaesthetics — a multidisciplinary field aimed at improving life quality by cultivating an appreciation for bodily and sensory experiences. A research team based at Monash University in Melbourne is working to investigate the potential of drones for somaesthetic HCI and has developed a Tai Chi-inspired, close-range human-drone interaction experience they have called Drone Chi.

Along with the increased traction of somaesthetics as a theoretical foundation for designs that engage with the body, autonomous flying drones have been flourishing as a design material. With high-performance motion sensing, a drone can operate within roughly a 0.5-meter radius of the body, allowing "intimate" human-drone interaction (HDI) with somaesthetic qualities. Informed by Tai Chi and meditation, the Drone Chi project explores co-movement with a drone as a design opportunity.

The design process settled on radial symmetry as a formal criterion for the drone: it suits both the intended experience and the innate form of a quadcopter, and its ambiguous orientation gives users greater freedom of movement and attention. The final result, built on a Bitcraze Crazyflie micro-quadcopter, is a custom 3D-printed hull modeled after the petals of a lotus flower, with an LED at the center of each petal. A docking station made from a plastic vine with integrated charging cables completes the design.

Design solutions for hand tracking were pursued concurrently with the design of the drone. Hand pads with slender features, 3D-printed in PLA plastic, mirror the delicate look and feel of the drone. Correspondence between the hands and the drone is based on an offset-midpoint measure calculated from motion-capture data, obtained with a Qualisys system and infrared-reflective markers. The LEDs in the drone vary in intensity based on this measure, offering subtle guidance toward a state of focus.
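The article does not spell out the exact formula behind the offset-midpoint measure or the brightness mapping, but a minimal Python sketch of the idea, assuming hypothetical function names, an illustrative offset value, and a 20 cm brightness range, could look like this: take the midpoint between the two tracked hand pads, shift it by a fixed offset, and map the drone's distance from that point to an LED intensity.

```python
import numpy as np


def offset_midpoint(left_hand: np.ndarray, right_hand: np.ndarray,
                    offset: np.ndarray = np.array([0.0, 0.0, 0.05])) -> np.ndarray:
    """Midpoint between the two hand pads, shifted by a fixed offset.

    The offset actually used in Drone Chi is not published; the 5 cm
    vertical shift here is purely illustrative.
    """
    return (left_hand + right_hand) / 2.0 + offset


def led_intensity(drone_pos: np.ndarray, target: np.ndarray,
                  max_dist: float = 0.2) -> float:
    """Map hand-drone distance to a 0..1 LED brightness: brighter when closer."""
    dist = np.linalg.norm(drone_pos - target)
    return float(np.clip(1.0 - dist / max_dist, 0.0, 1.0))


# Example: hand-pad and drone positions as they might come from mocap (metres)
left, right = np.array([0.10, 0.30, 1.20]), np.array([0.40, 0.32, 1.18])
drone = np.array([0.24, 0.35, 1.30])
print(led_intensity(drone, offset_midpoint(left, right)))
```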

The interaction experience unfolds in two stages. First, the hands follow the drone along a circular path that gradually grows in size. Then, the mode switches so that the hands lead the drone, which follows as long as the offset-midpoint stays within 20 cm of it. Object positions are processed on a host PC, and the drone is kept under closed-loop control at 100 Hz so it can interact responsively with the hands, using software based on the open-source Crazyflie client. The experience is designed to exemplify dynamic, intimate somaesthetic interaction between a robotic design material and body movements in 3D space, and it illustrates the potential for further uses of drones in somaesthetic HCI.
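As a rough sketch of how such a loop might be structured with the Crazyflie Python library (cflib), the code below alternates the two stages and sends position setpoints at 100 Hz. The radio URI, stage durations, circle parameters, and the stubbed mocap reader are all assumptions for illustration, not the team's actual implementation.

```python
import math
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'     # assumed radio address, not from the article
RATE_HZ = 100               # control-loop rate mentioned in the article
LEAD_RADIUS = 0.2           # 20 cm hand-lead threshold from the article
STAGE_ONE_S = 30.0          # assumed duration of the "follow the drone" stage
TOTAL_S = 60.0              # assumed total session length


def get_hand_target():
    """Stub for the offset-midpoint of the hand pads (x, y, z in metres).

    In the real system this comes from Qualisys mocap data streamed to the
    host PC; it is stubbed here so the loop stays self-contained.
    """
    return (0.0, 0.0, 1.0)


def main():
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
        cmd = scf.cf.commander
        drone = (0.0, 0.0, 1.0)   # last commanded position (metres)
        t0 = time.time()
        while (t := time.time() - t0) < TOTAL_S:
            if t < STAGE_ONE_S:
                # Stage 1: the drone traces a slowly growing circle
                # for the hands to follow.
                r = 0.1 + 0.005 * t
                drone = (r * math.cos(0.5 * t), r * math.sin(0.5 * t), 1.0)
            else:
                # Stage 2: the hands lead; the drone follows only while
                # the offset-midpoint stays within 20 cm of it.
                target = get_hand_target()
                if math.dist(target, drone) < LEAD_RADIUS:
                    drone = target
            cmd.send_position_setpoint(*drone, 0.0)  # x, y, z, yaw
            time.sleep(1.0 / RATE_HZ)
        cmd.send_stop_setpoint()


if __name__ == '__main__':
    main()
```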
