Skoltech Scientists Combine a Glove, Machine Learning, and a Drone for Gesture-Based Light Painting

Using an Arduino Uno-based wearable controller and a machine learning base station, the drone can turn gestures into art.

The DroneLight system turns gestures into mid-air artworks, drawn by a Crazyflie 2.0. (📷: Ibrahimov et al.)

Researchers at the Skolkovo Institute of Science and Technology (Skoltech) have designed an interface intended to make drone-based art more accessible by guiding a light-painting drone through the sky using hand gestures β€” and suggest the same technology could even assist in search and rescue missions.

"Flight control is a challenging task as user has to manipulate with the joystick to stabilize and navigate drones," explains Professor Dzmitry Tsetserukou of the problem his team was trying to solve. "Only a very skilful operator can maintain smooth trajectory, such as drawing a letter, and for the typical user it is almost not possible."

The solution: DroneLight, which sees a compact Crazyflie 2.0 quadcopter fitted with a light reflector and an array of user-controllable RGB LEDs linked to a wearable glove-style controller equipped with an inertial measurement unit (IMU), flex sensor, an Arduino Uno, and an XBee module for wireless communication. The controller links to a base station running a machine learning algorithm which matches the user's gestures to pre-programmed letters or other patterns, then instructs the drone to draw them in the air.
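The paper doesn't include source code, but the recognition step described above β€” matching a glove's sensor readings to a pre-programmed letter, then handing the drone that letter's flight path β€” can be sketched as a simple nearest-template classifier. Everything here is illustrative: the feature templates, waypoint lists, and function names are hypothetical, and the actual DroneLight system uses a trained machine learning model rather than this minimal distance-based matcher.

```python
import numpy as np

# Hypothetical reference templates: a mean feature vector per letter,
# built from glove readings (3-axis IMU accelerometer + flex sensor).
GESTURE_TEMPLATES = {
    "S": np.array([0.1, 0.9, 0.0, 0.2]),
    "T": np.array([0.8, 0.1, 0.1, 0.7]),
    "U": np.array([0.3, 0.3, 0.9, 0.4]),
}

# Illustrative pre-programmed light-painting waypoints (x, y) per letter,
# which the base station would stream to the drone as a trajectory.
LETTER_TRAJECTORIES = {
    "S": [(1, 2), (0, 2), (0, 1), (1, 1), (1, 0), (0, 0)],
    "T": [(0, 2), (1, 2), (0.5, 2), (0.5, 0)],
    "U": [(0, 2), (0, 0), (1, 0), (1, 2)],
}

def classify_gesture(features):
    """Match a glove feature vector to the nearest letter template."""
    f = np.asarray(features, dtype=float)
    return min(GESTURE_TEMPLATES,
               key=lambda k: np.linalg.norm(f - GESTURE_TEMPLATES[k]))

def trajectory_for(features):
    """Return the drone's waypoint list for the recognized gesture."""
    return LETTER_TRAJECTORIES[classify_gesture(features)]
```

In the real system the classifier runs on the base station, which receives the glove's IMU and flex-sensor data over the XBee link and then commands the Crazyflie to fly the matched pattern.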

"The most fascinating application can be DroneMessenger, when partners can not only exchange messages and emoji over the distance but also enjoy the light art during a starry night," Tsetserukou says. "Another application is a show of drones when an operator can generate dynamic light patterns in the sky in real time. You can also imagine another system, SwarmCanvas, where users located in remote places can draw a joint picture on the canvas of the night sky. Currently, drone show systems just reproduce pre-designed trajectories and lighting patterns."

"To our knowledge, it would be the world's first human-centric robotic system that people can use to send messages based on light-painting over distant locations (drone-based instant messaging)," the researchers note in the paper's abstract. "Another unique application of the system would be the development of vision-driven rescue system that reads light-painting by person who is in distress and triggers rescue alarm."

The team's work was presented at the IEEE International Conference on Robot & Human Interactive Communication (IEEE RO-MAN 2020), and the corresponding paper is available under open access terms on arXiv.org.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.