The Art of Teamwork
The creative team of the future may include a robot that helps artists and engineers enhance their creativity and productivity.
Digital art has captured a lot of attention in recent years as more and more people turn to technology to express their creativity. With the rise of new machine-learning-powered tools, anyone can create stunning works of art without any traditional artistic skills or training.
Consider OpenAI’s DALL-E, which can create entirely new images from scratch based on textual descriptions provided by users. Given a prompt like "an armchair in the shape of an avocado" or "a snail made out of harp strings," DALL-E will generate a corresponding image.
This democratization of art has been transformative for businesses and hobbyists alike; so far, however, the biggest impacts have been felt only in the digital realm. For those who want to pick up a paintbrush or a pencil and just get a helping hand from technology, specialized tools offer far less assistance. But a new idea that recently emerged from the Human-Computer Interaction Lab at Saarland University may help to fill this void.
The researchers have created a robotic printer called RoboSketch that can transition between manual, assisted, and fully automated modes. The printer can produce either solid lines or patterns, and by manipulating a paintbrush-like handle on top of the robot, an artist can take full control of the drawing or let the robot help with straight lines and other precise shapes. And when needed, the robot can take over entirely to produce a predefined drawing.
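The team has not published its control code, but one way to picture the three modes is as a single steering loop that blends the artist's handle input with the robot's own path planner. The Python sketch below is purely illustrative; names such as `blend_command` and `assist_gain` are assumptions, not part of the RoboSketch implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    MANUAL = auto()      # artist steers the robot entirely by hand
    ASSISTED = auto()    # robot gently corrects toward a target shape
    AUTONOMOUS = auto()  # robot follows a predefined drawing on its own


@dataclass
class Velocity:
    forward: float  # cm/s along the drawing direction
    turn: float     # steering correction, rad/s


def blend_command(mode: Mode, user_turn: float, path_turn: float,
                  assist_gain: float = 0.7) -> Velocity:
    """Mix the artist's handle input with the planner's suggested steering.

    user_turn: steering requested via the paintbrush-like handle.
    path_turn: steering needed to stay on the target line or shape.
    assist_gain: how strongly the robot corrects in ASSISTED mode.
    """
    if mode is Mode.MANUAL:
        turn = user_turn
    elif mode is Mode.ASSISTED:
        turn = (1.0 - assist_gain) * user_turn + assist_gain * path_turn
    else:  # Mode.AUTONOMOUS
        turn = path_turn
    return Velocity(forward=10.0, turn=turn)


if __name__ == "__main__":
    # Example: the artist drifts left while the planner wants to go straight.
    print(blend_command(Mode.ASSISTED, user_turn=-0.3, path_turn=0.0))
```

In a blend like this, switching modes only changes how much weight the artist's hand carries, which is one plausible way a device could move smoothly between full manual control and full autonomy.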
Inside the 164 x 191 x 60 millimeter laser-cut MDF casing is a pair of micro metal gear motors with magnetic encoders that maneuver RoboSketch with precision. An ultrasonic distance sensor and a wide-angle RGB camera allow for obstacle avoidance and computer vision tasks such as blob detection. A high-resolution handheld thermal inkjet printer lets the robot print on diverse absorbent surfaces (canvas, paper, wood, textiles) at up to 30 centimeters per second.
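The write-up mentions blob detection only in passing. As an illustration of what such a task typically looks like, here is a minimal sketch using OpenCV; the library choice, parameters, and the `detect_blobs` helper are assumptions, since the actual vision pipeline running on the robot is not specified.

```python
import cv2  # OpenCV is assumed here, not confirmed as RoboSketch's vision stack
import numpy as np


def detect_blobs(frame_bgr: np.ndarray) -> list:
    """Find dark, roughly round marks (e.g., printed dots) in a camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 50          # ignore specks of noise
    params.filterByCircularity = True
    params.minCircularity = 0.6  # keep roughly round marks
    detector = cv2.SimpleBlobDetector_create(params)

    return list(detector.detect(gray))


if __name__ == "__main__":
    # Synthetic test frame: a white canvas with two dark printed dots.
    canvas = np.full((240, 320, 3), 255, dtype=np.uint8)
    cv2.circle(canvas, (80, 120), 10, (0, 0, 0), -1)
    cv2.circle(canvas, (200, 60), 12, (0, 0, 0), -1)
    for kp in detect_blobs(canvas):
        print(f"blob at ({kp.pt[0]:.0f}, {kp.pt[1]:.0f}), size {kp.size:.1f}")
```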
A Raspberry Pi 4 single-board computer and a Microchip ATmega32U4 microcontroller provide the processing muscle to capture sensor measurements and run the algorithms behind the device’s main functions. An LCD screen on the top of RoboSketch gives feedback to the artist and allows them to configure the robot. The entire device is battery-powered and can run for about eight hours on a single charge.
The device was demonstrated performing a number of practical tasks. In one case, the artist started drawing a simple pattern, then switched the robot into autonomous mode, and RoboSketch automatically continued drawing the same pattern. The team also showed off the robot’s ability to help a user draw a straight line, as well as the fully automated drawing of various geometric shapes.
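How exactly the robot helps with a straight line is not detailed in the demonstrations; a common approach in mobile robotics is pure-pursuit-style steering, which aims at a point slightly ahead on the intended line so the robot eases back on course. The sketch below is a hypothetical illustration of that idea, not the published algorithm, and the `line_assist_heading` helper and its parameters are invented for the example.

```python
import math


def line_assist_heading(pos, line_start, line_end, lookahead=2.0):
    """Suggest a heading (radians) that pulls the robot back onto the intended line.

    pos, line_start, line_end: (x, y) positions in centimeters.
    lookahead: how far ahead along the line to aim, in centimeters.
    """
    sx, sy = line_start
    ex, ey = line_end
    px, py = pos

    # Unit direction of the intended line.
    dx, dy = ex - sx, ey - sy
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length

    # Project the current position onto the line, then aim a bit further ahead.
    t = (px - sx) * ux + (py - sy) * uy
    tx, ty = sx + (t + lookahead) * ux, sy + (t + lookahead) * uy

    return math.atan2(ty - py, tx - px)


if __name__ == "__main__":
    # Robot has drifted 1 cm above a horizontal line; the suggested heading
    # points slightly downward and forward, easing it back onto the line.
    print(math.degrees(line_assist_heading((5.0, 1.0), (0.0, 0.0), (30.0, 0.0))))
```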
RoboSketch also has a few more fun and interesting features. Based on a user’s past drawings, it can suggest new ideas to complete an in-progress sketch using AI-based image synthesis techniques. In another example, RoboSketch was used to interactively create an electronic circuit by loading the printer with conductive ink, a feature that can be used to incorporate lights or other electronic components alongside traditional materials. And as always, the artist can quickly switch between modes to change their level of direct involvement in the creation of the artwork.
A cohort of seven participants was recruited for a case study assessing the utility of RoboSketch. After a hands-on exploration session, the artists in the group believed the device could help improve their creativity, while the engineers saw its potential to boost their productivity. These early experiences suggest that RoboSketch can provide valuable assistance to both artists and engineers. And the researchers are not planning to stop here: in the future, they hope to develop many more mixed-mode tools that offer varying degrees of user assistance.