Eye, Robot
These spooktacular robotic eyes use a Raspberry Pi and deep learning to mimic your every move.
Just because grocery stores have already started to put out their stock of Halloween candy does not mean it is time to don our costumes and head out for trick-or-treating. We still have a way to go. But all of those custom, spooky electronics projects hardware hackers like to create around this time of year are not going to build themselves, so now is the time to get started. Rome wasn’t built in a day, and neither is a red-eyed skeleton that screams when little neighborhood ghouls venture a tad too close.
Instructables user thomas9363 has just detailed the construction of something that could serve as the centerpiece of many frightening hacks — creepy robotic eyes that mimic your every move. The horror! This mysterious device uses chilling computer vision and an eerie artificial intelligence algorithm to track the movements of an individual’s eyes. Those movements are then reproduced exactly in a set of uncanny robotic eyes that almost certainly mean you harm.
In a previous project, thomas9363 built the robotic eyes themselves from hacked-up lottery balls, a universal joint, and a series of servo motors that control x- and y-axis eye movements, as well as the opening and closing of the eyelids. These motors are driven by an Adafruit 16-Channel 12-bit PWM/Servo Driver and a Raspberry Pi 4 single-board computer with 4 GB of RAM. In the original setup, the eye motions were controlled by an Android app.
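For anyone wondering what driving those servos looks like in practice, here is a minimal sketch using Adafruit's CircuitPython ServoKit library, which supports the 16-channel board. The channel assignments and angle values are hypothetical, since the write-up does not specify them:

```python
# A minimal sketch of talking to the Adafruit 16-channel PWM/servo
# board from the Pi via the adafruit-circuitpython-servokit library.
# Channel numbers below are hypothetical, not from the actual build.
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)          # the 16-channel driver board

kit.servo[0].angle = 90              # x-axis: look straight ahead
kit.servo[1].angle = 90              # y-axis: centered vertically
kit.servo[2].angle = 180             # eyelids: fully open

# Hobby servos often need a wider pulse range than the default:
kit.servo[0].set_pulse_width_range(500, 2500)
```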
To update the eyes for maximum spookiness, thomas9363 added a Logitech C920 USB camera and some new software. In particular, Google's MediaPipe Face Mesh framework was used for eye tracking. To account for blinking, which might otherwise make the robot go bananas, a one-second debounce period was built into the software.
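The write-up does not reproduce the tracking code, but a minimal sketch of the idea might look like the following, using MediaPipe's Face Mesh with iris refinement. The landmark indices (left-eye corners 33/133, eyelids 159/145, iris center 468) are standard Face Mesh indices; the openness threshold and the exact debounce behavior are assumptions:

```python
# A sketch of the eye-tracking loop: find the face mesh, estimate
# how open the eye is, and hold the last good gaze estimate for one
# second whenever a blink is detected. Thresholds are guesses.
import time
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1, refine_landmarks=True)  # refine adds iris landmarks

cap = cv2.VideoCapture(0)  # the Logitech C920 appears as a V4L2 device
last_gaze, blink_until = 0.5, 0.0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        continue
    lm = results.multi_face_landmarks[0].landmark

    # Eye openness: eyelid gap relative to eye width.
    openness = abs(lm[159].y - lm[145].y) / abs(lm[33].x - lm[133].x)

    if openness < 0.1:                    # lids (nearly) shut: a blink
        blink_until = time.time() + 1.0   # one-second debounce window
    elif time.time() >= blink_until:
        # Horizontal gaze: iris position between the two eye corners.
        last_gaze = (lm[468].x - lm[33].x) / (lm[133].x - lm[33].x)

    # ... last_gaze (0..1) would be mapped to a servo angle here ...
```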
Testing showed that this design worked very well at tracking both horizontal eye movements and the opening and closing of the eyelids. However, it did a very poor job of recognizing when the eyes rolled up or down. This was because even slight tilting of the head would throw off the ratios between the detected facial landmarks, making it virtually impossible to determine where the eyes were pointing on the vertical axis.
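To make the failure concrete, here is the sort of naive linear ratio in question, sketched with the same hypothetical landmark indices as above. A small head tilt shifts the eyelid and iris landmarks together, which changes the ratio even when the eyes have not actually moved:

```python
# The naive linear approach that breaks down: estimate vertical gaze
# from the iris center's position between the upper and lower eyelid
# landmarks. "lm" is a Face Mesh landmark list, as in the loop above.
def vertical_gaze_naive(lm):
    """0.0 = iris at the upper lid, 1.0 = iris at the lower lid."""
    upper, lower, iris = lm[159].y, lm[145].y, lm[468].y
    return (iris - upper) / (lower - upper)  # head pitch skews this
```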
Since a linear algorithm could not decode the data correctly in these scenarios, thomas9363 employed a deep learning model to learn the patterns instead. After collecting a dataset and training the model on a laptop, thomas9363 loaded it onto the Raspberry Pi to run inferences. Tests showed that this method sorted out the vertical eye tracking issues.
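The write-up does not name the framework or architecture, so the following is only a sketch of the general recipe: train a small network on labeled landmark features on a laptop, then export it for inference on the Pi. Here a tiny Keras classifier (an assumption, not the author's model) maps eye-region landmark coordinates to an up/center/down label and is converted to TensorFlow Lite, one common way to run a model on a Raspberry Pi 4:

```python
# Sketch: train a small classifier on laptop-collected landmark data,
# then export it as a .tflite file for inference on the Pi.
import numpy as np
import tensorflow as tf

# X: rows of flattened (x, y) eye-region landmark coordinates;
# y: 0 = looking up, 1 = center, 2 = looking down. Placeholder data.
X = np.random.rand(1000, 20).astype(np.float32)
y = np.random.randint(0, 3, size=1000)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=32, verbose=0)

# Convert for the Pi; the .tflite file runs under tflite-runtime.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("vertical_gaze.tflite", "wb").write(tflite_model)
```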
With accurate eye position data available, all that was left to do was translate it into commands for the servo motors. And the final result? Spooktacular! Step-by-step build instructions are available in the project write-up. Even if you do not have a set of robotic eyes on hand, this project could still be useful; the same eye tracking could, for example, control a computer or other electronic devices.
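That final translation step might look something like this minimal sketch, which maps the normalized gaze values onto servo angles via the same ServoKit library as above. The channels and angle limits are again hypothetical:

```python
# Sketch: map normalized gaze (0..1) from the tracker onto servo
# angles. Channel numbers and travel limits are illustrative guesses.
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)
X_AXIS, Y_AXIS, EYELIDS = 0, 1, 2   # hypothetical channel assignments
X_MIN, X_MAX = 60, 120              # safe x-axis travel (assumed)
Y_MIN, Y_MAX = 70, 110              # safe y-axis travel (assumed)

def gaze_to_servos(gaze_x, gaze_y, eyes_open):
    """Drive the robotic eyes from normalized gaze coordinates."""
    kit.servo[X_AXIS].angle = X_MIN + gaze_x * (X_MAX - X_MIN)
    kit.servo[Y_AXIS].angle = Y_MIN + gaze_y * (Y_MAX - Y_MIN)
    kit.servo[EYELIDS].angle = 180 if eyes_open else 0
```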