Beyond Recognition

CamPro uses camera-based obfuscation to thwart facial recognition while still allowing for activity recognition and other applications.

Nick Bild
10 months ago · Security
On-device anti-facial recognition technologies can safeguard privacy (📷: W. Zhu et al.)

Facial recognition technologies have become increasingly prevalent in today's digital landscape, finding applications in various sectors such as law enforcement, retail, finance, and even everyday consumer devices. These technologies utilize advanced algorithms to analyze and identify unique facial features, allowing for swift and accurate identification of individuals. From unlocking smartphones to surveillance cameras in public spaces, facial recognition has become a ubiquitous aspect of modern life.

The widespread adoption of facial recognition, however, has sparked significant concerns about privacy. Critics argue that the deployment of such technology raises serious ethical questions, as it can lead to unwarranted surveillance and the potential misuse of personal information. Governments and organizations employing facial recognition systems often have access to vast databases, raising fears of mass surveillance and erosion of individual privacy.

In response to these concerns, there is a growing trend towards the development of anti-facial recognition measures. One common approach involves the manipulation of facial images after they have been captured, aiming to disrupt the algorithms used by recognition systems. Techniques such as adversarial attacks and image obfuscation attempt to introduce subtle alterations to the facial features, making it challenging for recognition systems to accurately identify individuals. However, a significant drawback of these measures is that the images are manipulated after being captured, leaving room for potential attackers to acquire the unmodified versions and exploit them for facial recognition purposes.

A new twist in the ongoing cat-and-mouse game has just been revealed by a team at Zhejiang University with their anti-facial recognition method called CamPro. In contrast to existing approaches, CamPro leverages the camera itself to obfuscate images, making it impossible for clear facial images to be taken from the device. But despite the obfuscation, the images are still useful — they can be used for a wide range of applications, like person detection and activity recognition, that are needed for many IoT devices.

Typically, a digital camera consists of both an image sensor and an image signal processor. The image sensor captures raw readings representing detected light levels. The signal processor then converts those measurements into an RGB format that makes sense to the human visual system. This signal processor has tunable parameters that allow it to work with different image sensors. The researchers realized that this tunability of parameters might have utility in anti-facial recognition applications.
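To make that pipeline concrete, here is a minimal sketch in Python of two common tunable ISP stages, a color correction matrix and a gamma curve (the function names and values are illustrative, not CamPro's actual pipeline):

```python
# Toy image signal processor: map linear raw sensor values to display RGB.
# This is an illustration of tunable ISP parameters, not CamPro's implementation.
import numpy as np

def simple_isp(raw_rgb: np.ndarray, ccm: np.ndarray, gamma: float) -> np.ndarray:
    """Convert linear raw RGB values (H, W, 3) in [0, 1] to display-ready RGB."""
    # Color correction: mix the sensor's color channels into standard RGB.
    corrected = np.clip(raw_rgb @ ccm.T, 0.0, 1.0)
    # Gamma correction: compress linear values for displays and human vision.
    return corrected ** (1.0 / gamma)

# Example: identity color matrix and a common sRGB-like gamma of 2.2.
raw = np.random.rand(4, 4, 3)          # stand-in for demosaiced raw sensor data
rgb = simple_isp(raw, np.eye(3), 2.2)
print(rgb.shape)
```

Both stages are ordinarily tuned so the output looks natural to people, which is exactly the behavior CamPro deliberately bends.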

They focused on the gamma correction and color correction matrix parameters of the signal processor. Adjusting these parameters can distort the facial features that recognition systems rely on, but finding settings that consistently fool those systems is challenging. So the team designed an adversarial learning framework to determine the optimal adjustments to the signal processor's parameters.
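The paper's actual framework is a learned adversarial pipeline, which the hedged sketch below does not reproduce. It uses a simple random search, a placeholder face-matching score, and a placeholder utility penalty purely to illustrate the idea of searching over gamma and color correction values for a setting that degrades face matching without destroying the image:

```python
# Conceptual sketch of adversarially tuning ISP parameters (not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def face_match_score(image: np.ndarray) -> float:
    """Placeholder for a real face-recognition similarity score."""
    return float(image.mean())          # toy proxy, not a real model

def utility_penalty(image: np.ndarray, reference: np.ndarray) -> float:
    """Rough proxy for how much useful image content has been lost."""
    return float(np.abs(image - reference).mean())

raw = rng.random((8, 8, 3))             # stand-in raw capture
best_params, best_loss = None, np.inf
for _ in range(200):
    gamma = rng.uniform(0.5, 4.0)
    ccm = np.eye(3) + rng.normal(0, 0.3, (3, 3))
    candidate = np.clip(raw @ ccm.T, 0, 1) ** (1.0 / gamma)
    # Lower face-match score is better; heavy distortion is penalized.
    loss = face_match_score(candidate) + 0.5 * utility_penalty(candidate, raw)
    if loss < best_loss:
        best_params, best_loss = (gamma, ccm), loss

print("chosen gamma:", round(best_params[0], 2))
```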

After making these adjustments, the team found that the images were indeed resistant to facial recognition algorithms, but they were a bit too garbled to be of use for many applications. Accordingly, the team trained an image enhancement algorithm to restore the image's quality and make it suitable for tasks like activity recognition. Crucially, this step was not able to restore facial recognition capabilities.
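A rough sketch of that idea appears below, using PyTorch with made-up shapes, models, and data: a small enhancement network is trained against a downstream task loss (here, a toy activity classifier) rather than any face-reconstruction objective, so sharpening the image for the task does not mean recovering identity cues. None of this reflects the paper's actual architecture or losses.

```python
# Illustrative enhancement training loop (toy models and data, not CamPro's).
import torch
import torch.nn as nn

enhancer = nn.Sequential(                # tiny stand-in for the real model
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
task_head = nn.Sequential(               # stand-in activity classifier
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(3, 5),
)
optimizer = torch.optim.Adam(enhancer.parameters(), lr=1e-3)

obfuscated = torch.rand(8, 3, 64, 64)    # batch of obfuscated captures (fake data)
labels = torch.randint(0, 5, (8,))       # toy activity labels

for _ in range(10):
    enhanced = enhancer(obfuscated)
    # Optimize only for the downstream task, not for facial detail.
    loss = nn.functional.cross_entropy(task_head(enhanced), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print(float(loss))
```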

In the team's experiments, a variety of facial recognition algorithms correctly identified CamPro images in only 0.3% of cases. Anticipating the next move of malicious hackers, the researchers also retrained a facial recognition algorithm on manipulated images captured by CamPro, giving the attacker full knowledge of how the obfuscation technique works. Even this retraining had little impact on the anti-facial recognition technique.
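For intuition, the hedged sketch below shows what such an adaptive attack looks like in outline: an attacker who knows the ISP settings applies the same obfuscation to their own training data and fine-tunes a recognizer on it. The obfuscate() function, model, and data are placeholders, and the 0.3% figure comes from the paper's experiments, not from this code.

```python
# Sketch of an adaptive (white-box) attack: retrain a recognizer on obfuscated data.
import torch
import torch.nn as nn

def obfuscate(images: torch.Tensor) -> torch.Tensor:
    """Stand-in for the known camera-side transform (color mixing + gamma)."""
    return torch.clamp(images, 0, 1) ** (1.0 / 3.0)

recognizer = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
optimizer = torch.optim.SGD(recognizer.parameters(), lr=0.01)

faces = torch.rand(16, 3, 32, 32)        # attacker's labeled face dataset (fake)
identities = torch.randint(0, 10, (16,))

for _ in range(5):
    # Train on images passed through the same obfuscation the camera applies.
    logits = recognizer(obfuscate(faces))
    loss = nn.functional.cross_entropy(logits, identities)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```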

As it presently stands, CamPro appears to offer strong protection against facial recognition in applications where only coarser-grained detection capabilities are needed. Of course, despite the researchers' best efforts, that may change in the future. Malicious hackers are a crafty bunch, and the cat-and-mouse game seems to go on forever. If you want to protect your privacy without relying on someone else's hardware to do it, you might be interested in checking out Freedom Shield.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.