Cloaking Photos—A New Privacy Measure?

Facial Recognition and Privacy Concerns Spur Image Cloaking Options

With the introduction of artificial intelligence platforms like ChatGPT, the amount of data online being analyzed, stored, and reproduced is staggering. Now, more than ever, personal data is at risk of becoming highly visible online and circulated for public consumption. From personal blogs to online photo albums, nothing is sacred anymore. And as facial recognition and the collection of biometric data become more commonplace, citizens have little in their arsenal to protect their privacy.

Fawkes Image Cloaking – A New Security Measure

Fortunately, a project in development at the University of Chicago aims to change that. Known as Fawkes, the tool was built by a team of PhD students and professors around a complex algorithm that helps citizens protect their privacy from facial recognition online. Designed to run client-side, locally on a user’s own computer, Fawkes injects minuscule changes into personal images that make the photographs read differently to machines looking to “identify” an individual. Referred to as “image cloaking,” these pixel-level changes are imperceptible to the human eye, yet they prevent facial recognition software from “recognizing” the real person based on the cloaked photos.
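
To make the mechanism concrete, the sketch below illustrates the general idea of a pixel-level perturbation in Python, using the NumPy and Pillow libraries. The file names and the per-pixel budget are hypothetical, and the random noise is only an analogy: Fawkes computes its changes through a targeted optimization against facial-feature extractors, not at random.

    # Illustrative sketch only: adds a small, clipped pixel-level change to a
    # photo. This random noise is NOT the Fawkes algorithm, just an analogy
    # for how imperceptibly small per-pixel edits can be.
    import numpy as np
    from PIL import Image

    def cloak_sketch(path_in: str, path_out: str, budget: int = 3) -> None:
        """Shift each pixel by at most `budget` intensity levels (out of 255),
        small enough to be effectively invisible to the human eye."""
        img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
        noise = np.random.randint(-budget, budget + 1, size=img.shape)
        cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
        Image.fromarray(cloaked).save(path_out)

    # Hypothetical file names for illustration.
    cloak_sketch("portrait.jpg", "portrait_cloaked.png")

Note that the sketch writes a PNG, a lossless format; lossy compression such as JPEG could itself disturb changes this small.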

Growing Use of Facial Recognition and Fawkes Image Cloaking

Named for the Guy Fawkes mask popularized by V for Vendetta, the tool has tested well against sophisticated facial recognition platforms from technology giants such as Microsoft and Amazon. The Fawkes team notes that cloaked images can still be collected and used for model training; the catch, for trackers, is that models trained on them fail to recognize the real person. The team believes the need for such a privacy tool is especially timely, as recent reports revealed that unregulated facial recognition services have downloaded over 3 billion personal photographs of people from social media and other websites without the knowledge or permission of their owners.
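
For readers curious how a claim like this could be checked, the sketch below outlines one way to run a similar test against Amazon Rekognition using Python and the boto3 SDK, assuming AWS credentials are already configured. The collection name and file names are hypothetical, and this is our illustration of the testing idea, not the Fawkes team’s actual evaluation harness.

    # Sketch: enroll cloaked photos as if a tracker had scraped them, then
    # query with an uncloaked original. If cloaking held, Rekognition should
    # return no confident match for the real face.
    import boto3

    rekognition = boto3.client("rekognition")
    COLLECTION = "cloak-test"  # hypothetical collection name

    def read_bytes(path: str) -> bytes:
        with open(path, "rb") as f:
            return f.read()

    # 1. Build the tracker's gallery from cloaked images (hypothetical files).
    rekognition.create_collection(CollectionId=COLLECTION)
    for path in ["cloaked_1.png", "cloaked_2.png"]:
        rekognition.index_faces(CollectionId=COLLECTION,
                                Image={"Bytes": read_bytes(path)})

    # 2. Search with the real, uncloaked photo.
    resp = rekognition.search_faces_by_image(
        CollectionId=COLLECTION,
        Image={"Bytes": read_bytes("original.jpg")},
        FaceMatchThreshold=80,
    )
    print("Matches:", resp["FaceMatches"])  # empty or weak matches = cloak held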

As no current state or federal regulations specifically address these concerns, a citizen’s best bet may be to follow and utilize software tools like Fawkes. While the project is still in development, it will be interesting to see how the copyrights in photographs may be affected once Fawkes’s minuscule changes are injected into protected works through licensed use of the software.

Key Takeaways on Fawkes Image Cloaking Technology

The SAND Lab at the University of Chicago is currently working on an important new privacy tool to combat facial recognition models trained on the unchecked collection of photographs online. This software:

  • Works by injecting minute changes into digital photographs so that facial recognition software cannot match the photos to your true self;

  • Highlights growing privacy concerns about the unregulated collection of photographs for facial recognition; and

  • Raises potential questions of copyright law, as deliberate alterations are made to digital photographs.

For more information about data privacy, see our Technology Law Services and Industry Focused Legal Solutions pages.