Worried About Privacy for Your Selfies? These Tools Can Help Spoof Facial Recognition AI

Fawkes and LowKey are two tools that leverage adversarial attacks to spoof AI, preventing detection by facial recognition software.

Photo Credit: University of Chicago/SAND Lab

Fawkes introduces pixel-level alterations to images, thwarting recognition by AI

Highlights
  • Clearview and AWS Rekognition are examples of facial recognition software
  • Such software can be duped by using adversarial attacks
  • Two methods to spoof such AI were detailed at a conference recently
Ever wondered what happens to a selfie you upload to a social media site? Activists and researchers have long warned about data privacy, pointing out that photographs uploaded to the Internet may be used to train artificial intelligence (AI) powered facial recognition tools. These AI-enabled tools (such as Clearview, AWS Rekognition, Microsoft Azure, and Face++) could in turn be used by governments or other institutions to track people and even draw conclusions about a subject's religious or political preferences. Researchers have now come up with ways to stop these AI tools from recognising, or even detecting, a face in a selfie by using adversarial attacks – ways of altering input data that cause a deep-learning model to make mistakes.
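For readers curious about what an adversarial attack looks like in practice, the best-known textbook example is the fast gradient sign method (FGSM), which nudges every pixel slightly in the direction that increases a model's error. The sketch below illustrates that general idea only; it is not the technique used by Fawkes or LowKey, and the stand-in model, random input, and epsilon value are arbitrary choices for the example.

```python
import torch
import torchvision.models as models

# Pretrained ImageNet classifier used purely as a stand-in target model.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

def fgsm_perturb(image, label, epsilon=0.01):
    """Add a small, nearly invisible perturbation that nudges the model
    towards misclassifying the image (fast gradient sign method)."""
    image = image.clone().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), label)
    loss.backward()
    # Step each pixel slightly in the direction that increases the loss.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0, 1).detach()

# Usage: a random stand-in image tensor in [0, 1] and an arbitrary label.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([0])
x_adv = fgsm_perturb(x, y)
```

Because the perturbation is capped at a tiny per-pixel change, the altered image looks identical to a person while the model's prediction can flip entirely.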

Two of these methods were presented last week at the International Conference on Learning Representations (ICLR), a leading AI conference that was held virtually. According to a report by MIT Technology Review, most of these new tools for duping facial recognition software make tiny changes to an image that are invisible to the human eye but can confuse an AI, causing the software to misidentify the person or object in the image, or even stopping it from realising that the image is a selfie at all.

Emily Wenger, from the University of Chicago, developed one of these 'image cloaking' tools, called Fawkes, with her colleagues. The other, called LowKey, was developed by Valeriia Cherepanova and her colleagues at the University of Maryland.

Fawkes adds pixel-level disturbances to images that stop facial recognition systems from identifying the people in them, while leaving the images virtually unchanged to the human eye. In an experiment with a small data set of 50 images, Fawkes was found to be 100 percent effective against commercial facial recognition systems. Fawkes can be downloaded for Windows and Mac, and its method was detailed in a paper titled 'Protecting Personal Privacy Against Unauthorized Deep Learning Models'.

However, the authors note that Fawkes can't mislead existing systems that have already been trained on your unprotected images. LowKey expands on Wenger's system by minutely altering images just enough to fool pretrained commercial AI models, preventing them from recognising the person in the image. LowKey, detailed in a paper titled 'Leveraging Adversarial Attacks to Protect Social Media Users From Facial Recognition', is available for use online.
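The broad intuition behind such cloaking tools is to push a photo's "embedding" – the numerical fingerprint a recognition model extracts from a face – away from where it would normally sit, while keeping pixel changes within a near-invisible budget. The following is only a conceptual sketch of that idea, assuming a generic ImageNet-trained ResNet as a stand-in feature extractor; the actual Fawkes and LowKey implementations target face-recognition feature extractors and use their own, more sophisticated optimisation procedures.

```python
import torch
import torchvision.models as models

# Generic ImageNet-trained ResNet with its classifier head removed, used as a
# stand-in feature extractor; the real tools target face-recognition embeddings.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
extractor = torch.nn.Sequential(*list(backbone.children())[:-1])
extractor.eval()
for p in extractor.parameters():
    p.requires_grad_(False)

def cloak(image, steps=20, step_size=0.005, budget=0.03):
    """Iteratively nudge pixels so the image's embedding drifts away from the
    original, while keeping every pixel change within a small budget."""
    with torch.no_grad():
        original_feat = extractor(image).flatten(1)
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        feat = extractor(image + delta).flatten(1)
        similarity = torch.nn.functional.cosine_similarity(feat, original_feat).mean()
        similarity.backward()
        with torch.no_grad():
            # Step down the similarity gradient: push the embedding away.
            delta -= step_size * delta.grad.sign()
            delta.clamp_(-budget, budget)  # keep the change near-invisible
            delta.grad.zero_()
    return (image + delta).clamp(0, 1).detach()

# Usage: a random stand-in photo tensor in [0, 1].
photo = torch.rand(1, 3, 224, 224)
cloaked = cloak(photo)
```

A system that later scrapes the cloaked photo learns a distorted fingerprint for the face, which is why an unmodified photo of the same person no longer matches it.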

Yet another method, detailed in a paper titled 'Unlearnable Examples: Making Personal Data Unexploitable' by Daniel Ma and other researchers at Deakin University in Australia, takes such 'data poisoning' one step further, introducing changes to images that effectively make an AI model ignore them during training, so they contribute nothing the trained model can later use.
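Where an adversarial attack makes an example harder for a model, this approach does the opposite: it adds 'error-minimising' noise that makes the example look trivially easy, so training on it yields almost no learning signal. The sketch below is a heavily simplified, hypothetical illustration of that idea against a single fixed model; the paper's actual procedure alternates the noise updates with model training.

```python
import torch
import torchvision.models as models

# Freshly initialised classifier standing in for a model someone might try to
# train on scraped photos (10 classes chosen arbitrarily for the example).
model = models.resnet18(num_classes=10)
model.eval()

def error_minimising_noise(image, label, steps=20, step_size=0.005, budget=0.03):
    """Craft a small perturbation that makes the example trivially 'easy',
    so training on it provides almost no useful signal."""
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        loss = torch.nn.functional.cross_entropy(model(image + delta), label)
        loss.backward()
        with torch.no_grad():
            # Unlike an adversarial attack, step *down* the loss gradient.
            delta -= step_size * delta.grad.sign()
            delta.clamp_(-budget, budget)
            delta.grad.zero_()
        model.zero_grad()
    return (image + delta).clamp(0, 1).detach()

# Usage: a random stand-in image and an arbitrary class label.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([3])
x_unlearnable = error_minimising_noise(x, y)
```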

Wenger notes that Fawkes was briefly unable to trick Microsoft Azure, saying, “It suddenly somehow became robust to cloaked images that we had generated… We don't know what happened.” She said it was now a race against the AI, and Fawkes has since been updated to spoof Azure again. “This is another cat-and-mouse arms race,” she added.

The report also quoted Wenger as saying that while regulation against such AI tools will help maintain privacy, there will always be a “disconnect” between what is legally acceptable and what people want, and that spoofing methods like Fawkes can help “fill that gap”. She said her motivation for developing the tool was simple: to give people “some power” that they didn't already have.
