
Disguise your photos with this AI privacy tool to fool facial recognition



Ubiquitous facial recognition is a serious threat to privacy. The idea that the photos we share are being collected by companies to train commercially sold algorithms is unsettling. Anyone can buy these tools, snap a photo of a stranger, and find out who they are in seconds. Researchers, however, have come up with a clever way to fight back.

The solution is a tool called Fawkes, developed by scientists at the SAND Lab at the University of Chicago. Named after the Guy Fawkes masks worn by the revolutionaries of the V for Vendetta comic book and film, Fawkes uses artificial intelligence to subtly, almost imperceptibly, alter your photos in order to fool facial recognition systems.

The way the software works is somewhat complex. Running your photos through Fawkes doesn't make you invisible to facial recognition. Instead, the software makes minor changes to your photos so that any algorithm scanning those images in the future will see you as a different person altogether. Essentially, running Fawkes on your photos is like adding an invisible mask to your selfies.
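The actual Fawkes algorithm optimizes perturbations against real face-recognition feature extractors, as described in the team's paper. The toy sketch below only illustrates the core idea: nudge an image, within a tiny per-pixel budget, so that its "embedding" drifts toward a different identity. The random linear map standing in for a face-recognition model, and all names and numbers here, are illustrative assumptions, not Fawkes' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(img, W):
    """Toy stand-in for a face-recognition feature extractor:
    a fixed linear map from pixel values to an embedding vector."""
    return W @ img.ravel()

# A tiny 8x8 grayscale "photo", a second identity to mimic,
# and a fixed toy embedding model.
W = rng.normal(size=(16, 64))
photo = rng.uniform(size=(8, 8))
target = rng.uniform(size=(8, 8))

def cloak(img, target_emb, W, budget=0.03, lr=0.005, steps=300):
    """Projected gradient descent: pull the image's embedding toward
    target_emb while clipping every pixel change to a small budget,
    so the edit stays visually negligible."""
    x = img.copy()
    for _ in range(steps):
        diff = embed(x, W) - target_emb
        grad = (W.T @ diff).reshape(img.shape)  # gradient of 0.5*||e - t||^2
        x -= lr * grad
        x = np.clip(x, img - budget, img + budget)  # stay near the original
        x = np.clip(x, 0.0, 1.0)                    # stay a valid image
    return x

cloaked = cloak(photo, embed(target, W), W)

pixel_change = np.abs(cloaked - photo).max()  # capped by the budget
emb_shift = np.linalg.norm(embed(cloaked, W) - embed(photo, W))
print(f"max pixel change: {pixel_change:.4f}")
print(f"embedding shift:  {emb_shift:.4f}")
```

The point of the sketch is the asymmetry it demonstrates: the pixels barely move, yet the embedding a recognizer would store moves measurably toward the decoy identity.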

Scientists call this process “cloaking,” and it is intended to poison the resource facial recognition systems need to function: databases of faces scraped from social media. The facial recognition firm Clearview AI, for example, claims to have collected some three billion images of faces from websites such as Facebook, YouTube, and Venmo, which it uses to identify strangers. But if the photos you share online have been run through Fawkes, the researchers say, the face the algorithms learn won't actually be your own.

According to the University of Chicago team, Fawkes is 100 percent successful against state-of-the-art facial recognition services from Microsoft (Azure Face), Amazon (Rekognition), and Face++ from Chinese tech giant Megvii.

“We essentially use the cloaked photo like a Trojan horse, to corrupt unauthorized models so they learn the wrong thing about what makes you look like you and not someone else,” said Ben Zhao, a professor of computer science at the University of Chicago who helped develop the Fawkes software, in an interview with The Verge. “Once the corruption happens, you are continuously protected no matter where you go or where you’re seen.”

You would hardly recognize her: photos of Queen Elizabeth II before (left) and after (right) being run through the Fawkes cloaking software.
Image: The Verge

The group behind the work (Shawn Shan, Emily Wenger, Jiayun Zhang, Huiying Li, Haitao Zheng, and Ben Y. Zhao) published a paper on the algorithm earlier this year. Late last month, however, they also released Fawkes as free software for Windows and macOS that anyone can download and use. To date, it has reportedly been downloaded more than 100,000 times.

In our own tests, we found that Fawkes is basic in design but straightforward to use. It takes a few minutes to process each image, and the changes it makes are mostly imperceptible. Earlier this week, The New York Times published a story on Fawkes in which it found the cloaking effect quite obvious, often making gendered changes to images such as giving women mustaches. The Fawkes team says the updated algorithm is much more subtle, however, and The Verge's own tests agree.

But is Fawkes a silver bullet for privacy? That's doubtful. First, there's the problem of adoption. If you read this article and decide to use Fawkes to cloak the photos you upload to social media in the future, you're definitely in the minority. Facial recognition is worrying because it's a society-wide trend, and so the solution needs to be society-wide, too. If only the tech-savvy shield their selfies, it just creates inequality and discrimination.

Second, many companies that sell facial recognition algorithms built their face databases long ago, and you can't retroactively take that information back. Clearview's CEO, Hoan Ton-That, told the Times as much. “There are billions of unmodified photos on the internet, all on different domain names,” said Ton-That. “In practice, it's almost certainly too late to perfect a technology like Fawkes and deploy it at scale.”

A comparison of uncloaked and cloaked faces produced by Fawkes.
Image: SAND Lab, University of Chicago

Naturally, the team behind Fawkes disagrees with this assessment. They note that while companies like Clearview claim to have billions of photos, that doesn't mean much when you consider they're supposed to identify hundreds of millions of users. “For many people, Clearview probably has a very small number of publicly available photos,” says Zhao. And if people post more cloaked photos in the future, he says, sooner or later the cloaked images will outnumber the uncloaked ones.

On the subject of adoption, though, the Fawkes team acknowledges that for their software to make a real difference, it has to be released more widely. They have no plans to build a web or mobile app due to security concerns, but they hope companies like Facebook might integrate similar technology into their own platforms in the future.

Integrating this technology would be in these companies' interest, says Zhao. Firms like Facebook don't want people to stop sharing photos, and they could still collect the data they need from images (for features such as photo tagging) before cloaking them on the public web. Integrating the technology might have little effect on current users, but it could help convince future, privacy-conscious generations to sign up to these platforms.

“Adoption by larger platforms, e.g. Facebook or others, could in time have a crippling effect on Clearview by basically making [their technology] so ineffective that it will no longer be useful or financially viable as a service,” says Zhao. “Clearview.ai going out of business because it’s no longer relevant or accurate is something we would be satisfied [with] as an outcome of our work.”

