Deepfake Tech Can Now Anonymize Your Face to Protect Privacy

Deepfake videos have found applications in entertainment, some acceptable and some controversial, but the generative adversarial networks (GANs) behind them still have a long way to go before they offer convincing results. That gap has left us with few practical applications and plenty of paranoia, but we're beginning to see efforts to employ deepfake technology in ways that help people protect themselves. A recent paper published at the International Symposium on Visual Computing demonstrates how deepfakes could help protect the right to privacy before they become a tool used primarily to cause harm.

The paper uses face-swapping to anonymize the speaker's appearance. The authors were not the first to consider this application, but earlier work simply transplanted expressions onto the existing face of someone who had consented to the swap. This new method instead replaces a person's face with a unique one generated by a network trained on a data set of 1.5 million face images. In theory, the new face won't match anyone in reality.
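To make the idea concrete, here is a minimal sketch of how such an anonymization pipeline could be wired together. This is not the authors' implementation; `FaceDetector`-style and `FaceGenerator`-style objects are hypothetical stand-ins for a facial-keypoint detector and a conditional GAN generator trained on a large face data set.

```python
# A minimal sketch of GAN-based face anonymization, not the paper's actual code.
# `detector` and `generator` are hypothetical stand-ins for a keypoint detector
# and a conditional GAN generator trained on a large face dataset.
import numpy as np

def anonymize_image(image: np.ndarray, detector, generator) -> np.ndarray:
    """Replace every detected face with a synthetic one, keeping pose and expression."""
    output = image.copy()
    for face in detector.detect(image):  # hypothetical: returns bounding boxes + keypoints
        # The generator is conditioned only on the surrounding context and the
        # facial keypoints, never on the original face pixels, so it preserves
        # pose and expression without leaking identity.
        synthetic = generator.generate(
            context=output[face.y0:face.y1, face.x0:face.x1],
            keypoints=face.keypoints,
        )
        output[face.y0:face.y1, face.x0:face.x1] = synthetic
    return output
```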

Image credit: DeepPrivacy

While the GAN produces suitable results for photos, it still struggles with replacing faces in video. This is likely because the network has to generate a “new” face for each frame. Maintaining consistency for a non-existent face isn’t an easy task in theory or in practice.
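Here is a minimal sketch of what per-frame anonymization looks like in practice, using OpenCV for video I/O; `anonymize_frame` stands in for any single-image face replacement, such as the sketch above. Because nothing ties one frame's generated face to the next, the output tends to flicker.

```python
# Minimal sketch: anonymizing a video one frame at a time with OpenCV.
# Each frame is processed independently (no temporal conditioning), which is
# exactly why keeping the synthetic face consistent across frames is hard.
import cv2

def anonymize_video(src_path: str, dst_path: str, anonymize_frame) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # anonymize_frame is any per-frame face replacement; frame t and frame t+1
        # get independently generated faces, hence the flicker.
        writer.write(anonymize_frame(frame))
    cap.release()
    writer.release()
```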

Image credit: DeepPrivacy

For the purpose of anonymizing a subject in a video, however, a glitchy look doesn't matter too much. After all, the purpose of this GAN isn't to fool anyone but rather to obscure a person's face without losing their expression. Blocking out a person's face with a box (as seen on the left side of the GIF above) prevents identification, but it also hides most of what they're trying to communicate.
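For comparison, that crude baseline is easy to reproduce with OpenCV: detect the face and draw a filled box over it. The calls below are real OpenCV APIs, but this is only an illustrative sketch. Identity is hidden, and so is everything the subject's expression was communicating.

```python
# Baseline masking: cover every detected face with a solid black box.
import cv2

def black_box_faces(frame):
    """Hide detected faces behind filled rectangles; expression is lost along with identity."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), color=(0, 0, 0), thickness=-1)
    return frame
```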

In circumstances where anonymity is vital but expression still matters, such as disguising sources in news reports or documentary films whose identities would put them at risk, this method could be put to use today. Its most notable issues are glitches that occur in poor lighting or when the subject moves significantly. Further work will likely resolve these problems, and the method's most obvious use cases rarely run into them anyway: interview subjects typically don't move much, and lighting conditions are controllable more often than not. Besides, when it comes to correcting poor lighting, there's already an AI for that as well.

We can already forge voices with enough precision to impersonate other people, to the tune of $243,000 in theft, so anonymizing voices doesn't add another hurdle. We've never needed artificial intelligence to alter a voice, and more thorough processes for vocal anonymity exist as well. Now we have a good start with video. If you want to try it for yourself, you can access the source code on GitHub.
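As a rough illustration of the voice point, a simple pitch shift, with no AI involved, is enough to disguise a voice superficially. This sketch assumes the librosa and soundfile packages and a hypothetical interview.wav file; it is not a robust anonymization pipeline.

```python
# Non-AI voice alteration: a simple pitch shift with librosa.
# A superficial disguise only; determined analysis could still de-anonymize it.
import librosa
import soundfile as sf

audio, sample_rate = librosa.load("interview.wav", sr=None)  # keep the original sample rate
shifted = librosa.effects.pitch_shift(audio, sr=sample_rate, n_steps=-4)  # down four semitones
sf.write("interview_disguised.wav", shifted, sample_rate)
```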
