People Are Using a Neural Network App to Create Fake Celebrity Porn

Many of the most magical pieces of consumer technology we have today owe their existence to advances in neural networks and machine learning. We already have impressive object recognition in photos and convincing speech synthesis, and in a few years cars may drive themselves. Machine learning has now advanced to the point that a handful of developers have created a tool called FakeApp that can produce convincing “face swap” videos. And of course, people are using it to make porn. This is the internet, so no surprise there.

FakeApp is based on a deep learning face-swapping algorithm developed by a Reddit user known as Deepfakes. The tool is available for download and mirrored all over the internet, but setup is non-trivial. You need to download and configure Nvidia’s CUDA framework to run the TensorFlow code, so the app requires a GeForce GPU. If you don’t have a powerful GPU, good luck finding one for a reasonable price. The video you want to alter also needs to be split into individual frames, and you need a significant number of photos to train FakeApp on the face you want to insert.
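
To give a sense of what that frame-splitting step looks like in practice, here is a minimal Python sketch using OpenCV. The file names and paths are placeholders, and FakeApp bundles its own preprocessing tools, so treat this as an illustration of the kind of work involved rather than the app’s actual pipeline.

```python
# Rough sketch of the frame-extraction step described above, using OpenCV.
# File names here are placeholders; FakeApp's own tooling may differ.
import os
import cv2  # pip install opencv-python


def extract_frames(video_path: str, out_dir: str) -> int:
    """Split a video into individual still frames saved as PNG files."""
    os.makedirs(out_dir, exist_ok=True)
    capture = cv2.VideoCapture(video_path)
    count = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of video (or a read error)
            break
        cv2.imwrite(os.path.join(out_dir, f"frame_{count:06d}.png"), frame)
        count += 1
    capture.release()
    return count


if __name__ == "__main__":
    n = extract_frames("source_clip.mp4", "frames")
    print(f"Extracted {n} frames")
```

The trained model then processes those stills one at a time, and the altered frames are stitched back together into the final video.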

The end result is a video with the original face replaced by a new one. The quality of the face swap varies based on how the neural network was trained — some are little more than face-shaped blobs, but others are extremely, almost worryingly, convincing. See below for a recreation of the CGI Princess Leia from Rogue One made in FakeApp. The top image is from the film and the bottom was made in FakeApp in about 20 minutes, according to the poster. There’s no denying the “real” version is better, but the fake one is impressive when you consider how it was made.
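
The original Deepfakes code is widely described as a pair of autoencoders that share a single encoder but use a separate decoder for each face; that detail isn’t in this article, so the sketch below is an assumption about the general technique, and the layer sizes, resolution, and names are purely illustrative.

```python
# Heavily simplified sketch of the shared-encoder / two-decoder autoencoder
# design commonly attributed to the Deepfakes approach. All layer sizes and
# the 64x64 input resolution are illustrative guesses, not FakeApp's model.
from tensorflow.keras import layers, Model


def build_encoder() -> Model:
    inp = layers.Input(shape=(64, 64, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(256, activation="relu")(x)
    return Model(inp, latent, name="shared_encoder")


def build_decoder(name: str) -> Model:
    inp = layers.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(inp)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=name)


# One encoder learns a face-generic representation; each decoder learns to
# reconstruct one specific person. A swap is produced by encoding person A's
# frame and decoding it with person B's decoder.
encoder = build_encoder()
decoder_a = build_decoder("decoder_a")
decoder_b = build_decoder("decoder_b")

face_in = layers.Input(shape=(64, 64, 3))
autoencoder_a = Model(face_in, decoder_a(encoder(face_in)))
autoencoder_b = Model(face_in, decoder_b(encoder(face_in)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")
```

The intuition is that the shared encoder sees faces from both people and learns features common to faces in general, while each decoder learns to render one specific identity, which is why swapping decoders produces a face swap, and why the result improves with more and better training photos.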

The first impulse for those with the time and inclination to get FakeApp working was to create porn with their favorite celebrity swapped in for the actual performer. We won’t link to any of those, but suffice it to say that a great deal of this content has appeared in the two weeks or so FakeApp has been available. Critics are already coming out strongly against the use of this technology to make fake porn, but FakeApp’s users claim it’s no more damaging than the fake still images people have been creating in Photoshop for decades.

The real power, and potential danger, of this technology isn’t the porn. What if a future version of this technique becomes so powerful it’s indistinguishable from real footage? All the face swaps from FakeApp have at least a little distortion or flicker, but this is just a program developed by a few people on Reddit. With more resources behind it, this kind of neural network could produce far more convincing fakes.
