Intel’s AI Can Detect DeepFakes With 96 Percent Accuracy


Deepfakes wasted no time becoming an internet-wide problem. Within just a few years, these manipulated videos have found their way into web comedy, political misinformation, fake job interviews, and even pornography made without the depicted person’s consent. Deepfakes are likely to become even more convincing as artificial intelligence (AI) technology develops, leading some companies to seek out or build their own deepfake-detecting techniques.

Intel’s FakeCatcher system does just that, and with surprising accuracy, too. The company says FakeCatcher is able to differentiate videos of real people from deepfakes with 96 percent accuracy in a matter of milliseconds. How does it do it? By looking for the stuff that truly brings us to life: blood.

FakeCatcher seeks out minuscule signs of blood flow in the pixels of a potential deepfake video. As our hearts pump blood, our veins change color. This phenomenon is integral to photoplethysmography (PPG), a technique typically used to monitor a person’s blood flow. Using PPG, FakeCatcher captures blood flow signals from across a video subject’s face. The tool then translates those signals into spatiotemporal maps, which deep learning algorithms compare against human PPG activity to determine whether the subject of the video is real or fake.
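Intel has not published FakeCatcher’s internals, but the PPG idea described above can be illustrated with a toy sketch: average a face region’s green channel per frame to get a pulse-like signal, find its dominant frequency with a naive DFT, and check whether it falls in a plausible human heart-rate range. Everything here — the frame format, the thresholds, the synthetic “video” — is a hypothetical stand-in, not Intel’s method.

```python
import math

def ppg_signal(frames):
    """Mean green-channel intensity per frame.
    Each frame is a list of (r, g, b) pixel tuples (hypothetical input format)."""
    return [sum(p[1] for p in f) / len(f) for f in frames]

def dominant_frequency(signal, fps):
    """Strongest frequency (Hz) in the signal via a naive DFT, skipping DC."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2):
        re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fps / n

def plausible_pulse(freq_hz):
    """A human pulse roughly spans 40-180 beats per minute (assumed bounds)."""
    return 40 <= freq_hz * 60 <= 180

# Synthetic "video": 10 s at 30 fps, skin tone oscillating at 1.2 Hz (72 bpm).
fps, seconds = 30, 10
frames = [[(120, 100 + 5 * math.sin(2 * math.pi * 1.2 * t / fps), 90)] * 4
          for t in range(fps * seconds)]
freq = dominant_frequency(ppg_signal(frames), fps)
print(round(freq, 2), plausible_pulse(freq))  # prints "1.2 True"
```

A real system would instead build spatiotemporal PPG maps across many facial regions and feed them to a trained deep learning classifier, since a synthesized face can fake a single plausible frequency far more easily than spatially consistent blood-flow patterns.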

Intel engineers used several of the company’s own software products to create FakeCatcher, such as OpenVINO as the tool’s deep learning interface and OpenCV for real-time detection. FakeCatcher runs on 3rd Gen Intel Xeon Scalable processors, which Intel says enable the tool to run up to 72 separate detection streams at any given time. Depending on how and where FakeCatcher is deployed, this kind of capacity could help squash conspiracy theories, scams, and maliciously made adult content before they reach wider audiences. (Intel didn’t specify where exactly FakeCatcher will be going now that it’s passed testing, but did mention it could find a place on social media, news, and nonprofit websites.)

FakeCatcher represents Intel’s leap into a growing effort to stop deepfakes in their tracks. Earlier this year, Google banned deepfake training on its Colab service amid a rise in misinformation and heavy computing resource consumption. Just two weeks later, the European Union announced that it would take action against Meta, Twitter, and other tech giants if they didn’t make a demonstrable effort to combat deepfakes as well.
