Intel’s AI Can Detect DeepFakes With 96 Percent Accuracy

Deepfakes wasted no time becoming an internet-wide problem. Within just a few years, these manipulated videos have been used for web comedy, political misinformation, fake job interviews, and even pornography made without the depicted person’s consent. Deepfakes are only likely to become more convincing as artificial intelligence (AI) technology develops, which has led some companies to seek out or build their own deepfake-detection tools.

Intel’s FakeCatcher system does just that, and with surprising accuracy, too. The company says FakeCatcher is able to differentiate videos of real people from deepfakes with 96 percent accuracy in a matter of milliseconds. How does it do it? By looking for the stuff that truly brings us to life: blood.

FakeCatcher seeks out minuscule signs of blood flow in the pixels of a potential deepfake video. As our hearts pump blood, our veins change color. This phenomenon is integral to photoplethysmography (PPG), a technique typically used to monitor a person’s blood flow. Using PPG, FakeCatcher captures blood flow signals from across a video subject’s face. The tool then translates those signals into spatiotemporal maps, which deep learning algorithms compare against human PPG activity to determine whether the subject of the video is real or fake.
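
Intel hasn’t published FakeCatcher’s internals, but the general idea behind a PPG-based spatiotemporal map can be sketched in a few lines of Python with OpenCV. Everything specific here (the Haar face detector, the 8x8 patch grid, the green-channel shortcut, the file name) is an illustrative assumption rather than Intel’s actual method:

```python
# A rough sketch of PPG-style signal extraction; not Intel's actual pipeline.
# Assumes OpenCV (cv2) and NumPy; the video path and grid size are placeholders.
import cv2
import numpy as np

def ppg_spatiotemporal_map(video_path, grid=8):
    """Return a (grid*grid, n_frames) map of mean green-channel values per face patch."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    columns = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.1, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]
        face = frame[y:y + h, x:x + w]
        # Split the face into a grid of patches; blood-volume changes show up
        # as tiny shifts in the average color of each patch over time.
        patches = []
        ph, pw = h // grid, w // grid
        for i in range(grid):
            for j in range(grid):
                patch = face[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw]
                patches.append(patch[:, :, 1].mean())  # green channel carries most of the PPG signal
        columns.append(patches)
    cap.release()
    return np.array(columns).T  # rows = face patches, columns = frames

ppg_map = ppg_spatiotemporal_map("input.mp4")  # "input.mp4" is a placeholder clip
```

A real system would also track the face between frames and filter the signals to the human heart-rate band before handing the map to a classifier, but the map itself is the core idea.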

Intel engineers built FakeCatcher with several of the company’s own software products, using OpenVINO to run the tool’s deep learning models and OpenCV to process video in real time. FakeCatcher runs on 3rd Gen Intel Xeon Scalable processors, which Intel says lets the tool handle up to 72 concurrent detection streams. Depending on how and where FakeCatcher is deployed, that kind of capacity could help squash conspiracy theories, scams, and maliciously made adult content before they reach wider audiences. (Intel didn’t specify where exactly FakeCatcher will go now that it’s passed testing, but did mention it could find a place on social media, news, and nonprofit websites.)
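
Intel hasn’t released the FakeCatcher model itself, but running a classifier through OpenVINO on a CPU generally looks something like the sketch below. The model file, input shape, and output meaning are all assumptions for illustration; only the OpenVINO calls are real API.

```python
# A minimal OpenVINO inference sketch, not FakeCatcher itself: the model file
# ("ppg_classifier.xml") and the input shape are placeholder assumptions.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("ppg_classifier.xml")   # hypothetical classifier over PPG maps
compiled = core.compile_model(model, "CPU")     # Xeon CPUs are FakeCatcher's stated target

# Stand-in for a real spatiotemporal map (here, 64 face patches x 300 frames).
ppg_map = np.random.rand(1, 64, 300).astype(np.float32)
result = compiled([ppg_map])[compiled.output(0)]  # e.g., probability the clip is real vs. fake
print(result)
```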

FakeCatcher represents Intel’s leap into a growing effort to stop deepfakes in their tracks. Earlier this year, Google banned deepfake training on its Colab service amid a rise in misinformation and heavy computing resource consumption. Just two weeks later, the European Union announced it would take action against Meta, Twitter, and other tech giants if they didn’t make a demonstrable effort to combat deepfakes as well.
