Deepfakes Are Interviewing For Tech Jobs
Bad actors are impersonating other people via deepfakes to weasel their way into remote work positions, according to the FBI’s latest public service announcement. The wrongdoer starts by gathering enough of their target’s personal information to convincingly apply to jobs as that person. Then they acquire a few high-quality photos of the person, either through theft or a bit of casual online sleuthing. When interview time rolls around, the bad actor uses the photos (and sometimes voice spoofing) to create and deploy a deepfake, which can often pass for the target on a video call.
The FBI says job candidate impersonations often involve IT and programming roles, as well as any role that would “include access to customer [personally identifiable information], financial data, corporate IT databases and/or proprietary information.” Such access could be used to steal money from a company directly, manipulate the stock market, release competing products or services, or sell massive amounts of private data. While it’s a little less likely that wrongdoers would want to actually work in their wrongfully won role long-term, there’s also a chance they want to earn US currency from outside the US, or simply enjoy the perks of a role they otherwise couldn’t obtain. Some even wonder if the impersonations could be part of a larger operation threatening national security.
Right now it’s unclear whether job candidate impersonations are ever caught mid-interview. While some deepfakes are strikingly realistic, they’re usually one-directional; few hold up in the conversational two-way street typical of a job interview. Ideally, even the untrained eye would notice something “off” about a deepfake interviewee. But there’s also something to be said for the occasional frazzled recruiter who, desperate to fill a role or ten, might not catch an unsettling visual lag, or might chalk it up to a poor internet connection. In this way, technical prowess and a bit of luck could combine to create the “perfect” criminal opportunity.
While the FBI hasn’t offered specific strategies for wary recruiters, it does vaguely warn of uncoordinated audio and visuals. “In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the PSA reads. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
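The PSA doesn’t prescribe any tooling, but the mismatch it describes can be framed as a signal-alignment problem: lip movement and speech loudness from a live speaker should track each other closely, with almost no lag. Below is a minimal sketch of that idea, assuming per-frame mouth-openness and audio-loudness signals have already been extracted from a recording. The `sync_score` function, the signal names, and the thresholds are all hypothetical illustrations, not anything the FBI or any real screening product specifies.

```python
import numpy as np

def sync_score(mouth_openness, audio_envelope, max_lag_frames=15):
    """Return (best correlation, lag in frames) between the two signals.

    Both arguments are hypothetical per-frame signals at the same rate:
    mouth_openness (e.g., a lip-landmark distance per video frame) and
    audio_envelope (e.g., RMS loudness resampled to the video frame rate).
    """
    # Standardize so the windowed products below approximate a correlation.
    m = (mouth_openness - mouth_openness.mean()) / (mouth_openness.std() + 1e-9)
    a = (audio_envelope - audio_envelope.mean()) / (audio_envelope.std() + 1e-9)
    best_corr, best_lag = -np.inf, 0
    for lag in range(-max_lag_frames, max_lag_frames + 1):
        # Slide the mouth signal against the audio and score the overlap.
        if lag >= 0:
            x, y = m[lag:], a[:len(a) - lag]
        else:
            x, y = m[:len(m) + lag], a[-lag:]
        corr = float(np.mean(x * y))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_corr, best_lag

# Synthetic demo: mouth movement is the audio envelope delayed by ten
# frames plus noise, mimicking a poorly synced face swap.
rng = np.random.default_rng(0)
envelope = np.convolve(rng.random(300), np.ones(10) / 10, mode="same")
mouth = np.roll(envelope, 10) + 0.05 * rng.standard_normal(300)

corr, lag = sync_score(mouth, envelope)
print(f"corr={corr:.2f}, lag={lag} frames")  # expect a peak near lag 10
if corr < 0.4 or abs(lag) > 8:  # thresholds are illustrative, not tuned
    print("possible audio/visual mismatch")
```

This is a toy heuristic, of course; real deepfake detectors rely on far richer cues than loudness alone. But it captures the spirit of the FBI’s observation: when the voice and the face come from different sources, the timing rarely lines up perfectly.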