Human Rights Groups Ask Zoom to Scrap Emotion AI

Zoom has been flirting with the concept of emotion AI ever since the pandemic gave it a second wind. As we touched on last month, tech giant Intel has been working alongside an e-learning software company to produce an emotion-analyzing program that integrates with Zoom. The program would supposedly benefit teachers by telling them when students appear confused or bored, allowing them to tailor their instruction and increase engagement. Protocol similarly reported in April that companies have begun using emotion AI during sales calls to assess potential customers’ moods and adjust their strategy accordingly. Unbeknownst to those customers, each is graded on an “emotion scorecard” throughout the call.
Digital rights non-profit Fight for the Future quickly caught wind of Protocol’s report. So did the American Civil Liberties Union (ACLU), Access Now, Jobs With Justice, and 24 other human rights groups—all of whom signed an open letter to Zoom published Wednesday. The letter asks Zoom founder and CEO Eric Yuan to scrap the company’s plans to introduce emotion AI, saying the technology is punitive, manipulative, discriminatory, rooted in pseudoscience, and a data integrity risk.

“Zoom claims to care about the happiness and security of its users but this invasive technology says otherwise,” the letter reads. “This move to mine users for emotional data points based on the false idea that AI can track and analyze human emotions is a violation of privacy and human rights. Zoom needs to halt plans to advance this feature.”
The open letter is far from the first criticism of emotion AI. Many have said the technology constitutes excessive surveillance, especially when the targeted students or customers don’t know their body language, tone, and other alleged emotional markers are being assessed. Others have warned that emotion AI could produce negative (or simply incorrect) readings of people whose cultures express emotions differently.
The advocacy groups’ letter closes by reminding Yuan that his company has previously “made decisions that center users’ rights,” such as backtracking its decision to implement face-tracking features due to privacy concerns. “This is another opportunity to show you care about your users and your reputation,” the organizations write. “You can make it clear that this technology has no place in video communications.”