Microsoft Halts Sales of Emotion-Reading Tech, Limits Facial Recognition Access

In a blog post published Tuesday, the software giant announced that it would be sunsetting facial analysis tools that claim to identify emotional states and personal attributes such as a person’s gender or age. Previously, these capabilities were freely available within Azure Face API, Computer Vision, and Video Indexer. Customers who already have access to Microsoft’s emotion-reading features will have one year of continued use before their access is revoked.

“We collaborated with internal and external researchers to understand the limitations and potential benefits of this technology and navigate the tradeoffs,” wrote Sarah Bird, Azure AI’s principal group product manager, in the post. “In the case of emotion classification specifically, these efforts raised important questions about privacy, the lack of consensus on a definition of ‘emotions,’ and the inability to generalize the linkage between facial expression and emotional state across use cases, regions, and demographics. API access to capabilities that predict sensitive attributes also opens up a wide range of ways they can be misused—including subjecting people to stereotyping, discrimination, or unfair denial of services.”

Microsoft has also chosen to place restrictions on who will be able to use its facial recognition technology in the future. Going forward, anyone interested in using these tools will need to submit an application detailing their project, after which Microsoft will approve or deny access. Microsoft said it will independently assess the benefits and risks of continuing to use both emotion and facial recognition tools for “controlled accessibility scenarios,” such as its own Seeing AI.

The company’s decision follows a widespread effort to investigate the social implications of emotion and facial recognition tech. Well-known human rights groups have recently called out Zoom for its mood-recognition AI, which rates unsuspecting sales-call recipients on an “emotional scorecard.” Similarly, some are concerned that Intel’s emotion-reading e-learning software will incorrectly target or alienate students deemed “distracted” or “confused.” While these calls for action often go ignored, they sometimes work: earlier this year the IRS dropped its ID.me program, which required users to upload a video selfie to access government services, after receiving near-unanimous backlash. Now Microsoft seems to have joined the ranks of those willing to backpedal; whether to do the right thing or simply to keep customers, who knows.
