Meta Settles with DOJ Over Discriminatory Ad Algorithms

The lawsuit, filed by the Justice Department on behalf of the Assistant Secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development (HUD), alleged that Meta enabled advertisers to target housing ads based on demographic data protected by the Fair Housing Act (FHA). Advertisers were able to selectively show housing ads to users of a certain race, color, religion, sex, disability status, familial status, or national origin. Meta’s marketing tool known as “Special Ad Audience” (previously “Lookalike Audience”) would then use machine learning to determine whether a user was “eligible” to see a housing ad, preventing some users from seeing housing opportunities they might otherwise have pursued.
The complaint argues that Meta has engaged in disparate treatment by distinguishing users based on FHA-protected characteristics, and then designing algorithms that “[affect] Facebook users differently on the basis of their membership in protected classes.”
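To see why this matters mechanically, consider how a lookalike-style model can skew along protected lines even when protected attributes are never used directly. The sketch below is a purely illustrative toy, not Meta’s actual system: the zip codes, group labels, `eligibility_score` function, and threshold are all invented for the example. It shows how a seemingly neutral proxy feature can still make ad eligibility track a protected characteristic.

```python
# Hypothetical sketch (not Meta's actual code) of how a lookalike-style
# eligibility score built only on "neutral" features can still track a
# protected characteristic, because features like zip code act as proxies.

# Toy seed audience the advertiser targeted: concentrated in two zip codes.
seed_zip_counts = {"10001": 80, "10002": 20}
seed_total = sum(seed_zip_counts.values())

def eligibility_score(user_zip):
    """Score a user by how common their zip code is in the seed audience."""
    return seed_zip_counts.get(user_zip, 0) / seed_total

# Candidate users with a protected attribute the model never sees directly.
candidates = [
    {"zip": "10001", "protected_group": "A"},
    {"zip": "10002", "protected_group": "A"},
    {"zip": "94107", "protected_group": "B"},
    {"zip": "94110", "protected_group": "B"},
]

THRESHOLD = 0.1  # arbitrary cutoff for "eligible" to see the ad

for user in candidates:
    eligible = eligibility_score(user["zip"]) >= THRESHOLD
    print(user["zip"], user["protected_group"],
          "eligible" if eligible else "excluded")

# Because zip code correlates with group membership in this toy data,
# every group B user is excluded even though the score never looked at
# the protected attribute itself.
```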

Meta immediately worked with the DOJ to devise a settlement. The agreement, reached Tuesday shortly after the original filing, requires Meta to pay a $115,054 fine and bring its “Special Ad Audience” tool to a screeching halt. Meta has until the end of this year to create a new ad targeting tool. The replacement must “address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads.” Meta and HUD will work together to select a third-party reviewer to verify the new tool’s compliance.
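For a rough sense of what “addressing disparities” could involve, one way to quantify the gap the settlement describes is to compare the demographic makeup of the audience an advertiser targeted with the audience that actually received the ad. The sketch below is a hypothetical illustration, not Meta’s variance reduction system: the toy audiences, the `demographic_shares` and `delivery_disparity` helpers, and the tolerance value are all invented for the example.

```python
# A minimal, hypothetical sketch of the kind of disparity check the settlement
# describes: comparing the demographic makeup of an advertiser's eligible
# audience against the audience the delivery algorithm actually reached.
# The data, attribute names, and threshold below are illustrative only.
from collections import Counter

def demographic_shares(users, attribute):
    """Return each group's share of the audience for one attribute (e.g. 'sex')."""
    counts = Counter(user[attribute] for user in users)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_disparity(eligible_audience, delivered_audience, attribute):
    """Largest gap, across groups, between eligible and delivered shares."""
    eligible = demographic_shares(eligible_audience, attribute)
    delivered = demographic_shares(delivered_audience, attribute)
    groups = set(eligible) | set(delivered)
    return max(abs(eligible.get(g, 0.0) - delivered.get(g, 0.0)) for g in groups)

if __name__ == "__main__":
    # Toy audiences: the advertiser targeted a balanced audience,
    # but delivery skewed toward one group.
    eligible = [{"sex": "female"}] * 500 + [{"sex": "male"}] * 500
    delivered = [{"sex": "female"}] * 200 + [{"sex": "male"}] * 600

    gap = delivery_disparity(eligible, delivered, "sex")
    print(f"Delivery disparity for 'sex': {gap:.2%}")  # 25.00% in this toy case

    # A reviewer might flag any gap above an agreed tolerance for remediation.
    TOLERANCE = 0.10
    print("Within tolerance" if gap <= TOLERANCE else "Exceeds tolerance")
```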
Advertisers responsible for housing ads, however, won’t be able to use this targeting system at all. The settlement prohibits Meta from allowing housing advertisers to selectively show ads to users based on FHA-protected characteristics. Failure to comply with this requirement (or to create a satisfactory replacement tool) will result in continued litigation in federal court.
“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, the Principal Deputy Assistant Secretary for Fair Housing and Equal Opportunity at HUD, in a DOJ statement. “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable.”
The complaint and resulting settlement constitute the DOJ’s first foray into challenging algorithmic bias under the Fair Housing Act. But they won’t be the Department’s last. Last year, Google was caught allowing advertisers to bar non-binary users from seeing job ads. The company quickly pledged to fix the issue, which it called “inadvertent.” But the oversight had already highlighted how supposedly fine-tuned ad targeting can have a significant negative impact on a certain demographic, or worse, be weaponized against members of a particular community.

Amazon Copied Sellers’ Products, Manipulated Algorithms to Display Their Own Versions First
They say imitation is the highest form of flattery, but for businesses that sell on Amazon, it can be the difference between success and ruin.

Twitter’s Internal Research Confirms Its Algorithm Favors Right-Wing Voices
In recent years, conservative politicians and activists have railed against "cancel culture," claiming Twitter and other social media outlets are biased against them; hence the existence of right-wing alternatives like Gab, Parler, and Trump's new Truth Social. Twitter has just published internal research that suggests the opposite: according to the paper, its algorithm actually favors right-wing voices.

Latest iOS Update Employs Nudity-Detecting Algorithms in Children’s Version of Messenger
The feature rolled out to iPhones and iPads with iOS 15.2, which became available Monday.

MIT Researchers Say All Network Congestion Algorithms Are Unfair
Legitimate network management has to go beyond penalizing people for using more data, but researchers from MIT say the algorithms that are supposed to do that don't work as well as we thought.