Facebook Force-Fed Garbage to 140 Million Americans a Month

An internal Facebook report that leaked this week shed light on the concrete steps the company was willing to take in the aftermath of the 2016 US presidential election. While Facebook promised significant changes to its moderation and fake-content detection algorithms after Russian propagandists and other bad actors swamped the service, this 2019 report shows just how little those promises were worth.

The report’s author, data scientist Jeff Allen, left Facebook not long after compiling this document — his fourth in a series — because he felt his findings had been “effectively ignored” by the company, according to Technology Review.

The 20-page paper lays out the issues in detail. As of October 2019, 15,000 Facebook pages with majority-US audiences were being run from countries like Macedonia and Kosovo. These include the largest Christian American page on Facebook (20 times larger than its closest competitor), the largest African American page on Facebook, the second-largest Native American page, and the fifth-largest women’s page. All told, 10 of the top 15 African American pages, all 15 of the top Christian pages, and four of the top 12 Native American pages were being run by troll farms whose behavior and tactics often echoed those of the Russian disinformation organization, the Internet Research Agency.

In all of these categories, some or all of the loudest voices were not part of the communities they claimed to serve and had no interest in the well-being of people whose beliefs they claimed to share.

The problem with these pages was not specifically their content; the report notes that “The content the Troll Farms post wouldn’t be classified as violating, or even borderline. A lot of it is pretty entertaining.” What concerned Allen, if not Facebook, was the fact that known bad actors had built huge audiences on Facebook by pretending to be something they weren’t.

The content these farms posted at the time was overwhelmingly low-quality clickbait and engagement bait. The troll farms never produced independent content, Allen alleges, because attempting to create original content for communities the operators didn’t engage with would risk exposing them as bad actors. Troll farms dealt with this weakness by rampantly stealing content and monetizing it every way they could, often to the detriment of the pages and people who had actually created it.

The fact that Facebook is lousy with bad actors, bad faith, and bad memes isn’t exactly a surprise, and it’s tempting to dismiss the issue with a shrug. Facebook, after all, isn’t responsible for vetting the intelligence of everything people post online.

In this case, however, Facebook can’t weasel out of responsibility. One of the major findings of this report was that the vast majority of people who encountered this garbage content weren’t seeking it out. The reason these groups had such enormous reach is that Facebook’s algorithm actively promoted them:

It is important to note that 75 percent of their reach is non-direct. This means that a big majority of their ability to reach our users comes from the structure of our platform and our ranking algorithms, rather than user choice. Instead of users choosing to receive content from these actors, it is our platform that is choosing to give them an enormous reach. (Emphasis original)

Facebook knew its feed ranking tended to promote and amplify bad, low-quality content. It was common for troll operations to re-post the same content over and over again, watching it go viral each time. Facebook employees knew there were real issues with IP theft on the platform, and that people who created content often missed out on any earnings associated with it. Allen writes, “We currently do not enforce our own policies at the Page level, even towards known bad actors.”

Facebook’s Traffic Is Built on a Mountain of Bullshit

As much as 40 percent of US VPVs (clearly a page-view metric in context, though the report never spells out the acronym) went to pages with transparency issues and little to no original content. Sixty percent of total video watch time went to video aggregators. The document estimates that 50 percent of reshare impressions “could be going to actors abusing the reshare mechanism of our platform.” In total, this accounted for ~25 percent of all Pages’ VPVs.

As for the impact all this fraud has on Facebook’s users, the document doesn’t mince words:

One camp doesn’t realize the Pages are run by inauthentic actors who are exploiting their communities. They tend to love these Pages. They like how entertaining the posts are, and how they reaffirm their already held beliefs.

The other camp does realize the Pages are run by inauthentic actors. They hate the every loving shit out of these Pages. They hate these Pages with a passion that even I find impossible to match.

So what’s changed since 2019? Some of the troll farms discussed in the report no longer appear to be operating, but at least five still are, including three targeting African Americans, one targeting Christians, and one targeting Native Americans. It isn’t clear whether new troll operations have replaced the old, or whether Facebook actually cracked down on these groups in a meaningful way, more than three years after promising to do better.

“More” Is the Only Value Facebook Recognizes

Much virtual ink has been spilled on the question of whether Facebook is biased against conservatives or liberals. Regardless of one’s opinion on that topic, internal documents like this illustrate that Facebook’s most profound bias is not political. It’s towards engagement itself. That is not surprising in a social networking company, but Facebook does not simply value engagement. It prioritizes the concept to the exclusion of all others.

Facebook and Zuckerberg continue to believe that any engagement is good engagement and that “connecting” people is a supreme moral value even when those connections are bereft of meaning. When you believe all connection is good, it’s much easier to justify shoving garbage content, disinformation, and misinformation down the throats of 140 million Americans a month.

If you read Allen’s report, you’ll see he had ideas for how Facebook could address these problems. While there are no magic bullets, he proposed concrete, specific ways Facebook could enforce its own rules against content theft and the endless republishing of the same low-quality dreck, and address the fact that trolls have built their websites and Facebook pages specifically to capitalize on Facebook’s refusal to confront these problems.

A separate, recent report from the Wall Street Journal confirms that Facebook knew in 2018 that its then-recent algorithm changes were making people angrier and creating higher levels of conflict. At the time, Facebook had announced it would show people less news produced by professional sources and more content re-shared by their family and friends. This was, to Facebook’s credit, intended to tamp down the amount of anger and toxicity on the platform.

It had the opposite effect. Political parties and news organizations alike contacted Facebook to say they were shifting their articles and platform statements towards outrage and negativity, to capitalize on the fact that this was the material Facebook was putting in front of people. Facebook staff wrote internal memos stating that “Misinformation, toxicity, and violent content are inordinately prevalent among reshares.”

At least some Facebook staff wanted to make changes, but the site had a problem: People weren’t engaging with it as much, or as frequently. In 2018, Zuckerberg specifically resisted changes that would have reduced the amount of misinformation and garbage content on Facebook, because he was unwilling to take any action that might reduce the time people spent interacting with the service, regardless of whether that interaction was positive. Facebook finally began evaluating potential changes more than 18 months later, but the company’s lack of transparency means the impact of whatever changes it has made since is still unclear.

It Never Ends

Reading these reports, I was reminded of an old Simpsons episode.

In “Treehouse of Horror IV,” Homer sells his soul to the Devil (played by Ned Flanders) for a doughnut. He then forgets not to finish eating the doughnut and is summarily dragged to Hell. One of his torments, courtesy of the Ironic Punishment Division, is being forced to consume an absolutely Sisyphean supply of doughnuts. Homer being Homer, he doesn’t even mind.

“More,” he intones, in the brief moments he can speak. “More.” “More.” “More.”

What played as a joke in 1993 perfectly captures Facebook’s goals in 2021. What do we need, in Facebook’s eyes? More. More, and more, and more, and then yet more on top of that. There will never come a day when we stop finding out about the awful things Facebook has done, because “more” is not actually a moral value, despite Zuck’s claims to the contrary.

These reports show that there are people at Facebook who value high-quality news, meaningful social connections, fan communities, and local business and event pages. There are people at the company who want to create a better social network, not just a larger one. The problem is, none of those people are in charge, and nothing is as important to Facebook, as an institution, as the concept of “more.”

Do your jaws ache? More. Tired of swallowing? More. Does your Facebook feed depress you? More. Does one of your social networks have a deleterious effect on mental health? Make sure to wring your hands over that one, but don’t let it stand in the way of developing a version of the same product explicitly aimed at children. The cure for social media, according to Facebook, is more social media introduced at a younger age. Individuals are responsible for their own social media usage, but that doesn’t change the fact that Facebook has repeatedly and deliberately embraced tactics that made all of these problems much worse.

The dip in Facebook engagement back in 2018 may have been the result of people voluntarily reducing the amount of time they spent on the social network. Facebook’s response to this wasn’t to improve its product or make it friendlier. Instead, the company deliberately chose to continue stoking rage, fear, and unhappiness, even once it was aware its changes had produced these effects.

“More,” to Facebook, means buying account data on people who are not its users, for the purpose of building shadow profiles that cannot be deleted. It means selling and sharing data with companies like Cambridge Analytica, with no regard for the consequences. It means abetting genocide when convenient and ignoring the titanic amount of misinformation sloshing around in the non-English side of its business. It means faux apologies and Congressional appearances, but the one thing it never means is Zuckerberg or Facebook itself taking the slightest pause to seriously consider the consequences of its own actions.

Why did Facebook fight all attempts to understand the role its lax policies played in allowing Russian disinformation to propagate prior to the 2016 election? Because admitting these failures could have hampered growth, and Facebook recognizes no higher moral value. Its decision to exempt 5.7 million people from content moderation worldwide stems from the same unethical precept. If punishing a person on Facebook presents even the smallest risk to Facebook’s traffic, that person is not moderated. Every time Zuck has a chance to choose between what’s right for users and what’s likely to grow Facebook in the short term, he chooses the latter — even when the end result is unhappier users, more misinformation, and a worse product experience.

More is the only thing we can count on Facebook to deliver. More privacy problems. More content moderation failures. More unwillingness to engage with the reality it has created, or to even acknowledge that it shapes that reality far more than it likes to admit in public.

The one thing I know for certain is that I’ll be writing another story like this in a month, or two, or 10. Facebook promises a lot of things that aren’t worth the paper they’re not written on. But when it comes to “more,” the company’s declarations are practically ironclad.
