Facebook Shuttered Team Researching Social Media Addiction

But “had” is the key word here. According to internal documents reviewed by the Wall Street Journal, Facebook shut the team down in 2019.

Worse, the team appears to have been on the cusp of making major recommendations that could have benefited users, at the cost of time spent on the platform, which is crucial to Facebook’s ability to earn revenue. Before the team was disbanded, its researchers surveyed 20,000 users and found that one in eight engaged in “problematic use” of Facebook. Problematic use, they said, produced a variety of negative effects on key aspects of users’ lives. Some users reported a loss of productivity, while others said late-night scrolling and disturbing content disrupted their sleep. Users also reported deterioration in their interpersonal relationships; some parents even avoided their children in favor of spending more time online. One user missed a family member’s wedding because they were watching a video on Facebook. Another said it was common for them to browse the app until 2 AM, making it difficult to wake up feeling rested the next morning.

Anyone familiar with the scientific method will tell you that correlation isn’t causation, as will Facebook itself: its parent company, Meta, disputes the WSJ’s interpretation of its research. But the correlation between Facebook and unhealthy social media use isn’t exactly reassuring. Nor is it comforting in light of Facebook’s recent rebrand, through which Zuckerberg has publicly aimed to blur the line between the “real” world and the virtual one by building out a metaverse.

Facebook’s internal documents reveal that the company knew its platform was more frequently associated with addictive use than other virtual experiences, including Reddit, YouTube, and World of Warcraft. Whistleblower Frances Haugen spoke just last month about how Facebook is designed to reward controversial (and sometimes downright hateful) content because its algorithms favor engagement above all else. One of Facebook’s subsidiaries, Instagram, was also found this year to have a uniquely harmful impact on its users thanks to its algorithms and user interface.

Despite this, the company has made only half-hearted attempts to address these issues. It added a time-management tool to its mobile app in 2018 and a “quiet mode” that muted push notifications in 2020. But the latter feature was buried in the app’s settings, and Facebook’s algorithms still push unsavory content to the top of users’ news feeds. Facebook also recently quashed an outside attempt to help people curb their overuse of the app, so it’s unlikely we’ll see any real strides toward user well-being in the near future.
