No matter how bad conditions appear on the surface, it's even more treacherous below.
That's the disturbing takeaway from the Facebook Papers, a collection of internal Facebook documents leaked by whistleblower Frances Haugen and reviewed by 17 news organizations. Their stories paint a picture of a company broken beyond repair, one that, despite scandal after scandal, still has the power to shock.
A small taste of that reporting, summed up below, proves just how bad the situation truly is.
The people who work at Facebook are not a monolith, and the Atlantic reports that company documents show some employees calling out real-world harm caused by the platform — only to be brushed aside by higher-ups.
"How are we expected to ignore when leadership overrides research based policy decisions to better serve people like the groups inciting violence today," a Facebook staffer wrote in the fallout of the Jan. 6 attack on the U.S. Capitol. "Rank and file workers have done their part to identify changes to improve our platform but have been actively held back. "
Facebook CEO Mark Zuckerberg doesn't want to be in the business of censoring political speech, he has repeatedly insisted. And yet, according to the Washington Post, he's personally done just that when it suits his company's bottom line.
The Post highlights a particularly nasty example of the CEO's duplicity in Vietnam, where, according to people familiar with the decision, Zuckerberg himself made the call to censor anti-government posts on behalf of the ruling Communist Party in 2020.
Vietnam is an important market for Facebook. A 2018 Amnesty International estimate found Facebook earned approximately $1 billion in annual revenue from the country.
That the Facebook algorithm amplifies divisive content is now a widely understood fact. Even so, the horrific nature of that content still has the power to shock even Facebook's own researchers.
"On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India," reports the New York Times. "For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook's algorithms to join groups, watch videos and explore new pages on the site."
According to internal Facebook documents, the experiment laid bare just how skewed Facebook's recommendation systems are.
"Following this test user's News Feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total," wrote the Facebook researcher.
Reporting has shown that Zuckerberg feared the wrath of Facebook's conservative users, and thus would personally intervene on behalf of right-wing pundits and publishers. Leaked documents contained in the Facebook Papers and highlighted by Politico show that even Facebook's own researchers were aware of this, and repeatedly called it out internally.
"Facebook routinely makes exceptions for powerful actors when enforcing content policy," wrote a Facebook data scientist in a 2020 internal presentation titled Political Influences on Content Policy. "The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies."
Notably, as Politico points out, the Public Policy team referred to by the researcher includes Facebook lobbyists.
What's more, Facebook researchers confirmed that Zuckerberg himself often got involved in deciding whether a post should stay or go — suggesting a two-tier system of enforcement dependent on unwritten rules.
In multiple cases the final judgement about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg. If our decisions are intended to be an application of a written policy then it's unclear why executives would be consulted. If instead there was an unwritten aspect to our policies, namely to protect sensitive constituencies, then it's natural that we would like executives to have final decision-making power.
Human traffickers have used Facebook's tools to power their work. As CNN reports, a 2020 internal Facebook document made clear that Facebook was long aware of this fact.
"[Our] platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks," reads the internal Facebook report in part.
And yet, while human trafficking has long been explicitly banned on Facebook, it took Apple threatening to boot Facebook and Instagram from the Apple App Store in 2019 for Facebook to muster the type of response one might have expected much earlier.
"Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB," reads the document reviewed by CNN. "To mitigate against this risk, we formed part of a large working group operating around the clock to develop and implement our response strategy."
Importantly, Apple wasn't the first to bring the issue to Facebook's attention.
"Was this issue known to Facbeook [sic] before the BBC enquiry and Apple escalation?" the internal Facebook report asks. "Yes."