Facebook has had many bad days and months, but the company is now facing yet another public disgrace. Everyone’s talking about the Facebook Papers, and we’re here to summarize them for you, so you don’t have to spend days of your life reading. Let’s dive right in.
What are the Facebook Papers?
The Facebook Papers are a set of internal documents that former Facebook employee turned whistleblower Frances Haugen obtained before leaving the company. She submitted these documents to the Securities and Exchange Commission (SEC) earlier this year, and they are now available to a consortium of news outlets.
Who’s Frances Haugen and what does she think about Facebook?
Haugen, 37, was a product manager at Facebook and part of its Civic Integrity group, which worked on risks to elections, including misinformation and bot accounts. She left the company in May, but not before collecting the trove of internal documents now known as the Facebook Papers.
In her interview with 60 Minutes, she said: “Facebook, over and over again, has shown it chooses profit over safety.”
The Facebook Papers reveal a lot about how the company thinks about, and deals with, dwindling user numbers, misinformation, and its own image.
Below are some of the big problem areas for Facebook. We’ll keep updating the list as new details emerge.
Facebook, teens, and mental health
In the past decade, the number of teenage users on Facebook has steadily decreased. According to one report, a recent internal study revealed that time spent by teens on the platform has declined 16% year-over-year. These days, Facebook is more popular among baby boomers.
Fewer teens are signing up for the service. That’s not surprising, given the growth of rival platforms such as TikTok.
A Wall Street Journal report published in September said the company ignored Instagram’s harmful impact on teens, particularly with regard to self-esteem and body image.
After these reports, Nick Clegg, Facebook’s VP of global affairs, went on to defend the firm and said that it’s working on a slew of new features — including a ‘take a break’ warning — for teen safety.
Facebook has failed to moderate hateful content in different countries
While Facebook’s base lies in the US, its biggest audiences are in countries like India and Brazil. As Casey Newton of Platformer noted, these countries are placed in ‘tier zero’ — meaning they are a high priority for the Civic Integrity group, formed in 2019 to monitor election interference.
But that doesn’t guarantee success. One article showed that the company struggles with problems like hate speech, misinformation, and the celebration of violence in India.
The report noted that the infestation of bots and fake accounts had a massive impact on the country’s national elections held in 2019.
A lack of budget allocation where it’s needed
The NYT report said that, shockingly, 87% of the company’s budget to combat misinformation is allocated to the US, while the rest of the world has to make do with the remaining 13%.
This creates a massive resource crunch for monitoring countries that communicate in languages other than English.
In countries like Myanmar, which held its national elections last November, Facebook deployed tools to fight fake news.
The situation in other countries, which are slotted into tier three, is more dire. In Ethiopia, the company reportedly knew that Facebook was being used to incite violence.
Facebook’s algorithm mishap
In her revelations, Haugen said that Facebook’s 2018 algorithm change was the culprit in inciting hate among its users. When it was rolled out, Zuckerberg said it was meant to increase interactions between friends and family.
However, the change backfired: feeds became angrier, resulting in increased toxicity and misinformation.
Make positive stories visible
As a way to combat its algorithm failures and improve its image, Facebook began increasing the visibility of positive stories about itself after a meeting in January. The social network also began to distance Zuckerberg from controversial topics such as vaccine misinformation.
In 2018, the company even ran an experiment in which it turned off its algorithm. That, too, led to worse experiences for many people, with less engagement and more ads.
Employees are unhappy with the direction
Haugen is probably one of the prime examples of how Facebook employees are unhappy with the way the social network operates. In her interview, she said that the situation at the firm was substantially worse than “anything I’d seen before.”
One report noted that the firm ignored employees’ warnings about pages and groups run by global drug cartels and human traffickers.
“We’re FB, not some naive startup. With the unprecedented resources we have, we should do better,” said an employee after the Capitol riots in the US in January, according to one report.
The story has numerous quotes from former and current Facebookers, who were angry about the way the firm was handling misinformation across the board.
Another report echoed what some former employees have said before: the company focuses heavily on engagement.
It also highlighted concerns about the critical teams that engage with misinformation directly, such as the content policy team.
While the company agreed earlier this year to keep King Zuck out of sight on controversial issues, this scandal is too big to ignore.
On the company’s quarterly earnings call, Zuckerberg addressed the issue and said the media is misinformed about the firm:
Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company.
Want to find out more?
This is by no means an exhaustive list of the issues uncovered by the Facebook Papers. So here’s a reading list to help you dive deeper:
- Where it started: the Wall Street Journal’s original reporting.
- Gizmodo’s reporting on Facebook.
- How Facebook is facing huge problems with moderation.
- How US lawmakers are keeping tabs on Facebook and its impact on the world.
- Politico’s report on how little the company did.