Facebook's algorithms gave an "angry" reaction five times the weight of a traditional "like." After years of research, employees realized the result was more "toxicity."
In late 2017, Facebook was in the final stages of preparing for a major change in how it ranked posts and comments, and at least one employee had a lingering concern: what to do about the emojis.
The plan was to give emoji reactions such as “love” and “angry” five times the weight of a traditional “like” in Facebook’s secret formula, according to an internal company document. That would make content that elicited those reactions far more common in the news feeds of Facebook’s gigantic user base.
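The weighting described above can be pictured as a simple scoring sum. This is only an illustrative sketch: the 1x-versus-5x ratio comes from the reporting, but the function name, data layout and everything else here are hypothetical, not Facebook's actual formula.

```python
# Hypothetical sketch of the 2017-era weighting described above.
# Only the 1x/5x ratio is from the reporting; names are illustrative.

REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "haha": 5,
    "wow": 5,
    "sad": 5,
    "angry": 5,
}

def engagement_score(reactions: dict) -> int:
    """Sum each reaction count multiplied by its weight."""
    return sum(REACTION_WEIGHTS.get(name, 0) * count
               for name, count in reactions.items())

# A mildly liked post versus a post that provokes anger:
boring = engagement_score({"like": 10})     # -> 10
divisive = engagement_score({"angry": 10})  # -> 50
```

Under this scheme, ten angry faces would count five times as much as ten likes, which is exactly the distribution concern the employee raises below.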
The employee quickly foresaw what a problem that could cause.
“Quick question to play devil’s advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?” the employee asked on an internal message board.
“I.e. if I post a story that I bought a coffee (pretty boring example I know) I might invite a few Likes from friends. However, if I post ‘Steve Bannon Punches Hillary’ I’ll probably get more polarized reactions with Angry emojis and thus (5x?) more distribution,” the person wrote.
The response from colleagues: It’s possible, but the company knew about the potential problem and was working hard not to promote “engagement bait.”
It turned out that the first employee’s concern was prescient and that the angry cartoon faces would have more influence on Facebook’s billions of users than others expected, company documents show. But it took years for Facebook to realize how right the person was, and when it did, it changed course.
The story of Facebook’s emoji reactions illustrates how the company has come to operate — sometimes at breakneck pace and other times with caution. It’s capable of using even small software tweaks to dramatically change what people see online and is persistent in testing the resulting impact. The story also demonstrates how complex Facebook has become, both as a social media app and as a corporation with a large research staff.
Documents describing the study of emoji reactions were included in disclosures made to the Securities and Exchange Commission and provided to Congress by legal counsel for Frances Haugen, who worked as a Facebook product manager until May and has come forward as a whistleblower.
Haugen’s legal counsel redacted most names of Facebook employees. The Washington Post reported earlier on the internal debate.
While the documents aren’t exhaustive, the ones that have been released illustrate the evolution of one of Facebook’s most important tools — its ranking algorithm — through the words of its own employees.
Reaction buttons are important beyond Facebook. For years, online social media companies have struggled with how to design buttons that don’t encourage toxic behavior or whether to have them at all. Reddit has had “upvotes” and “downvotes” since its earliest days in 2005. Twitter has tried different approaches for its “like” button, which predated Facebook’s. A designer behind Facebook’s “like” button has since expressed misgivings, and Instagram now allows users to hide their like counts. Nextdoor also has reaction emojis.
Facebook’s emoji saga began in February 2016 when the company redesigned the “like” button to include five more ways to react to a post: “love,” “haha,” “wow,” “sad” and “angry.” (Yet another emoji reaction, “yay,” was considered but didn’t make the cut.)
The emojis took on renewed importance two years later when Facebook announced a new ranking system to determine which posts people saw and in which order. The system gave equal weight to the five emoji reactions, and that was true regardless of the context or the intention of the person reacting, so a conspiracy theory with a lot of “haha” reactions or a violent image with a lot of “angry” reactions would potentially get an extra boost from the algorithm.
The more emoji reactions a post drew, whether angry faces or hearts showing love, the more significance Facebook's computers would assign to it.
Facebook spokesperson Drew Pusateri said this week that Facebook tested various versions of the ranking system — known as Meaningful Social Interactions — before launch and conducted significant research.
“The goal of Meaningful Social Interactions is to improve people’s experience by prioritizing posts that inspire interactions between family and friends, and that goal remains unchanged,” he said in an email. “The formula for attaining the goal of MSI is continually updated and refined based on new research and direct feedback from users.”
The documents from Haugen picked up the story again in April 2019, when internal researchers expressed an inkling that something was wrong.
“I’ve been collecting evidence around how anger reactions, overall, is weaponized by political figures and create negative incentives on the platform,” an employee wrote on an internal message board.
There wasn’t much evidence yet, he and colleagues wrote, but their discussion marked a renewed interest in the impact that emoji reactions had.
“Anger reaction, reshares, and crappy comments — the three things I’ve been thinking about for a while now!” an employee wrote in the same thread, describing three possible signals for identifying toxic posts.
Someone in the thread urged more study to determine whether there really were disturbing correlations between toxic content and certain emojis: “I think a key research project would be: who is an angry engager on Facebook, and why are they doing it? Is it a frequent part of their FB participation, or are there archetypes of users who angry-engage with content?”
By November 2019, more research had come in, and it wasn’t looking good. Not all emojis were created equal, it turned out, and employees began thinking they should change the reaction weights so they weren’t all the same.
“We find that angrys, hahas, wows seem more frequent on civic low quality news, civic misinfo, civic toxicity, health misinfo, and health antivax content, than on other civic and health content,” a study said that month.
At least one researcher expressed concern that even CEO Mark Zuckerberg might not understand what the “angry” emoji meant. When a Facebook user urged the company to create a “dislike” button, Zuckerberg responded in a public comment on the site, “You can use the angry face.”
But the researcher wrote in a December 2019 report that the angry face was, in fact, the opposite of a “dislike” button, because it could deliver more of the same: “Indeed, even Mark himself has suggested that the anger reaction is a reasonable way to express that you don’t like a piece of content, even as we currently count it as 4x as important as a less ambiguous ‘like’ for giving you more such content.”
It’s not clear from Zuckerberg’s years-old remark whether he was being sincere, flippant or something else. The company declined to comment on it this week, and over the years, Facebook has generally not discussed the emoji reactions except to announce new ones and to respond to Haugen’s disclosures.
By January 2020, documents show, Facebook had decided that the emojis overall should get less weight in the ranking algorithm, but the weight for each remained equal. When a researcher on a message board asked why, another responded that there were strong arguments on both sides but that “the voice of caution won out” pending further study.
“The caution is indeed about imposing judgement on different emotions’ values,” the second researcher wrote.
But on another front, Facebook was coming around to the idea that users didn’t have enough emotional options for their reactions. In April, it added a seventh emoji reaction, “care,” which could be especially useful during the coronavirus pandemic.
Months later, a proposal to stop giving any weight to the angry face reaction was circulating.
“We see that anger and haha reactions are highly prevalent on misinfo and toxicity,” an internal report from July 2020 said.
“The number of Hahas a Civic post has received is a strongly negative predictor of whether the post’s viewers consider it trustworthy, important, or good for their community. Nevertheless, Haha is the most common reaction on Civic content,” researchers wrote in the report.
Documents show that in September 2020, Facebook approved an overhaul. The angry face emoji would no longer count for anything in the feed ranking; “wow,” “sorry” and “haha” reactions would count somewhat, with a weight of “1” in the formula, and “love” and “care” reactions would count more, with a weight of “2.”
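The September 2020 overhaul can be sketched the same way. The weights here (angry = 0; wow, sorry and haha = 1; love and care = 2) come from the documents described above, but the code itself is a hypothetical illustration, not Facebook's implementation.

```python
# Hypothetical sketch of the September 2020 weights described above.
# The 0/1/2 weights are from the documents; names are illustrative.

WEIGHTS_2020 = {
    "angry": 0,   # no longer counts toward ranking
    "wow": 1,
    "sorry": 1,
    "haha": 1,
    "love": 2,
    "care": 2,
}

def reaction_signal(reactions: dict) -> int:
    """Weighted sum of reaction counts under the revised formula."""
    return sum(WEIGHTS_2020.get(name, 0) * count
               for name, count in reactions.items())

# An angry pile-on now contributes nothing to the ranking signal:
reaction_signal({"angry": 100})          # -> 0
reaction_signal({"love": 3, "haha": 2})  # -> 8
```

The notable design choice is that zeroing a weight, rather than removing the button, lets users keep expressing anger while the algorithm simply stops amplifying it.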
“At this point we don’t expect that we need to restate our goals, but we can discuss if the metric jumps are really big,” the internal announcement said.
In comments on the announcement, employees were turning attention to the next problems to tackle, such as spammy photos for which an author asks for an “OK” or a “Yes” comment to get a discount code.
“Engagement bait comments are a problem!” an employee wrote.