Earlier this month, Facebook’s founder and CEO Mark Zuckerberg posted a short video on Instagram of him with his wife Priscilla Chan serenely enjoying a sailing trip with friends.
The intimate clip was seemingly part of a new PR strategy to distance the 37-year-old from his company's mounting scandals and associate him with Facebook’s futuristic gadgets, such as the video-shooting sunglasses he filmed the nautical scene on.
Yet within 48 hours of the sailing post, that strategy was sunk as a former Facebook executive, Frances Haugen, dramatically unmasked herself as the whistleblower behind a devastating series of leaks that have since plunged the social media giant into its deepest crisis to date.
After going public, Ms Haugen called on Facebook to “declare moral bankruptcy” over the way it had prioritised “growth at all costs” when addressing the harms its addictive algorithms are wreaking on users.
Now in her first UK interview, Ms Haugen, who is coming to London to give evidence before Parliament on Monday, is warning that Facebook’s controversial encryption plans mean it will lose sight of espionage operations being perpetrated on its apps by hostile nations.
Speaking to the Sunday Telegraph she also expressed concern about the effect Facebook-owned Instagram may have had on the British schoolgirl Molly Russell, who took her life after viewing self-harm and suicide material on the app.
Ms Haugen has taken issue with her former Facebook boss Sir Nick Clegg, likening his argument that users bear some responsibility for the problematic material Facebook shows them to a domestic abuser blaming their victim for provoking the violence.
Ms Haugen, who was responsible for detecting and disrupting espionage operations during her time at Facebook, tackled efforts by Chinese operatives to implant spying software onto the phones of Uighur dissidents.
She warned that Facebook will not be able to uncover such operations if it goes ahead with its controversial plans to encrypt its Messenger app as well as Instagram’s direct messages – meaning not even the company will be able to see what users are sending.
Describing her work, Ms Haugen said: “A key part of [Chinese operatives’] strategy was to send malware to Uighurs who lived in places that weren’t China, as if they could compromise one phone they could compromise a whole community. We said we won’t be able to see the malware anymore [with encryption].”
Her comments come as Facebook is facing mounting pressure from the UK Government to ditch its encryption plan.
Earlier this year, Priti Patel, the Home Secretary, criticised the proposals saying they would allow harmful material such as child abuse images to “proliferate” on Facebook’s apps.
Ms Haugen warned that Facebook’s encryption plans, which the company argues are to enhance privacy, are in reality an attempt to hide the harmful material circulating on its apps rather than tackle it.
She said: “End-to-end encryption definitely lets them sidestep and go ‘look we can’t see it, not our problem’.”
A Facebook spokesman said: “The reason we believe in end-to-end encryption is precisely so that we can keep people safe, including from foreign interference and surveillance as well as hackers and criminals.
“We will maintain the ability to receive user reports of suspicious messages, which will help indicate if there is a coordinated abuse of our services.”
In her interview, Ms Haugen expressed concern about the impact Facebook algorithms within its Instagram app may have had on British teenager Molly Russell, who took her life just six days before her 15th birthday in 2017.
Her father, Ian, later accused Instagram of “helping to kill” his daughter after he discovered she had been viewing self-harm and suicide material on the app.
An inquest is due to be held next year to establish whether social media algorithms “overwhelmed” the teenager with the content they showed her.
Ms Haugen said she recognised the Russell family’s fears about the impact Instagram may have had on their daughter and warned its algorithms may have started showing her self-harm material even without the teenager directly searching for it.
She said: “In the case of Molly Russell, she may have been feeling kind of blue and might have followed some stuff related to feeling a little blue. I guarantee you that, with the algorithm, if she kept engaging, it just kept getting worse.”
“Imagine you are kind of a fragile teen and you are being exposed to a little bit of stuff talking about how you are worthless, and then [you] engage a little bit and it keeps getting worse and worse. It is bad.”
A Facebook spokesman said: “We’ve never allowed content that promotes or encourages suicide, self-harm or eating disorders, and in the last few years we’ve updated our policies to ban even more content, including all graphic imagery.”
Ms Haugen, who holds an MBA from Harvard, joined Facebook in 2019 after a high-flying career in Silicon Valley that saw her work at Google as well as the social media site Pinterest and the business review app Yelp.
During her career she had worked on those companies’ algorithms – which decide what users see in their apps – impacting everything from which websites come top of search results to the order of posts in users’ social media feeds.
The 37-year-old initially joined Facebook to work as the lead product manager countering misinformation ahead of the 2020 Presidential election before moving onto counter-espionage operations.
However, during her two years at the company Ms Haugen said she became increasingly disheartened at what she saw as Facebook repeatedly choosing to put its vast profits above fixing the harms its algorithms were causing.
Her main concern is over the way Facebook uses “engagement-based ranking” to choose what people see in their newsfeeds. In layman's terms, this means its algorithms select videos and posts that it deems a user is most likely to engage with by liking or sharing them with their friends.
The danger with this system, Ms Haugen argues, is that the posts people are most likely to engage with are ones that elicit an “extreme reaction” such as conspiracies and hate speech, leading algorithms to show people increasingly dangerous content.
To back up her concerns, the former executive has released pages of documents of Facebook’s own research that lay bare the impact engagement-based ranking can have on users.
Among the research, according to Ms Haugen, is one investigation showing how the algorithm can lead people from being shown innocuous content to far more harmful material, such as searches for healthy recipes leading to the algorithm eventually recommending eating disorder posts.
Other documents leaked by Ms Haugen earlier this month showed research indicating that Instagram can be particularly toxic for teenage girls. Internal papers found the app made one in three feel worse about their bodies and that one in seven teenage girls with suicidal thoughts traced their origin to using the app.
More details from the cache of research documents are due to be released later this week as they are pored over by an international coalition of journalists.
Facebook has argued that the leaked findings are not representative of how Instagram affects young users and that its research also showed the app helps teenagers struggling with anxiety and loneliness.
Ms Haugen’s bombshell intervention comes as the UK Government is drawing up legislation to impose a statutory Duty of Care on tech companies to prevent their algorithms harming people, a measure the Sunday Telegraph has campaigned for since 2018.
The regime, which will be policed by Ofcom, will see tech companies face fines that could run into the billions if they breach the duty or even be barred from the UK in the most serious cases.
Ms Haugen backed the Sunday Telegraph’s Duty of Care campaign saying tech companies should be legally “responsible for the decisions they make”.
She added: “If they intentionally pick their algorithms, they should be responsible for their algorithms.”
As Ms Haugen arrives in London, Westminster is also debating whether to tighten online laws in the wake of the death of MP David Amess.
Boris Johnson this week said he is considering allowing criminal prosecutions of executives at tech companies that allow “foul content to permeate the internet”.
Ms Haugen argued that the spectre of criminal prosecution for tech bosses would make them take regulation “more seriously”.
“I do think when you have higher consequences people do take things more seriously and personalising those effects mean people take personal sanctions even more seriously,” she said.
Yet, Ms Haugen warned that simply banning certain content would not stop extremist and conspiratorial material circulating on social media unless the algorithms spreading it are checked.
During her time at Facebook, she said she watched its algorithms “concentrate” misinformation and hate content on small groups of users who engaged with it, meaning that in effect around 80 per cent of all Covid misinformation went to fewer than five per cent of the app’s users.
She said: “The real problem and the thing that leads to extremism is when a small group of people overwhelmingly get hate content.”
Facebook has hit back at Haugen’s accusation that it put its profits before its users' safety, with Mr Zuckerberg saying in a recent blog post the characterisation “just isn’t true”.
The CEO said the idea that Facebook actively promotes harmful content is “illogical” as its advertisers don’t want to see their ads next to it.
Former Deputy Prime Minister Nick Clegg, who is now Facebook’s head of policy and communications, has also previously defended how the company’s algorithms work. In a blog post in March entitled “It Takes Two to Tango”, Sir Nick argued Facebook users were not “the playthings of manipulative algorithmic systems”, but actively influence what the app shows them by deciding what they click on.
Ms Haugen rejected Mr Clegg’s argument.
She said: “That article, it is almost like a domestic abuser saying ‘if you didn’t make me so angry I wouldn’t hit you so much’.
“He said it takes two to tango, but no. If you go and follow pretty innocuous things just following the algorithm you get led to more and more extreme things because engagement based ranking is dangerous.”