Facebook whistleblower Frances Haugen: “I wouldn’t wish Mark Zuckerberg’s life on anyone”

In 2021 she shone a light on misinformation and online harm. Now she’s “extremely worried” about Big Tech’s impact on the 2024 US election.

By Rachel Cunliffe

“There are no genocides caused by Pinterest that I know of,” the tech whistleblower Frances Haugen told me recently. Nor can you point the finger at the crowd-sourced review platform Yelp. “I’ve worked at Yelp – the reviews can get pretty savage, but I don’t think anyone dies over a Yelp review.”

In 2021 Haugen, then a product manager at Facebook, quit the company and disclosed tens of thousands of internal documents to the world. Working with a lawyer from Whistleblower Aid and the Wall Street Journal – which published the first explosive reports based on her disclosures in September of that year – she was able to detail the company’s embrace of algorithms it knew were causing widespread harm. Senior staff were aware, the documents revealed, that their platforms – which include WhatsApp, Messenger and Instagram, alongside Facebook itself – promoted content dealing in extremism, hate speech and misinformation in users’ news feeds, as well as content relating to body image and self-harm. Instagram’s own internal research had shown that such content kept young users on its platform for longer, often leading to depression, anxiety and eating disorders.

Initially anonymous, Haugen went public when she testified before a US congressional committee a month later, telling senators that the company “knows ways to make Facebook and Instagram safer” but had not acted on them, “because they have put their immense profits before people”. In a blog post, Mark Zuckerberg responded that this was “just not true… If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space?”

I had started by asking Haugen what it was that had shocked her most at Facebook, compared with the other tech companies where she had worked. She was hired by the platform in 2019 to join its “Civic Misinformation” team. Haugen assumed she’d be working on fact-checking posts and removing fake news relating to the 2020 US election; she was wrong. Her team’s purview was instead all the countries where Facebook wasn’t investing in third-party fact-checkers. In North America and western Europe, in the wake of the Cambridge Analytica scandal, independent fact-checkers had been hired to try to clamp down on misinformation. But in some of the poorest and most unstable countries in the world, millions of people were using a version of Facebook that wasn’t fact-checked in any way.

Worse, for many of these users, Facebook was the internet: the company’s “Free Basics” strategy of subsidised online access, deployed to win market share in developing economies such as Pakistan, Nigeria and Indonesia, had made Facebook a primary source of news. “They went into countries that were just coming online and bought the right to be the internet,” Haugen explained, speaking via Zoom from her hotel room in Amsterdam. She was in the Netherlands to film a documentary on the social media giants; her tone was bright, her answers eloquent and considered – the past 18 months had been a crash course in engaging with the media. In developing economies, she argued, the spread of misinformation via Facebook could, and did, lead to people’s deaths. When Haugen testified before Congress, she drew a clear line between fake news on the platform and the Myanmar military’s genocide of Rohingya Muslims, warning that similar atrocities were now taking place elsewhere: “In places like Ethiopia, it’s literally fanning ethnic violence,” she told senators.

But when she and others employed by the company tried to raise the alarm, she alleged, they were met with resistance. It was expensive to combat misinformation in these markets, and anything that might increase friction on the platform and reduce usage was treated with intense scepticism. Speaking out was seen as “being negative”, Haugen told me. “At Google, when you have a cultural value that is ‘Don’t be evil’ [Google’s motto from 2004 to 2018], you open up space for conversations on ‘Are we being evil?’. At Facebook, when I said, ‘I’m working on civic misinformation; we’re trying to deal with this genocide thing,’ people did not want to engage. It was like I was talking to the void. Because you are viewed with suspicion if you don’t believe that the greatest thing you can do is connect people.” Still, she experienced little blowback after going public, something she puts down to the fact that “outside of Facebook, very few people like Facebook”.


Haugen, now 37, never wanted to be the poster girl for this cause. The daughter of a doctor and an academic turned Episcopalian priest, she grew up in Iowa, a keen maths student and a fierce member of the school debate team (skills that helped her in front of the Senate committee). She studied engineering and earned a master’s degree from Harvard Business School, before taking a job at Google.

A sufferer of coeliac disease, Haugen has had severe health problems for much of her life, and was at one point hospitalised with a blood clot that nearly killed her. That proved a turning point in an unlikely way: a friend who supported her through her recovery started falling down a misinformation rabbit hole, drawn ever deeper into conspiracy theories in the run-up to the 2016 election. When Haugen tried to fact-check some of this “news”, he responded with incredulity that she would trust the mainstream media. Two years later, Haugen joined Facebook. 

Later this month, Haugen will speak at conferences on disinformation in Cambridge and Oxford. This is her job now: she campaigns for transparency and accountability in Big Tech, gives lectures on fake news, and is developing a “simulated social network” for engineering students to analyse and explore ways of improving products. One key issue is that everyone’s Facebook feed is personalised, making it hard to determine how widespread harmful content actually is. This enabled Nick Clegg, the former UK deputy prime minister and now president of global affairs at Facebook’s parent company, Meta, to argue in a 2021 article, “You and the algorithm: it takes two to tango”, that if users were seeing hate speech or posts encouraging self-harm, it was partly their fault for following the wrong people. A simulated social network could test that argument, showing to what extent Facebook’s algorithm pushes harmful or false content.

For now, Facebook has little incentive to change. It was hard at times not to panic when talking to Haugen, who described herself as “extremely worried about the 2024 election”, pointing out that Facebook has disbanded its “Civic Integrity” team and that many of the employees who worked on tackling misinformation ahead of the 2020 election have quit. AI adds a further level of risk: “In a world where you can try 10,000 variations, you can really fine-tune conspiracy theories or misinformation,” she said, warning that, thanks to recent advances in the large language models behind tools such as ChatGPT, “the technology has already escaped”.

Haugen remains hopeful that there is time to get this right, if regulators and internet users across the world act now. “Every single time we have invented a new media technology, it’s been disruptive and then we’ve learned and responded,” she said, citing journalistic ethics, competition regulation, civic education and investment in public media as solutions to the challenges caused by the breakthroughs of the past.

What would an adequate response to the threats now posed by social media look like? Haugen has a number of ideas: requiring tech companies to publicly report the ethical costs of their work alongside the financial gains, for instance, and tightening environmental, social and governance (ESG) guidelines for Silicon Valley investors.

But the big one is transparency. “In a world where Facebook couldn’t hide secrets, suddenly there [would be] much more space to do the right thing,” she said wistfully. “That’s one of the things we’re going to be really shocked by in 30 years, that we let it continue to run ahead in the dark without the ability to inspect it.” 

And what of the villain in this story, Facebook’s founder, Mark Zuckerberg, who at 39 has personal wealth that now exceeds $100bn (more than the GDP of Bulgaria), and who still holds the only vote that matters at Meta? Haugen has said that Zuckerberg’s iron grip on the company makes cultural change almost impossible, as every senior executive knows their primary job is to keep him happy. What would she say if she were given half an hour, one on one, to make her case to him?

“I would tell him: Mark, you are such a young man!” she answered. “You have infinite money, you could do anything. Don’t you want to do something new? You could go cure malaria! You could go pursue greatness. Why continue?

“Facebook can’t really grow, because Mark can’t really grow as long as he stays there. He’s been doing the same thing since he was 19 years old. I would never wish his life on anyone else.” 

Frances Haugen’s book “The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook” is published by Little, Brown



This article appears in the 05 Jul 2023 issue of the New Statesman, Broke Britannia