A series of leaks revealed what critics of Facebook have suspected for a long time: that the company is aware of the harm its products can inflict. The leaks, which were published by the Wall Street Journal in September, showed that Facebook knows that Instagram can exacerbate mental health issues in teenage girls; that human traffickers and drug cartels freely use its services; that its algorithms reward inflammatory content; and that “elite” users are treated differently.
The whistleblower behind the leaks, Frances Haugen, a former Facebook product manager, revealed her identity for the first time in an interview with US news show 60 Minutes on 3 October. She then appeared before the US Congress, where she shared testimony about her experiences working for the social media giant.
“The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people,” Haugen told the assembled lawmakers. “Congressional action is needed.”
Facebook’s founder and CEO Mark Zuckerberg has pushed back on the allegations, writing “most of us just don’t recognise the false picture of the company that is being painted,” in a post addressed to company staff.
The leaks are not necessarily “explosive” in themselves, says Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, because “a lot of this is stuff that experts and researchers already know… What’s absolutely essential here, however, is that Haugen was able to provide receipts.”
US politicians on both sides of the political divide are keen to regulate Big Tech platforms, but a consensus hasn’t solidified around how best to do this. Haugen proposed several ways to tackle Facebook’s harms at the policy level. But experts in the social media regulation space are critical of her suggestions.
How do you solve a problem like Facebook? It depends on what you consider the company’s most egregious harm. For some, it’s the rapid proliferation of “misinformation”, hate speech and conspiracy theories. For others, it’s that Facebook is a monopoly that exerts far too much power over public discourse, and over how and what people can communicate.
Haugen falls into the former camp. She proposes undoing Facebook’s reliance on algorithmic amplification – the use of automation to serve content considered most relevant to a particular user. Instead, Facebook should be forced to present content in reverse chronological order, she argues, where the most recent posts appear at the top.
To do this, she suggests reforming Section 230 of the Communications Decency Act, the US law under which social media companies aren’t liable for the content they host, even if they moderate it. Social media companies that use algorithmic amplification should be legally liable for content, she proposes – based on the argument that companies such as Facebook aren’t passively hosting content, but intentionally directing users towards certain posts.
Reforming or repealing Section 230 is a hotly contested issue, but some experts agree with Haugen that the legislation should be reformed. “The US is an outlier in the regulation of intermediary liability,” says Nikolas Guggenberger, a clinical lecturer in law at Yale Law School. “If we look at Europe [there is] already a stricter regime when it comes to liability of platforms for content – I think a move in that direction would help to a certain extent.”
But Evelyn Douek, a lecturer on law at Harvard Law School, says that it’s uncertain what reforming Section 230 would achieve. “For many of the harms that Haugen appears concerned about, such as inflammatory and polarising content or prioritising engagement, the speech is not illegal and couldn’t be made illegal because of that pesky First Amendment,” says Douek. “That means that it’s not clear what exactly platforms could be sued for even if immunity for amplification was removed.”
Others warn that tampering with Section 230 would throw the door open to more damaging reforms in future. Collateral damage could include undermining human rights and disproportionately silencing marginalised people, according to Evan Greer, the director of Fight for the Future. This is because users outside the mainstream rely most heavily on Section 230’s protections, and would suffer most if they were stripped away.
Facebook itself is an advocate for Section 230 reforms. Although the company has pushed back on the WSJ’s reporting and Haugen’s statements, it said: “We agree on one thing: it’s time to create standard rules for the internet…it’s time for Congress to act.” Zuckerberg has previously expressed the same sentiment before Congress.
Facebook has the resources to manage an increased regulatory burden, but its competitors may not – likely bolstering the company’s market power. “That’s why Facebook would love for lawmakers to focus on Section 230 instead of on real federal data privacy legislation,” says Greer. “If Congress passed a privacy law strong enough, they could effectively kill Facebook’s surveillance capitalist business model entirely, without undermining free expression and human rights.”
Privacy wasn’t a major focus for Haugen, and despite her criticism of Facebook, she is opposed to breaking it up. “If you split Facebook and Instagram apart, it’s likely that most advertising dollars will go to Instagram and Facebook will continue to be this Frankenstein that is endangering lives around the world,” she said. “Only now there won’t be money to fund it.”
It’s an odd argument. As the New York Times’ Kevin Roose writes, Facebook’s leaked materials paint a picture of a company with its best days long behind it – haemorrhaging users to rival platforms such as TikTok, and facing dwindling engagement from younger users on both Facebook and Instagram. Roose cites the example of Facebook’s executives pondering how to monetise playdates as a mark of the company’s terminal decline.
Could a better solution be to carve Facebook up and let it wither? Haugen doesn’t want this; she believes Facebook can be better. “I don’t hate Facebook,” she wrote in a final message to her workplace. “I love Facebook. I want to save it.”
But experts say that saving Facebook is at odds with solving it. “At the core of the dysfunction in social media is a market power problem,” says Guggenberger. He believes breaking up Facebook’s monopoly to create competition is at least part of the answer. This won’t be possible “if we rule out structural reforms, breakups, or stricter anti-trust and behavioural remedies”.
Haugen’s proposals might not solve Facebook’s issues – but they could end up making it stronger.