In 2016, a skinny vegan violinist named Nicholas Perry announced on YouTube that he was giving up veganism for health reasons. He began posting mukbang videos online, tapping into a trend that originated in Korea of watching people eat large quantities of food on camera. Soon Perry, who renamed himself Nikocado Avocado, was eating entire fast-food menus and mountains of Flamin’ Hot Cheetos and instant noodles for his digital audience. Within a few years his weight had ballooned to more than 400lb (around 180kg), and he sometimes appeared in videos wearing a CPAP mask or riding a mobility scooter. Many of his hour-long videos have been viewed over a million times, but I found them too disturbing to watch for more than a few minutes: they resemble an unending, slow-motion breakdown as Perry cries and rants and screams and flutters his eyelids and forces huge forkfuls of food into his mouth. Perry’s story is an extreme example of a feedback loop many people are familiar with: you post something online – perhaps a snarky comment or a cat video or a bikini snap or a long, informative Twitter thread – and notice a spike in interest, which encourages you to post more of the same, until snark or cats or thirst traps or mansplaining become central to your online identity.
This is one of many mechanisms the writer and technologist Tobias Rose-Stockwell identifies by which social media encourages people to take extreme positions. Facebook’s own data has shown that borderline content – posts that almost meet the threshold for being banned as too graphic, misleading or offensive – attracts much higher engagement than more anodyne posts. We cannot help but gawk at the metaphorical (or actual) car crash in our social media feeds, and when we do we often feel compelled to respond, to signal our disgust or anger. This, in turn, triggers others to respond, creating a cascade of outrage.
This emotional volatility is often heightened because the information we receive on social media – a photo or short video clip, an angry tweet – is stripped of its wider context. This can make it more inflammatory, because we are not witness to any attenuating circumstances, and it contributes to a phenomenon Rose-Stockwell calls context creep, as people begin to project different meanings onto a controversy. A famous person, let’s call him Jim, is having a bad day and sends a rude tweet, which one group of outraged Twitter users conclude is subtly directed at his female colleague, proving that Jim is a raging misogynist. This triggers another group of outraged Twitter users to point out that it is dangerous to pile onto a vulnerable figure, Jim having spoken recently of his drug addiction. At which point, another group of angry users leaps in to rage about the stigmatisation of drug users, or the unassailability of male privilege… and so on.
Rose-Stockwell suggests it’s not coincidental that one of the most viral images on the internet – an image that at one point attracted 14,000 views per second on Tumblr – was “The Dress”, a photograph of an item of clothing that appeared blue-and-black to some viewers and white-and-gold to others. The image was ambiguous, but it encouraged certainty: those who saw it as blue-and-black couldn’t imagine how anyone could see it otherwise. Many of our moral emotions function in a similar way. Our judgments are rapid, instinctive and guided by our social group. When we clash with others online we are no more able to explain our values than we would be able to explain the neuroscience of “The Dress”.
In a book that makes heavy use of both graphs and metaphors, Rose-Stockwell asks readers to imagine a huge room with four billion humans inside, each with a microphone. Everyone is facing an electronic noticeboard under a sign that reads “the world’s most important sentence”. Something completely offensive appears, and then the words “if you don’t agree you are evil” followed by “if you agree, shame on you”. Eventually, you blurt out “this is ridiculous” and then notice a little ticker on your microphone that rates how much people have engaged with your outburst: four points. It’s a caricature, of course, but contains enough truth that, not for the first time, I felt a little sick imagining how much time I have spent in the digital equivalent of this room. When he recently launched Threads, a rival to Twitter, Meta’s CEO Mark Zuckerberg promised a “friendly place”, with an algorithmic feed that would de-rank news. Will this be enough to shift the vibe in the room, and if it does, what will it mean for Threads’ future? It’s possible that Twitter junkies cannot tear themselves away from the fights and melodrama.
Rose-Stockwell is a curious figure. This book includes an aside about how he only eats meat from species he has personally killed. In the early Noughties he unwittingly uncovered some secrets to internet virality. While backpacking in Cambodia he was persuaded by a monk to help a local village rebuild a reservoir. Rose-Stockwell was 22 at the time and emailed his friends, telling them about the villagers’ plight and asking for help. His friends started forwarding the email, and soon he was receiving messages from strangers asking what they could do. He ended up spending seven years in Cambodia, managing the reservoir project. He became interested in how the internet could be used to promote empathy.
In 2009, after returning to California, he made friends with social entrepreneurs who shared his interests: one friend founded the petition platform Change.org; another group was behind the viral film Kony 2012, made to raise awareness of atrocities committed by the Ugandan warlord Joseph Kony. (It’s unclear from Rose-Stockwell’s account what role he played in these efforts, or what exactly his job was.)
Rose-Stockwell noticed in the early 2010s that even as popular sites such as Upworthy were coming to embody the way the internet could be used to pull on people’s heartstrings, they also showed how it could stoke moral outrage and deepen social division. The Yale psychologist Paul Bloom has observed that empathy is double-edged: it’s an emotion that can invoke violence as well as compassion, and it is easily manipulated. Politicians, for instance, exploit our capacity for empathy when they highlight atrocities committed by the enemy to drum up support for war.
In 2018 Rose-Stockwell started collaborating with the US psychologist Jonathan Haidt, who writes on how Americans have become less tolerant of challenging ideas and how social media has harmed young people’s mental health. In a foreword for Outrage Machine, Haidt credits Rose-Stockwell with helping him better understand the role played by technological design in these trends. The book, Rose-Stockwell’s first, was inspired by the success of a Medium post he published in late 2016 explaining how Facebook “broke democracy” – contributing to the lack of empathy and understanding between Trump and Clinton supporters with its algorithmic news feed, which ensured that many people did not encounter news that challenged their world-view. Outrage Machine also explores the history of journalism, and the proper role of tolerance and compromise in a well-functioning liberal democracy. This is, in other words, an ambitious book, albeit one that treads a lot of familiar ground. But Rose-Stockwell’s analysis is cogent and accessible.
The book concludes with a few proposed solutions, some of which are sensible if banal: he writes that it’s a good idea to limit your exposure to social media sites that are making you feel bad, to focus on building real friendships and relationships rather than online reach, and to try to disagree with people productively by, for instance, not insulting them. He also suggests a low news diet, focused on wire services such as AP or Reuters and news sites such as The Flip Side that try to present both sides of an argument. (I am biased, of course, but this sounds very dull and is quite condescending.) He suggests that social media sites should have governing constitutions that can be amended by users, and a ranking algorithm that is transparent and that “proportionately serve[s] the most accurate information available” by “[prioritising] statistical facts above emotional anecdotes”. But who determines what’s accurate?
It’s a conclusion that reveals a naive belief that important truths can be condensed down into facts and figures, that if only we stopped winding each other up online we’d find the answers. Social media might amplify discontent and division, but that doesn’t mean we can code our way out of cultural conflict. ●
Piatkus, 416pp, £14.99
This article appears in the 26 Jul 2023 issue of the New Statesman, Summer Special