23 April 2020

YouTube at 15: how the world’s biggest broadcaster broke the media

A system built to keep people watching videos has had huge consequences for our society and politics.

By George Grylls

In the opening scenes of the rapper Jukkie’s music video “In Too Deep”, two women return from a night in Cardiff covered in blood. They wash their hands before stripping out of their bloodied clothes. Then, as if taking part in a ritual, they stand in their underwear as the rapper drops their clothes into an oil drum and lights a match, and watch as the evidence of the crime they have just committed goes up in flames.

This music video was uploaded to YouTube last summer. It is still available to watch at any time of day and with no age filter. A BBC Wales investigation reported that Jukkie, whose real name is Laurent Mondo, has spent time in prison on drugs and knife crime charges. Following a recent spate of stabbings in Cardiff, the local police, councillors and an MP all want YouTube to take the music video down. They fear that Jukkie is normalising, even romanticising, the city’s gang culture.

“When I went to YouTube to ask them to remove it, they wouldn’t, saying it was legitimate artistic expression and they didn’t want to be seen banning grime and drill music videos,” says Stephen Doughty, Labour MP for Cardiff South and Penarth. “I’m talking about stuff that’s glorifying violence.” 

In Google’s glass-walled offices in King’s Cross, Ben McOwen Wilson, YouTube’s chief executive for Europe, the Middle East and Africa, clarified the platform’s stance: “Our position is we are not going to block drill wholesale.”

“In Too Deep” glamorises violence, but there is undoubtedly a case that it has artistic merit. What is remarkable is that the people of Cardiff – who, through a variety of public institutions, have stated their opposition to the video – have been overruled by YouTube. A piece of content that may have real consequences in Wales has been approved by an unnamed YouTube moderator sitting in front of a screen in Lisbon, Dublin or one of many unidentified locations in the UK. Such is the state of broadcasting legislation in 2020. How did we get here?

***

Fifteen years ago, on 23 April 2005, a young man stood in front of the elephant enclosure at San Diego Zoo and mumbled something self-consciously into a camera about the animals’ trunks. The man was Jawed Karim, one of YouTube’s three co-founders, and the video – the first 18 seconds of footage uploaded to the site – would change history. “Me at the zoo” was indicative of the genre that YouTube would pioneer – the online clip.

With this new, all-conquering form, YouTube was able to grow into a monopoly. “It’s a way for the world to consume and share video and there isn’t another one of those,” says Wilson. “There I don’t think of us really as being directly competitive to anyone.” Anticipating the shift towards online clips, Google purchased YouTube for $1.65bn in 2006. In 2018, Morgan Stanley estimated its worth to be 100 times that. As a consequence, the two most popular websites in the world are owned by the same company.

Since then, other tech giants have scrambled to keep up with internet users’ insatiable appetite for digital video. After the iPhone was launched in 2007, the rise of the smartphone accelerated the shift to online clips. In 2012, Facebook and Twitter bought audiovisual apps – Instagram and Vine respectively. The following year Facebook began autoplaying videos in people’s feeds, with Mark Zuckerberg publicly acknowledging in 2016 that his company was moving to a policy of “video first”.

“In general people find audiovisual an easy way to consume,” says Wilson. “That can be at the really prosaic end of things, like try reading some Ikea instructions or watch some bloke building it. Or it can be the news.”

While Facebook, Twitter, and more recently TikTok have cottoned on to the trend, YouTube remains the market leader. According to Ofcom, nine out of ten people in the UK visit YouTube every month, and young people between the ages of 16 and 34 spend an average of 64 minutes a day on the platform. For those between the ages of 16 and 24, YouTube is the most popular media source of all.

***

Dr Sharon Coen, a media psychologist at the University of Salford, notes that content on YouTube “adopts features that are typical of tabloids”. She highlights the way that the site’s users tend to frame their videos in a clickbait fashion. “They use sensationalised titles and headlines, references to identity, particular visual cues that elicit negative emotions.” Because audiovisual content can be so easily consumed and because, as a species, we are so influenced by moving images, there are strict broadcasting rules for television in the UK. No such legislation exists for the internet.

This free-for-all has consequences. YouTube has undoubtedly democratised the UK arts scene, helping to bring talents such as Stormzy and shows such as the mockumentary People Just Do Nothing to mainstream attention. But there is a strong argument that it has debased political discussion. Paul Joseph Watson, a conspiracist YouTube vlogger and Ukip activist, who makes videos with titles such as “JK Rowling is a Vile Piece of Sh*t” and “F**k the ‘Vote Remain’ Campaign”, has 1.8 million subscribers. That means that Watson – whose most popular video ponders whether the US government concealed warnings about 9/11 in the $20 bill – has an audience five times as large as Newsnight. He recently began broadcasting about coronavirus.

“Today these are the principal pipes through which many people get their news and information and view of the world,” says Damian Collins, the former chair of the Department for Digital, Culture, Media & Sport (DCMS) select committee. “We’ve got to move away from thinking about broadcasting being something the BBC and ITV do, something that’s done through dishes and signals and boxes, to thinking that it’s about audiovisual content and audiences. Where there’s a role for oversight is where the biggest audiences gather.”

Under Collins’ stewardship during the previous parliament, the DCMS select committee played a pioneering role in attempting to legislate against Big Tech. Faced with the continued ubiquity of harmful online content, the government recently announced that it was “minded to” introduce legislation. In what appears to be a watered-down version of Collins’ recommendations, the broadcasting regulator Ofcom will be given a role in publicly holding social media companies to account. The Big Tech platforms will have a “duty of care” to their users. If they fail to prevent harmful content from spreading, they could face substantial fines.

In reality, the sheer scale of YouTube means that, even if such legislation does make its way through the voting lobbies, its effect may well be cosmetic. “Ofcom has 1,000 employees in total. YouTube has 10,000 community moderators,” says Chris Stokel-Walker, a journalist and author of YouTubers: How YouTube Shook Up TV and Created a New Generation of Stars. “And even they can’t keep up with the borderline cases.”

What is more, the proposed legislation confronts only content relating to terrorism, paedophilia and racism. The government’s white paper fails to address content on YouTube that would contravene British broadcasting law requiring current affairs and news to be reported with impartiality and balance, and it does not refer to the idea of broadcasting as a public service.

“The people who set up the BBC were not interested in what was good for the industry, but in what was good for society,” says David Hendy, a historian of the British media, reflecting on the last big broadcasting boom. “In the aftermath of the First World War, the point of broadcasting was to heal society.”

In contrast, YouTube’s founders Steve Chen, Chad Hurley and Jawed Karim sold their stakes in the firm just 18 months after establishing it. The site makes its money through content that grabs people’s attention; the substance of that content is less important. This is why a sensationalist huckster like Paul Joseph Watson can broadcast to millions.

“Why aggressive, anxiety-provoking, maudlin, polarising discourse should prove more profitable than its opposite is a mystery,” George Saunders writes in the title essay of his 2007 collection, The Braindead Megaphone. “In surrendering our mass storytelling function to entities whose first priority is profit, we make a dangerous concession: ‘Tell us,’ we say in effect, ‘as much truth as you can while still making money.’ This is not the same as asking ‘Tell us the truth.’”

***

When you watch a video on YouTube – let’s say, for argument’s sake, a compilation of Arsenal goals from the 2003-04 “Invincibles” season – a series of recommended videos appears on the side of the viewing window. Because YouTube makes money through advertising, its main aim is to keep its viewers watching videos for as long as possible. It does this through its recommendation algorithm.

Using the information that you are in an Arsenal-watching mood, the recommendation algorithm populates the webpage with related videos – perhaps Thierry Henry’s flicked volley against Manchester United in 2000 or Arsène Wenger’s final press conference from 2018. When the first video ends, YouTube automatically plays another.

The cumulative effect of this will be familiar to many; it is mind-numbing. Your critical faculties shut down. Seconds fade into minutes, which fade into hours. You become a lotus-eater of personalised pixels.

But this is not confined to a single sitting. Over time, the algorithm uses your viewing history and any other information the site can gather about you to predict which recommendations will make you more likely to watch another video, and another, and another. In 2018, YouTube’s chief product officer Neal Mohan admitted that 70 per cent of videos watched on the platform were viewed via the recommendation algorithm.
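The mechanics can be pictured as a ranking problem. What follows is a minimal, hypothetical sketch of an engagement-driven recommender – it is not YouTube’s actual system, and the Video class, predicted_watch_time and recommend functions are invented for illustration. Candidate videos are scored by a crude proxy for how long a given user is likely to keep watching, and the highest-scoring ones are surfaced.

```python
# A minimal, hypothetical sketch of an engagement-driven recommender.
# Every name here (Video, predicted_watch_time, recommend) is invented for
# illustration; none of this is YouTube's real code or API.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    topic: str
    intensity: float  # 0.0 = mild, 1.0 = extreme

def predicted_watch_time(video: Video, history: list[Video]) -> float:
    """Crude proxy for engagement: videos close to what the user already
    watches, and slightly more intense than their average, score higher."""
    if not history:
        return 1.0
    topic_match = sum(1 for v in history if v.topic == video.topic) / len(history)
    avg_intensity = sum(v.intensity for v in history) / len(history)
    intensity_score = 1.0 - abs(video.intensity - min(1.0, avg_intensity + 0.1))
    return topic_match + intensity_score

def recommend(candidates: list[Video], history: list[Video], k: int = 5) -> list[Video]:
    """Rank candidates purely by predicted engagement and return the top k."""
    return sorted(candidates, key=lambda v: predicted_watch_time(v, history), reverse=True)[:k]
```

Nothing in a loop like this asks whether a video is true, harmful or good for the viewer; the only signal it optimises is whether the video is likely to be watched.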

The algorithm does not make these predictions in terms of preferences or moral choices, as a human would; it unremittingly rewards videos that keep the viewer engaged. The effects of this can be disturbing. In 2019, the New York Times reported that a home video of a child playing in a paddling pool in Rio de Janeiro, initially shared only with the child’s family, was watched 400,000 times in a matter of days because the algorithm was recommending it to users who had previously watched other videos of partially clothed children.

Another effect of this logic is that viewers are shown videos that further intensify emotions elicited by earlier videos. YouTube does not measure emotional responses, but it measures their effects. Just as drug addicts chase an ever more intense high, so YouTube viewers are more likely to remain enthralled if the subject matter they are presented with is gradually intensified. Just one more click. Just one more video. Critics call this the “rabbit hole effect”.
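The worry critics describe can be made concrete with a toy simulation. It rests on one purely illustrative assumption – that a viewer is marginally more likely to keep watching something a little more intense than the last thing they saw – and the pick_next function and intensity numbers below are invented, a sketch of the dynamic rather than a measurement of YouTube. Under that assumption, a picker that optimises only for continued attention drifts steadily towards the extreme end of the catalogue.

```python
# Toy simulation of the "rabbit hole" dynamic. The viewer model, the catalogue
# and the intensity numbers are illustrative assumptions, not data from YouTube.
def pick_next(current_intensity: float, catalogue: list[float]) -> float:
    """Attention-maximising choice: the video the viewer is assumed most likely
    to keep watching, modelled here as slightly more intense than the last."""
    target = min(1.0, current_intensity + 0.05)
    return min(catalogue, key=lambda i: abs(i - target))

catalogue = [i / 100 for i in range(101)]  # intensities from 0.00 to 1.00
intensity = 0.1                            # the session starts with something mild
for step in range(1, 21):
    intensity = pick_next(intensity, catalogue)
    if step % 5 == 0:
        print(f"after {step} videos: intensity {intensity:.2f}")
# Prints a steady drift upwards: 0.35, 0.60, 0.85, 1.00
```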

“They are not only curating the content, they are doing so with a commercial imperative,” says Damian Collins. “The question is then: does it matter what you direct people to, as long as you hold their attention?”

If you are watching Arsenal videos, intensification is not really a problem. You move from the greatest Dennis Bergkamp passes to the most insane Robin van Persie volleys. But if the subject matter is vaccines, pandemics, or politics, then it is. In 2019 a New York Times investigation argued that a fringe politician called Jair Bolsonaro gained enough popularity to run for the Brazilian presidency, in part because he benefited from YouTube’s recommendation algorithm.

Everything from the rise of climate change denial to the recent resurgence of German neo-Nazism has been linked to the tendency towards extremes that keeps people watching online video clips.

Yvette Cooper, who as Labour chair of the home affairs select committee looked into extremist material online, found herself being recommended more and more far-right videos because of her research into the platform. She recently concluded that the website was “an organ of radicalisation”.

***

When I put this to Ben McOwen Wilson in a plywood meeting pod at Google’s London headquarters, he admitted that there had been problems with the algorithm, but said that these had been addressed. “Last year we began down-rating content that was not in breach of our guidelines, not in breach of our policies, certainly not in breach of the law – but largely, not great content that we would be super-proud of to have on our platform,” says the former director of online at ITV. “We are not seeking to deliberately inject balance,” he adds.

“There are,” he says, “two exceptions. One is around violent extremism messaging, when we seek to push [users] down a different route. The second is where it is explicitly conspiracy theories. Then we absolutely surface links out to Encyclopaedia Britannica or Wikipedia or other videos on the site that say these shootings did happen, people did land on the moon, the Earth is not flat.” If you go on YouTube today, you will find the page covered with links to NHS pages with information on coronavirus.

I gave Wilson an example of where, despite the improvements, I thought the algorithm was still going wrong. Christopher Hitchens is popular on YouTube. Although his politics were hard to nail down, for most of his life (including his time at the New Statesman in the 1970s), Hitchens was pretty solidly associated with the left. Yet his criticism of Islam and his forceful style of debating have led to him being bracketed by YouTube’s algorithm as a figure of the right. Inadvertently or not, an editorial decision has been made about Hitchens and it is influencing the beliefs of millions of people.

“Did you get a Richard Dawkins video?” asked Wilson, suggesting what YouTube ought to have recommended to me after an initial viewing of a Christopher Hitchens video. “From Richard Dawkins you might have got to David Attenborough. And from David Attenborough you might have got to penguin videos. But you’re not telling me that story.”

When I got home from the interview with Wilson, I tried to map the series of recommendations that Wilson had suggested. I typed “Christopher Hitchens” into the search bar of a clean browser and clicked on the first video that YouTube presented me with – “Hitchens delivers one of his best hammer blows to cocky audience member”. As Wilson suggested, after this video, I was indeed recommended a Richard Dawkins video – “Confused girl questioning Richard Dawkins religion”.

However, at this point the algorithm took me to a different place from the one Wilson suggested. I did not get David Attenborough and I did not get penguins. Instead, I got a more extreme version of the previous video – “Richard Dawkins Tears Muslim Politician a New One”. Then came videos of Jordan Peterson, the outspoken Canadian psychology professor. In three short clicks I had gone from watching Hitchens express his hope for “the good relations that could exist between different peoples, nations, races, countries, tribes, ethnicities” to watching white men aggressively confront Muslim women about their beliefs.

***

There is a reason why you can’t find pornography on YouTube. Every pixel of video uploaded to the site is trawled for nipples, buttocks and genitalia by “flesh filters”. Likewise, the appearance of certain racist words will set off algorithmic checks.
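Neither the “flesh filter” nor the word list is public, but the general shape of such automated screening can be sketched. The snippet below is a hypothetical illustration only: it assumes an upstream classifier has already scored each sampled frame for nudity, checks the transcript against a blocklist, and escalates anything over a threshold to a human moderator. The names, threshold and blocklist are invented.

```python
# Hypothetical sketch of automated upload screening. The blocklist, threshold
# and frame scores are stand-ins, not YouTube's real systems.
from typing import Iterable

BLOCKLIST = {"example_slur"}   # placeholder terms checked against the transcript
NUDITY_THRESHOLD = 0.8         # assumed cut-off for escalating a frame to review

def screen_upload(frame_scores: Iterable[float], transcript: str) -> bool:
    """Return True if the upload should be escalated to a human moderator.
    frame_scores are per-frame nudity probabilities from an upstream classifier."""
    if any(word in BLOCKLIST for word in transcript.lower().split()):
        return True
    return any(score >= NUDITY_THRESHOLD for score in frame_scores)

# e.g. screen_upload([0.05, 0.92, 0.10], "official music video") returns True
```

Even in this skeletal form, someone has to choose the threshold and write the list – which is exactly the kind of editorial decision at issue.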

YouTube argues that it makes no editorial decisions, or rather that such decisions are so slight as to not require oversight. “We’re not the editor,” says Wilson. “I’m not signing off on it, I’m not approving of it, I’m not disapproving of it.” Nevertheless, even if technology enacts the removal of porn from YouTube, a human being made the original decision to ban it. The more you dig, the more it becomes clear that throughout the business there is a lot of editing, and most of it is done by human eyes.

“We have to work with the UK on knife crime and gang violence,” says Wilson, explaining YouTube’s policy of obeying “trusted flaggers”, who advise the company on the removal of certain videos. Charities, welfare groups and public bodies lend their expertise to YouTube’s 10,000 worldwide moderators so that they can make locally sensitive decisions on how to respond to difficult issues such as knife crime. “We work with Catch22, Red Thread, the Mayor’s office, the Metropolitan Police.”

However, as the case of “In Too Deep” shows, there is scope for error, disagreement or a combination of both. An anonymous moderator made a subjective decision: that the artistic merit of the video outweighed its potential for harm. As a society, we need to ask ourselves if we consider it appropriate to leave YouTube to its own devices, and to allow unnamed employees to remain the authority on such questions.

The implication of such a decision would be that we are moving to a more American approach to broadcasting, in which private companies enforce their own standards. “The role of the state is to draw the line between legal and non-legal,” says Jim Killock, a digital campaigner at Open Rights Group, which argues that such editorial decisions should remain in the hands of the tech companies. “A regulator will be making political decisions.”

***

The online clip is becoming the dominant media form. In our pockets, rucksacks and handbags, we hold a device that can film, edit and broadcast in a matter of seconds. That does not mean, however, that broadcasting standards cannot be upheld.

“There’s a difference between someone posting a video of their cat for their friends, and someone who is deliberately building their career and their reputation and an audience of very large numbers, using YouTube as the platform through which to do it,” says Damian Collins. “It’s the separation of the idea between freedom of speech, and freedom of reach.”

The alternative to this is to continue to allow huge audiences to watch unregulated videos recommended by an algorithm guided primarily by a commercial imperative. To do so would cement a change in the role of broadcasting in our democracy: that it is no longer a public service.

This has profound implications. Last month it emerged that many of YouTube’s moderators have been sent home because of the coronavirus pandemic. That implies that during an unprecedented crisis, the world’s biggest broadcaster will rely to a greater extent on algorithms to sort irony from sincerity, information from propaganda, and fact from fiction.

In a video entitled “Coronavirus,” the YouTuber Paul Joseph Watson claims that there is “no pandemic in Russia”, that the World Health Organisation has tried to “language-police the words you’re allowed to use to describe coronavirus”, and that the dangers of the disease may have been “exaggerated to whip up hysteria for political purposes”. At the time of writing, this video has been watched 879,000 times.

The false theory that 5G wireless technology is spreading the virus has also been linked to YouTube. The New York Times calculated that the ten most popular 5G conspiracy videos posted on YouTube were watched 5.8 million times in March. And while YouTube was reported by the BBC to have “banned all conspiracy theory videos falsely linking coronavirus symptoms to 5G networks” after complaints about an interview with David Icke in which he expounded his demonstrably false theories about the pandemic, Icke continues to broadcast misinformation about Covid-19 to his 880,000 subscribers.

The lax regulation of online broadcasting has real and dangerous results, of which the burning of 5G masts across the country is but one example. As the UK drifts towards an American-style broadcast model, the implications of this should be considered by the government as a matter of the utmost urgency. During this pandemic, we are already paying the costs of our inaction.
