Every so often a quote will escape from Silicon Valley that, even by the standards of Silicon Valley hubris, makes me wonder whether these people are trying to sound like super-villains or if they’ve simply lost all purchase on the world outside their cubicles. Here’s Google Brain researcher Minmin Chen extolling the power of a new YouTube recommendation algorithm called Reinforce, designed to expand users’ tastes and so get them to spend more time on the site: “We can really lead the users towards a different state, versus recommending content that is familiar,” said Chen, who made the comments at an artificial intelligence conference in February.
A multinational media corporation boasting about its ability to incite a “different state” in its viewers has a dystopic ring anyway, but Chen’s boast took on a more sinister cast when it was quoted last week in a New York Times feature called “The Making of a YouTube Radical”. The article focused on the YouTube history of one user, Caleb Cain, whose journey follows the archetype of the young, disenfranchised online male: an initial liberalism worn down by self-help advice from male authority figures, who then ushered him along to anti-feminism and white nationalism, before he finally became absorbed in the world of left-wing response videos to his former right-wing viewpoints.
Because YouTube’s business is about keeping people on the site, its single most important function is to make recommendations that encourage users to keep watching, by incorporating data about the content as well as drawing on information about what similar users liked: “If this is what you want, have more of it.” My own YouTube “time-sink” is skincare, and a few hours of carelessly following the “up next” sidebar can take me from a comparison of serums, to a blackhead extraction in an LA aesthetician’s office, to watching horrified as some clearly uninsured resident of a Tennessee trailer park has a DIY cyst removal.
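That “if this is what you want, have more of it” logic can be illustrated with a toy co-occurrence recommender: score each unseen video by how many taste-alike users have watched it. (The names and data here are invented for illustration; YouTube’s actual system is a vastly more complex, machine-learned pipeline.)

```python
from collections import Counter

# Hypothetical watch histories: user -> set of video ids (invented data).
histories = {
    "u1": {"serum_review", "cyst_removal", "blackhead_extraction"},
    "u2": {"serum_review", "blackhead_extraction"},
    "u3": {"serum_review", "cyst_removal"},
    "u4": {"news_clip"},
}

def recommend(user, histories, k=2):
    """Suggest the k videos most watched by users with overlapping tastes."""
    seen = histories[user]
    scores = Counter()
    for other, videos in histories.items():
        if other == user or not (seen & videos):
            continue  # only count users who share at least one video with us
        for v in videos - seen:
            scores[v] += 1  # "people like you also watched this"
    return [v for v, _ in scores.most_common(k)]

print(recommend("u2", histories))  # ['cyst_removal']
```

Even this crude sketch shows the dynamic the article describes: the system never asks whether a video is good for you, only whether people who resemble you kept watching it.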
It’s a grubby business, but not half so grubby as the trail that led to Cain watching a video called “Blacks Are More Racist Than Whites Now”. He has repudiated the positions on sexism and race that he once consumed for hours on end, describing his earlier self as having been “brainwashed” by an ideology that served his need for a “sense of belonging”. However, thinking of yourself as a victim of brainwashing is hardly a sign that you’ve taken responsibility for your own opinions, and the change may be little more than superficial.
“What is most surprising about Mr. Cain’s new life, on the surface, is how similar it feels to his old one,” notes the article. “He still watches dozens of YouTube videos every day and hangs on the words of his favourite creators. It is still difficult, at times, to tell where the YouTube algorithm stops and his personality begins.” Cain would probably never have found the far right without YouTube. He thinks he has reasoned his way out; perhaps YouTube has simply directed the same impulses in another direction.
The strange entanglement of social media and politics is yet to be fully understood, and when things seem to be going wrong, the impulse is often to pin the blame on a particular bad character (such as a worker in a Russian troll factory). Often, though, the culprit is ourselves: a normal human, with normal human weaknesses, such as a hunger to be part of a social group and a susceptibility to the internal reward systems triggered when we go for just one more click.
But if Cain seems to have undergone his odyssey from left to right and back again without learning anything substantial about himself, at least he doesn’t seem to have become appreciably worse in that time. The same can’t necessarily be said for YouTube itself, which has only become more adept at pulling users down rabbit holes. And because YouTube uses “black box” algorithms (generated by machine learning rather than directly controlled by engineers), it’s often the case that no one – not the company, nor the content creators, nor the viewers – has a clear idea of why someone is being sent into a particular abyss.
One exceptionally disturbing example was the revelation that YouTube was automatically recommending videos of partially clothed children to certain users. These videos would be innocent enough by themselves, but once they were gathered together, the effect was very different. “You like this, have more of it” – where the parameters of “it” have been set by exactly the kind of person you would imagine to have an interest in watching partially clothed children. Unthinkingly, YouTube had created an online environment in which paedophilia was normalised.
The nature of being human is that all of us are a negotiation between the individual and the collective. The things we believe, the things we desire and the things we do all spring not from some innate font of the self, but from the grey area where private disposition meets public possibility. Social media reshapes that grey area, and so inevitably it reshapes each of us, leading us towards that different state; yet the way it acts on us is largely hidden away in algorithms and occluded by unfounded confidence in our own rationalism.
It’s not that social media alone can turn anyone into a white nationalist, or a child molester, or a rabid cyst-squeezer. But social media can feed inklings, directing people along paths they would never go down in their daily lives. The consequences are scary, and the ructions we’re encountering now are probably the barest beginnings of what social media will ultimately mean for the public and private sphere. That is, it’s scary unless you’re a company whose business model is engagement, and for whom extremism and polarisation offer an opportunity for profit.
This article appears in the 12 Jun 2019 issue of the New Statesman, The closing of the conservative mind