A still from Dishonored.

Why are we still so bad at talking about video games?

In the past 30 years, video games have become more beautiful, more intricate and more intense – but we still lack a critical language to evaluate them. Will we ever move beyond previews and reviews?

I can’t remember the first computer game I played. It might have been Killer Gorilla, which was written by a British 17-year-old called Adrian Stephens who had seen screenshots of Donkey Kong in a magazine and decided to make his own version in his bedroom.

Killer Gorilla was published in 1983, the year I was born, so it must have been hanging round in my brother’s collection for several years before I played it. In those days, games came on a cassette tape, which whined with static if you put it in a music player. The machine we had was an Acorn Electron – another knock-off, this time of the more expensive BBC Micro.

Looking at pictures of Killer Gorilla now, it’s hard to believe it kept me occupied for so long, furiously tapping away at the keyboard – Z for left, X for right, and “return” to jump. There was no story (save the jealous love of a primate for a princess), the graphics were basic and the sound consisted mostly of a sad “bingy bongy boo” whenever you died, which was often.

Compare that with the big-name releases in the run-up to Christmas 2012 – the so-called triple-A titles that dominate games magazines and newspaper reviews. In the past few weeks, I’ve played three of the best: Bethesda’s steampunk stealth adventure Dishonored, Gearbox Software’s sarcastic space western Borderlands 2 and 343 Industries’ straight-faced military romp Halo 4. Each will have cost more than £15m to make, and several million more to market, and will have involved hundreds of people (Halo 4 had 300 in the game development team alone).

These games are gorgeous, delivering both sweeping vistas and fine-grained details, and Dishonored, in particular, has a voice-acting cast to rival a Hollywood film: Susan Sarandon, Chloë Grace Moretz and Mad Men’s John Slattery. They are all critically acclaimed, with each scoring around 90 per cent on the review aggregator site Metacritic.

And yet, I can’t help feeling that something is missing. Technically, video games have matured hugely since I was mashing the Electron’s keyboard in the 1980s, but I don’t have the conversations about them that I have about books or film or music. Channel 4’s GamesMaster, which ran from 1992 to 1998, passed me by, and I can think of only one recent television programme I’ve seen devoted to games: Charlie Brooker’s one-off Gameswipe. Most newspapers have a single short review a week, if that, and games are rarely mentioned on bastions of arts programming such as Radio 4 or BBC2. Discussion of games focuses heavily on whether a particular title is worth buying.

Now, you might not find that surprising – because you think games are a niche pursuit or that they’re new. But you’d be wrong on both counts. In the US, 245.6 million video games were sold in 2011, according to the Entertainment Software Association. Microsoft says users have spent 3.3 billion hours playing its Halo series online. Read that again: 3.3 billion hours. As for being newfangled, how about this: a ten-year-old who played Pong when Atari first released it will have celebrated her 50th birthday this year.

Does this matter? It does if you think the unexamined hobby is not worth having. And it does if you wonder, like me, whether the lack of a serious cultural conversation about games is holding back innovation. The background of games in programming culture meant that for many years their development was seen purely in terms of what they could do. But while, say, improved graphical rendering means that modern titles look astonishing, I find myself thinking: is it really such an achievement for a sunset to look 96 per cent as good as a real one?

In 2004, Kieron Gillen wrote a much-referenced essay called “The New Games Journalism”, in which he eviscerated most of his contemporaries for being unimaginative drones, who churned out previews and reviews, and stopped writing about a game at the exact moment their readers started playing it.

He rejected the idea that “the worth of a video game lies in the video game, and by examining it like a twitching insect fixed on a slide, we can understand it” and instead urged writers to become “travel journalists to imaginary places”. The New Games Journalism would be interesting even to people who would never visit those places.

Gillen’s article prompted much soul-searching, and many sub-Tom Wolfe pieces in which people bored on for thousands of words about seeing a pixel and suddenly understanding what love was. But eight years later, the state of games writing is bleaker still. Metacritic, which I mentioned earlier, presents an obvious problem. The industry places enormous weight on the scores it aggregates; as Keza MacDonald of the gaming website IGN noted, “eager, inexperienced writers from smaller sites have been known to give very high scores knowing that their review will appear near the top of the listings and refer traffic”.

“As games have developed and there are more interesting things to talk about, like their narratives, their artistic statements, occasionally even their cultural significance, reviews are still often expected to be an overview of a game’s features with a numerical value on the end,” MacDonald tells me. “This is as much the audience’s problem as the outlets’. Readers expect scores and they expect ‘objective’ analyses of games, even as the games themselves have got to a point where that’s not possible any more.”

Gillen is surprisingly relaxed about the direction criticism has taken since his manifesto (and he has now “retired” from games journalism to write comics). “I’ve learned to be philosophical about this one,” he tells me. “The old has always feared and suspected the new. They’ll reject the new for failing to match the old on the old’s terms, failing to realise that its achievements are entirely separate . . . Fundamentally: eventually old people die.”

Elsewhere, however, others are continuing the fight he started. Naomi Alderman is a novelist, a games critic and a games writer, and she concurs that we need to find a way to write about games for people who don’t play them. “You need the vocabulary of an art critic to talk about the graphics, of a novel critic to talk about the storytelling, of a film critic to talk about the performances: not to mention music criticism, and gameplay criticism,” she says. “We need to find a way to talk about what’s interesting about a game – what makes the gameplay so enjoyable, what’s great about the aesthetics, how good the narrative is, and where it fits among other similar games.”

Playing Halo 4, Borderlands 2 and Dishonored side by side made me think of all the common features of first-person shooters: the tropes born of necessity, like slowly opening gates to disguise loading times, or travels by boat or aeroplane to keep you still while expository dialogue is delivered. But there’s so little criticism out there that considers games of the same genre together: in fact, the only sustained critique of the “narrator” character common to many shooters – because you need someone to tell you where to go and what to do – comes from 2007’s BioShock, where that control itself becomes an integral part of the story.

Perhaps that revolution in games criticism will never happen. Ed Stern, who was a writer on the 2011 shooter Brink, says: “It’s currently easy for the book-literate to find everything fascinating about games other than the games themselves. Culturally, sociologically, technologically, in terms of gender and race and sexual and generational politics, they’re a fascinating prism. They just tend not to mean very much in themselves – because it’s spectacularly, trudgingly hard to make games mean things, not least because the big ones are made by so many different pairs of hands.” For the sake of readers – and writers – I hope he’s wrong.

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.

Picture: Bridgeman Images

The people is sublime: the long history of populism, from Robespierre to Trump

If liberal democracy is to survive, the tide of populism will have to be turned back. The question is: how?

A spectre of populism is haunting the world’s liberal democracies. Donald Trump’s victory in the US presidential election, the narrow Leave majority in the EU referendum, Theresa May’s decision to call a snap election – breaking the spirit of the Fixed-term Parliaments Act passed by the government of which she was a member – and Recep Tayyip Erdogan’s victory in the recent Turkish referendum all testify to the strength of the populist tide that is sweeping through the North Atlantic world. The consequences have been calamitous: a shrunken public realm, a demeaned civic culture, threatened minorities, contempt for the rule of law and an increasingly ugly public mood. If liberal democracy is to survive, the tide will have to be turned back. The question is: how?

The first essential is to understand the nature of the beast. This is more difficult than it sounds. Most democratic politicians seek popularity, but populism and popularity are not the same. Today’s populism is the descendant of a long line of ancestors. The first unmistakably populist movement in history appeared well over two centuries ago during the later stages of the French Revolution. It was led by Robespierre (Thomas Carlyle’s “sea-green incorruptible”) and the Jacobins who promised a reign of “virtue”. They were inspired by the cloudy prose of Jean-Jacques Rousseau, who believed that mere individuals should be subject to the general will of the social whole and – if necessary – “forced to be free”. As the revolution gathered pace and foreign armies mustered on France’s frontiers, the Jacobins launched the first organised, state-led and ideologically legitimised Terror in history. Chillingly, Robespierre declared, “The people is sublime, but individuals are weak.” That is the cry of populists through the ages. Appropriately, the Terror ended with Robespierre lying on a plank, screaming with pain before he was executed by guillotine.

The French Revolution – which began with the storming of the Bastille and ended with Napoleon’s ascent to an ersatz imperial throne – has an epic quality about it missing from later chapters in the populist story. Ironically, the second chapter, which opened half a century later, was the work of Louis Bonaparte, nephew of the great Napoleon. In 1848 came a second revolution and a second Republic; Louis Bonaparte was elected president by a huge majority. He tried and failed to amend the constitution to make it possible for him to have a second term; and then seized power in a coup d’état. Soon afterwards he became emperor as Napoleon III. (“Napoleon le petit”, in Victor Hugo’s savage phrase.) The whole story provoked one of Karl Marx’s best aphorisms: “History repeats itself; the first time as tragedy and the second as farce.”

There have been plenty of tragedies since – and plenty of farces, too. Trump’s victory was a tragedy, but farcical elements are already in evidence. Erdogan’s victory was even more tragic than Trump’s, but farce is conspicuously absent. The Leave victory in the referendum was tragic: arguably, the greatest tragedy in the three-century history of Britain’s union state. As with Trump, farce is already in evidence – the agitated comings and goings that have followed Theresa May’s loss of her Commons majority; the inane debate over the nature of the Brexit that Britain should seek; and the preposterous suggestion that, freed of the “Brussels” incubus, Britain will be able to conclude costless trade deals with the state-capitalist dictatorship of China and the “America First” neo-isolationists in Washington, DC. Unlike the French farce of Napoleon III’s Second Empire, however, the British farce now in progress is more likely to provoke tears than laughter.


Picture: André Carrilho

Populism is not a doctrine or a governing philosophy, still less an ideology. It is a disposition, perhaps a mood, a set of attitudes and above all a style. The People’s Party, which played a significant part in American politics in the late 19th century, is a case in point. The farmers whose grievances inspired the People’s Party wanted cheaper credit and transport to carry their products to markets in the eastern states. Hence the party’s two main proposals. One was the nationalisation of the railways, to cheapen transport costs; the other was “free silver” – the use of silver as well as gold as currency, supposedly to cheapen credit. Even then, this was not a particularly radical programme. It was designed to reform capitalism, not to replace it, as the largely Marxist social-democratic parties of Europe were seeking to do.

Rhetoric was a different matter. Mary Elizabeth Lease, a prominent member of the People’s Party, declared that America’s was no longer a government of the people by the people and for the people, but “a government of Wall Street, by Wall Street and for Wall Street”. The common people of America, she added, “are slaves and monopoly is the master”.

The Georgian populist Tom Watson once asked if Thomas Jefferson had dreamed that the party he founded would be “prostituted to the vilest purposes of monopoly” or that it would be led by “red-eyed Jewish millionaires”. The People’s Party’s constitutive Omaha Platform accused the two main parties of proposing “to sacrifice our homes, lives and children on the altar of Mammon; to destroy the multitude in order to secure corruption funds from the millionaires”. The party’s aim was “to restore the government of the Republic to the hands of ‘the plain people’ with which class it originated”. Theodore Roosevelt promised to “speak softly and carry a big stick”. The People’s Party spoke noisily and carried a small stick. Jeremy Corbyn would have been at home in it.

Almost without exception, populists promise national regeneration in place of decline, decay and the vacillations and tergiversations of a corrupt establishment and the enervated elites that belong to it. Trump’s call to “make America great again” is an obvious recent case. His attacks on “crooked Hillary”, on the courts that have impeded his proposed ban on Muslim immigrants from capriciously chosen Middle Eastern and African countries, on the “fake news” of journalists seeking to hold his administration to account, and, most of all, his attack on the constitutional checks and balances that have been fundamental to US governance for more than 200 years, are the most alarming examples of populist practice, not just in American history but in the history of most of the North Atlantic world.

There are intriguing parallels between Trump’s regime and Erdogan’s. Indeed, Trump went out of his way to congratulate Erdogan on Turkey’s referendum result in April – which gives him the right to lengthen his term of office to ten years, to strengthen his control over the judiciary and to decide when to impose a state of emergency. Even before the referendum, he had dismissed more than 100,000 public servants, including teachers, prosecutors, judges and army officers; 4,000 were imprisoned. The Kurdish minority was – and is – repressed. True, none of this applies to Trump. But the rhetoric of the thin-skinned, paranoid US president and his equally thin-skinned and paranoid Turkish counterpart comes from the same repertoire. In the Turkish referendum Erdogan declared: “My nation stood upright and undivided.” It might have been Trump clamorously insisting that the crowd at his inauguration was bigger than it was.

***

The best-known modern British populists – Margaret Thatcher, Nigel Farage and David Owen – form a kind of counterpoint. In some ways, all three have harked back to the themes of the 19th-century American populists. Thatcher insisted that she was “a plain, straightforward provincial”, adding that her “Bloomsbury” was Grantham – “Methodism, the grocer’s shop, Rotary and all the serious, sober virtues, cultivated and esteemed in that environment”. Farage declared that the EU referendum was “a victory for ‘the real people’ of Britain” – implying, none too subtly, that the 48 per cent who voted Remain were somehow unreal or, indeed, un-British.

On a holiday job on a building site during the Suez War, Owen experienced a kind of epiphany. Hugh Gaitskell was criticising Anthony Eden, the prime minister, on television and in the House of Commons, but Owen’s workmates were solidly in favour of Eden. That experience, he said, made him suspicious of “the kind of attitude which splits the difference on everything. The rather defeatist, even traitorous attitude reflected in the pre-war Apostles at Cambridge.” (Owen voted for Brexit in 2016.)

Did he really believe that Bertrand Russell, John Maynard Keynes and G E Moore were traitorous? Did he not know that they were Apostles? Or was he simply lashing out, Trump-like, at an elite that disdained him – and to which he yearned to belong?

Thatcher’s Grantham, Farage’s real people and David Owen’s workmates came from the same rhetorical stable as the American populists’ Omaha Platform. But the American populists really were plain, in their sense of the word, whereas Thatcher, Farage and Owen could hardly have been less so. Thatcher (at that stage Roberts) left Grantham as soon as she could and never looked back. She went to Somerville College, Oxford, where she was a pupil of the Nobel laureate Dorothy Hodgkin. She married the dashing and wealthy Denis Thatcher and abandoned science to qualify as a barrister before being elected to parliament and eventually becoming prime minister. Farage worked as a metals trader in the City before becoming leader of the UK Independence Party. Owen went to the private Bradfield College before going up to Cambridge to read medicine. Despite his Welsh antecedents, he looks and sounds like a well-brought-up English public school boy. He was elected to parliament in 1966 at the age of 28 and was appointed under-secretary for the navy at 30. He then served briefly as foreign secretary in James Callaghan’s miserable Labour government in the 1970s.

Much the same is true of Marine Le Pen in France. She is a hereditary populist – something that seems self-contradictory. The Front National (FN) she heads was founded by her father, Jean-Marie Le Pen – Holocaust denier, anti-Semite, former street brawler and sometime Poujadist. In the jargon of public relations, she has worked hard to “de-toxify” the FN brand. But the Front is still the Front; it appeals most strongly to the ageing and insecure in the de-industrialised areas of the north-east. Marine Le Pen applauded the Leave victory in Britain’s referendum – she seeks to limit immigration, just as Ukip did in the referendum and as the May government does now.

Above all, the Front National appeals to a mythologised past, symbolised by the figure of Joan of Arc. Joan was a simple, illiterate peasant from an obscure village in north-eastern France, who led the French king’s forces to a decisive victory over the English in the later stages of the Hundred Years War. She was captured by England’s Burgundian allies, and the English burned her at the stake at the age of 19. She was beatified in 1909 and canonised in 1920. For well over a century, she has been a heroine for the Catholic French right, for whom the revolutionary triad of liberté, égalité, fraternité is either vacuous or menacing.

***

The past to which the FN appeals is uniquely French. It is also contentious. A struggle over the ownership of the French past has been a theme of French politics ever since the French Revolution. But other mythologised pasts have figured again and again in populist rhetoric and still do. Mussolini talked of returning to the time of the Roman empire when the Mediterranean was Mare Nostrum. Trump’s “Make America great again” presupposes a past when America was great, and from which present-day Americans have strayed, thanks to Clintonesque crooks and the pedlars of fake news. “Take back control” – the mantra of the Brexiteers in the referendum – presupposes a past in which the British had control; Owen’s bizarre pre-referendum claim that, if Britain left the EU, she would be free to “rediscover the skills of blue water diplomacy” presupposed a time when she practised those skills. Vladimir Putin, another populist of sorts, is patently trying to harness memories of tsarist glory to his chariot wheels. Margaret Thatcher, the “plain, straightforward provincial” woman, sought to revive the “vigorous virtues” of her Grantham childhood and the “Victorian values” that underpinned them.

As well as mythologising the past, populists mythologise the people. Those for whom they claim to speak are undifferentiated, homogeneous and inert. Populists have nothing but contempt for de Tocqueville’s insight that the ever-present threat of majority tyranny can be kept at bay only by a rich array of intermediate institutions, including townships, law courts and a free press, underpinned by the separation of powers.

For populists, the threat of majority tyranny is a phantom, invented by out-of-touch and craven elitists. Law courts that stand in the way of the unmediated popular will are “enemies of the people”, as the Daily Mail put it. There is no need to protect minorities against the tyranny of the majority: minorities are either part of the whole, in which case they don’t need protection, or self-excluded from it, in which case they don’t deserve to be protected.

Apparent differences of interest or value that cut across the body of the people, that divide the collective sovereign against itself, are products of elite manipulation or, in Thatcher’s notorious phrase, of “the enemy within”. For there is a strong paranoid streak in the populist mentality. Against the pure, virtuous people stand corrupt, privileged elites and sinister, conspiratorial subversives. The latter are forever plotting to do down the former.

Like pigs searching for truffles, populists search for subversives. Inevitably, they find what they are looking for. Joe McCarthy was one of the most squalid examples of the populist breed: for years, McCarthyism was a baneful presence in Hollywood, American universities, newspaper offices and the public service, ruining lives, restricting free expression and making it harder for the United States to win the trust of its European allies. The barrage of hatred and contempt that the tabloid press unleashed on opponents of Theresa May’s pursuit of a “hard” Brexit is another example. Her astounding claim that a mysterious entity known as “Brussels” was seeking to interfere in the British general election is a third.

As the Princeton political scientist Jan-Werner Müller argues, all of this strikes at the heart of democratic governance. Democracy depends on open debate, on dialogue between the bearers of different values, in which the protagonists learn from each other and from which they emerge as different people. For the Nobel laureate, philosopher and economist Amartya Sen, democracy is, above all, “public reasoning”; and that is impossible without social spaces in which reasoning can take place. Populism is singular; democracy is plural. The great question for non-populists is how to respond to the populist threat.

Two answers are in contention. The first is Theresa May’s. It amounts to appeasement. May’s purported reason for calling a snap general election was that the politicians were divided, whereas the people were united. It is hard to think of a better – or more frightening – summary of the spirit of populism. The second answer is Emmanuel Macron’s. For the moment, at least, he is astonishingly popular in France. More important, his victory over Le Pen has shown that, given intelligence, courage and generosity of spirit, the noxious populist tide can be resisted and, perhaps, turned back. 

David Marquand’s most recent book is “Mammon’s Kingdom: an Essay on Britain Now” (Allen Lane)