A still from Dishonored.

Why are we still so bad at talking about video games?

In the past 30 years, video games have become more beautiful, more intricate and more intense – but we still lack a critical language to evaluate them. Will we ever move beyond previews and reviews?

I can’t remember the first computer game I played. It might have been Killer Gorilla, which was written by a British 17-year-old called Adrian Stephens who had seen screenshots of Donkey Kong in a magazine and decided to make his own version in his bedroom.

Killer Gorilla was published in 1983, the year I was born, so it must have been hanging round in my brother’s collection for several years before I played it. In those days, games came on a cassette tape, which whined with static if you put it in a music player. The machine we had was an Acorn Electron – another knock-off, this time of the more expensive BBC Micro.

Looking at pictures of Killer Gorilla now, it’s hard to believe it kept me occupied for so long, furiously tapping away at the keyboard – Z for left, X for right, and “return” to jump. There was no story (save the jealous love of a primate for a princess), the graphics were basic and the sound consisted mostly of a sad “bingy bongy boo” whenever you died, which was often.

Compare that with the big-name releases in the run-up to Christmas 2012: the so-called triple-A titles that dominate games magazines and newspaper reviews. In the past few weeks, I’ve played three of the best: Bethesda’s steampunk stealth adventure Dishonored, Gearbox Software’s sarcastic space western Borderlands 2 and 343 Industries’ straight-faced military romp Halo 4. Each will have cost more than £15m to make, and several million more to market, and will have involved hundreds of people (Halo 4 had 300 in the game development team alone).

These games are gorgeous, delivering both sweeping vistas and fine-grained details, and Dishonored, in particular, has a voice-acting cast to rival a Hollywood film: Susan Sarandon, Chloë Grace Moretz and Mad Men’s John Slattery. They are all critically acclaimed, with each scoring around 90 per cent on the review aggregator site Metacritic.

And yet, I can’t help feeling that something is missing. Technically, video games have matured hugely since I was mashing the Electron’s keyboard in the 1980s, but I don’t have the conversations about them that I have about books or film or music. Having missed out on Channel 4’s GamesMaster from 1992 to 1998, I can think of only one recent television programme I’ve seen devoted to them: Charlie Brooker’s one-off Gameswipe. Most newspapers have a single short review a week, if that, and games are rarely mentioned on bastions of arts programming such as Radio 4 or BBC2. Discussion of games focuses heavily on whether a particular title is worth buying.

Now, you might not find that surprising – because you think games are a niche pursuit or that they’re new. But you’d be wrong on both counts. In the US, 245.6 million video games were sold in 2011, according to the Entertainment Software Association. Microsoft says users have spent 3.3 billion hours playing its Halo series online. Read that again: 3.3 billion hours. As for being newfangled, how about this: a ten-year-old who played Pong when Atari first released it will have celebrated her 50th birthday this year.

Does this matter? It does if you think the unexamined hobby is not worth having. And it does if you wonder, like me, whether the lack of a serious cultural conversation about games is holding back innovation. The background of games in programming culture meant that for many years their development was seen purely in terms of what they could do. But while, say, improved graphical rendering means that modern titles look astonishing, I find myself thinking: is it really such an achievement for a sunset to look 96 per cent as good as a real one?

In 2004, Kieron Gillen wrote a much-referenced essay called “The New Games Journalism”, in which he eviscerated most of his contemporaries for being unimaginative drones, who churned out previews and reviews, and stopped writing about a game at the exact moment their readers started playing it.

He rejected the idea that “the worth of a video game lies in the video game, and by examining it like a twitching insect fixed on a slide, we can understand it” and instead urged writers to become “travel journalists to imaginary places”. The New Games Journalism would be interesting even to people who would never visit those places.

Gillen’s article prompted much soul-searching, and many sub-Tom Wolfe pieces in which people bored on for thousands of words about seeing a pixel and suddenly understanding what love was. But eight years later, the state of games writing is even more bleak. Metacritic, which I mentioned earlier, presents an obvious problem. The industry places enormous weight on the scores it aggregates; as Keza MacDonald of the gaming website IGN noted, “eager, inexperienced writers from smaller sites have been known to give very high scores knowing that their review will appear near the top of the listings and refer traffic”.

“As games have developed and there are more interesting things to talk about, like their narratives, their artistic statements, occasionally even their cultural significance, reviews are still often expected to be an overview of a game’s features with a numerical value on the end,” MacDonald tells me. “This is as much the audience’s problem as the outlets’. Readers expect scores and they expect ‘objective’ analyses of games, even as the games themselves have got to a point where that’s not possible any more.”

Gillen is surprisingly relaxed about the direction criticism has taken since his manifesto (and he has now “retired” from games journalism to write comics). “I’ve learned to be philosophical about this one,” he tells me. “The old has always feared and suspected the new. They’ll reject the new for failing to match the old on the old’s terms, failing to realise that its achievements are entirely separate . . . Fundamentally: eventually old people die.”

Elsewhere, however, others are continuing the fight he started. Naomi Alderman is a novelist, a games critic and a games writer, and she concurs that we need to find a way to write about games for people who don’t play them. “You need the vocabulary of an art critic to talk about the graphics, of a novel critic to talk about the storytelling, of a film critic to talk about the performances: not to mention music criticism, and gameplay criticism,” she says. “We need to find a way to talk about what’s interesting about a game – what makes the gameplay so enjoyable, what’s great about the aesthetics, how good the narrative is, and where it fits among other similar games.”

Playing Halo 4, Borderlands 2 and Dishonored side by side made me think of all the common features of first-person shooters: the tropes born of necessity, like slowly opening gates to disguise loading times, or travels by boat or aeroplane to keep you still while expository dialogue is delivered. But there’s so little criticism out there that writes about games belonging to the same genre: in fact, the only sustained critique of the “narrator” character common to many shooters – because you need someone to tell you where to go and what to do – comes from 2007’s BioShock, where that control itself becomes an integral part of the story.

Perhaps that revolution in games criticism will never happen. Ed Stern, who was a writer on the 2011 shooter Brink, says: “It’s currently easy for the book-literate to find everything fascinating about games other than the games themselves. Culturally, sociologically, technologically, in terms of gender and race and sexual and generational politics, they’re a fascinating prism. They just tend not to mean very much in themselves – because it’s spectacularly, trudgingly hard to make games mean things, not least because the big ones are made by so many different pairs of hands.” For the sake of readers – and writers – I hope he’s wrong.

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.


The Bloody Mary is dead: all hail the Bloody Caesar

This Canadian version of an old standard is a good substitute for dinner.

It is not anti-Catholic bias that makes me dislike the Bloody Mary, that lumpish combination of tomato juice and vodka named after a 16th-century English queen who, despite the immense reach of her royal powers, found burning Protestants alive the most effective display of majesty.

My prejudice is against its contents: the pulverised tomatoes that look like run-off from a Tudor torture chamber. A whole tomato is a source of joy and, occasionally, wonder (I remember learning that the Farsi for tomato is gojeh farangi, which translates literally as “foreign plum”) – and I am as fond of pizza as anyone. Most accessories to the Bloody Mary are fine with me: Worcestershire sauce, Tabasco, celery, black pepper, even sherry or oysters. But generally I share the curmudgeon Bernard DeVoto’s mistrust of fruit juice in my spirits: “all pestilential, all gangrenous, all vile” was the great man’s verdict. His main objection was sweetness but I will include the admittedly savoury tomato in my ban. At the cocktail hour, I have been known to crave all kinds of odd concoctions but none has included pulp.

To many, the whole point of a Bloody Mary is that you don’t wait until the cocktail hour. This seems to entail a certain shying away from unpleasant realities. I know perfectly well the reaction I would get if I were to ask for a grilled tomato and a chilled Martini at brunch: my friends would start likening me to F Scott Fitzgerald and they wouldn’t be referring to my writing talent. Despite its remarkably similar contents, a Bloody Mary is a perfectly acceptable midday, middle-class beverage. If the original Mary were here to witness such hypocrisy, she would surely tut and reach for her firelighters.

Yet, like the good Catholic I certainly am not, I must confess, for I have seen the error of my ways. In July, on Vancouver Island, I tried a Bloody Caesar – Canada’s spirited response to England’s favourite breakfast tipple (“I’ll see your Tudor queen, you bunch of retrograde royalists, and raise you a Roman emperor”). The main difference is a weird yet oddly palatable concoction called Clamato: tomato juice thinned and refined by clam juice. Replace your standard slop with this stuff, which has all the tang of tomato yet flows like a veritable Niagara, and you will have a drink far stranger yet more delicious than the traditional version.

Apparently, the Caesar was invented by an Italian restaurateur in Calgary, Alberta, who wanted a liquid version of his favourite dish from the old country: spaghetti alle vongole in rosso (clam and tomato spaghetti). He got it – and, more importantly, the rest of us got something we can drink not at breakfast but instead of dinner. Find a really interesting garnish – pickled bull kelp or spicy pickled celery, say – and you can even claim to have eaten your greens.

I’m sure that dedicated fans of the Bloody Mary will consider this entire column heretical, which seems appropriate: that’s the side I was born on, being Jewish, and I like to hope I wouldn’t switch even under extreme forms of persuasion. But this cocktail is in any case a broad church: few cocktails come in so many different incarnations.

The original was invented, by his own account, by Fernand Petiot, a French barman in New York during Prohibition (and so a man who must have known a thing or two about hypocrisy). It includes lemon juice and a “layer” of Worcestershire sauce, and the tomato juice is strained; it may also have been named after a barmaid rather than the queen.

All of which proves only that dogma has no place at the bar. Variety is the spice of life, which makes it ironic that the world’s spiciest cocktail bestows a frivolous immortality on a woman who believed all choice to be the work of the devil.

Next week: John Burnside on nature

Nina Caplan is the 2014 Fortnum & Mason Drink Writer of the Year and 2014 Louis Roederer International Wine Columnist of the Year for her columns on drink in the New Statesman. She tweets as @NinaCaplan.

This article first appeared in the 08 October 2015 issue of the New Statesman, Putin vs Isis


What Jeremy Corbyn can learn from Orwell

Corbyn’s ideas may echo George Orwell’s – but they’d need Orwell’s Britain to work. It’s time Corbyn accepted the British as they are today.

All Labour Party leaderships since 1900 have offered themselves as “new”, but Tony Blair’s succession in 1994 triggered a break with the past so ruthless that the Labour leadership virtually declared war on the party. Now it is party members’ turn and they, for now at any rate, think that real Labour is Jeremy.

To Keir Hardie, real Labour had been a trade union lobby expounding Fellowship. To the Webbs, real Labour was “common ownership” by the best means available. Sidney’s Clause Four (adopted 1918) left open what that might be. In the 1920s, the Christian Socialist R H Tawney stitched Equality into the banner, but during the Depression young intellectuals such as Evan Durbin and Hugh Gaitskell designated Planning as Labour’s modern mission. After the Second World War, Clement Attlee followed the miners (and the London Passenger Transport Board) into Nationalisation. Harold Wilson tried to inject Science and Technology into the mix but everything after that was an attempt to move Labour away from state-regulated markets and in the direction of market-regulated states.

What made the recent leadership contest so alarming was how broken the intellectual tradition had become. None of the candidates made anything of a long history of thinking about the relationship between socialism and what the people want. Yvette Cooper wanted to go over the numbers; only they were the wrong numbers. Andy Burnham twisted and turned. Liz Kendall based her bid on two words: “Have me.” Only Jeremy Corbyn seemed to have any kind of Labour narrative to tell and, of course, ever the rebel, he was not responsible for any of it. His conference address in Brighton was little more than the notes of a street-corner campaigner to a small crowd.

Given the paucity of thinking, and this being an English party for now, it is only a matter of time before George Orwell is brought in to see how Jeremy measures up. In fact, it’s happened already. Rafael Behr in the Guardian and Nick Cohen in the Spectator both see him as the kind of hard-left intellectual Orwell dreaded, while Charles Cooke in the National Review and Jason Cowley in the New Statesman joined unlikely fashion forces to take a side-look at Jeremy’s dreadful dress sense – to Orwell, a sure sign of a socialist. Cooke thought he looked like a “burned-out geography teacher at a third-rate comprehensive”. Cowley thought he looked like a red-brick university sociology lecturer circa 1978. Fair enough. He does. But there is more. Being a middle-class teetotal vegetarian bicycling socialistic feministic atheistic metropolitan anti-racist republican nice guy, with allotment and “squashily pacifist” leanings to match, clearly puts him in the land of the cranks as described by Orwell in The Road to Wigan Pier (1937) – one of “that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers who come flocking towards the smell of ‘progress’ like bluebottles to a dead cat”. And though Corbyn, as “a fully fledged, fully bearded, unabashed socialist” (Huffington Post), might make all true Orwellians twitch, he really made their day when he refused to sing the National Anthem. Orwell cited precisely that (see “The Lion and the Unicorn”, 1941) as an example of the distance between left-wing intellectuals and the people. It seemed that, by standing there, mouth shut, Comrade Corbyn didn’t just cut his wrists, he lay down full length in the coffin and pulled the lid shut.

Trouble is, this line of attack not only misrepresents the Labour leader, it misrepresents Orwell. For the great man was not as unflinchingly straight and true as some people think. It is impossible, for instance, to think of Orwell singing “God Save the King”, because he, too, was one of that “dreary tribe” of London lefties, and even when he joined Labour he remained ever the rebel. As for Corbyn, for a start, he is not badly dressed. He just doesn’t look like Chuka or Tristram. He may look like a threadbare schoolteacher, but Orwell was one twice over. Orwell was never a vegetarian or a teetotaller, but, like Corbyn, neither was he interested in fancy food (or drink), he kept an allotment, drove a motorbike, bicycled, cared about the poor, cared about the environment, loathed the empire, came close to pacifism at one point, and opposed war with Germany well past the time when it was reasonable to do so.

In Orwell’s thinking about socialism, for too long his main reference point was the London Marxist left. Not only did he make speeches in favour of revolutions, he took part in one with a gun in his hand. Orwell was far more interested, as Corbyn has been far more interested, in speaking truth to power than in holding office. His loyalty was to the movement, or at least the idea of the movement, not to MPs or the front bench, which he rarely mentioned. There is nothing in Corbyn’s position that would have shocked Orwell and, had they met, there’d have been much to talk about: belief in public ownership and non-economic values, confidence in the state’s ability to make life better, progressive taxation, national health, state education, social care, hostility to “socially useless” banking, anti-colonialism and a whole lot of other anti-isms besides. It’s hard to be sure what Orwell’s position would have been on Trident and immigration. Not Corbyn’s, I suspect. He was not as alert to feminism as he might have been but equally, few men try to write novels from a woman’s point of view and all Orwellians recognise that Julia is the dark hero of Nineteen Eighty-Four. In truth, they are both austere types, not in it for themselves and not on anyone else’s expense account either. Corbyn won the leadership because this shone through from the very beginning. He came across as unaffected and straightforward – much as Orwell tried to be in his writing.

Except, as powerfully expressed in these pages by John Gray, Corbyn’s politics were made for another world. What sort of world would he need? First off, he’d need a regulated labour market: regulated by the state in partnership with a labour movement sensitive to what people wanted and experienced in trying to provide it. He would also need capital controls, a manufacturing base capable of building the new investment with Keynesian payback, an efficient and motivated Inland Revenue, a widespread public-service ethos that sees the country as an asset, not a market, and an overwhelming democratic mandate to get things done. In other words, Corbyn needs Orwell’s Britain – not this one – and at the very least, if he can’t have that, he needs the freedom to act that the European Commission forbids.

There’s another problem. Orwell did not trust left-wing intellectuals and spent half his life trying to work out their motivations as a class who spoke for the people, went in search of the people, and praised the people, but did not know them or believe in them. True, Corbyn says he wants to be open and inclusive, but we know he can’t possibly mean it when he says it will be the party, not him or the PLP, that will decide policy, just as we knew it couldn’t possibly be true when he said he’d turn PMQs into the People’s Question Time. Jeremy hasn’t changed his mind in forty years, appears to have great difficulty (unlike Tony Benn) in fusing socialism to national identity or experience (Hardie, Ben Okri and Maya Angelou were bolted on to his Brighton speech) and seems to think that not being happy with what you are given somehow captures the historic essence of socialism (rather than its opposite).

Granted, not thinking outside the circle is an inherent fault of the sectarian left but some of our most prominent left-wing journalists have it, too. Working-class support for nationalisation? Good. Right answer! Working-class opposition to benefit scroungers and further mass immigration? Bad. Wrong answer! Would you like to try again? In his essay “In Defence of Comrade Zilliacus” (1947) Orwell reckoned that left-wing intellectuals saw only what they wanted to see. For all their talk of representing the people, they hated the masses. “What they are frightened of is the prevailing opinion within their own group . . . there is always an orthodoxy, a parrot-cry . . .”

The game is hard and he may go down in a welter of knives, yet Corbyn still has time. He may go on making the same speech – on the benefits of apple pie to apple growers – but at some point he will have to drop the wish-list and get on the side of the British people as they are, and live with that, and build into it. Only the nation state can even begin to do the things he wants to do. The quicker he gets that, the quicker we can see if the latest incarnation of new Labour has a future.

Robert Colls is the author of “George Orwell: English Rebel” (Oxford University Press)

This article first appeared in the 08 October 2015 issue of the New Statesman, Putin vs Isis