A still from Dishonored.

Why are we still so bad at talking about video games?

In the past 30 years, video games have become more beautiful, more intricate and more intense – but we still lack a critical language to evaluate them. Will we ever move beyond previews and reviews?

I can’t remember the first computer game I played. It might have been Killer Gorilla, which was written by a British 17-year-old called Adrian Stephens who had seen screenshots of Donkey Kong in a magazine and decided to make his own version in his bedroom.

Killer Gorilla was published in 1983, the year I was born, so it must have been hanging round in my brother’s collection for several years before I played it. In those days, games came on a cassette tape, which whined with static if you put it in a music player. The machine we had was an Acorn Electron – another knock-off, this time of the more expensive BBC Micro.

Looking at pictures of Killer Gorilla now, it’s hard to believe it kept me occupied for so long, furiously tapping away at the keyboard – Z for left, X for right, and “return” to jump. There was no story (save the jealous love of a primate for a princess), the graphics were basic and the sound consisted mostly of a sad “bingy bongy boo” whenever you died, which was often.

Compare that with the big-name releases in the run-up to Christmas 2012; the so-called triple-A titles that dominate games magazines and newspaper reviews. In the past few weeks, I’ve played three of the best: Bethesda’s steampunk stealth adventure Dishonored, Gearbox Software’s sarcastic space western Borderlands 2 and 343 Industries’ straight-faced military romp Halo 4. Each will have cost more than £15m to make, plus several million more to market, and will have involved hundreds of people (Halo 4 had 300 in the game development team alone).

These games are gorgeous, delivering both sweeping vistas and fine-grained details, and Dishonored, in particular, has a voice-acting cast to rival a Hollywood film: Susan Sarandon, Chloë Grace Moretz and Mad Men’s John Slattery. They are all critically acclaimed, with each scoring around 90 per cent on the review aggregator site Metacritic.

And yet, I can’t help feeling that something is missing. Technically, video games have matured hugely since I was mashing the Electron’s keyboard in the 1980s, but I don’t have the conversations about them that I have about books or film or music. Having missed out on Channel 4’s GamesMaster from 1992 to 1998, I can think of only one recent television programme I’ve seen devoted to them: Charlie Brooker’s one-off Gameswipe. Most newspapers have a single short review a week, if that, and games are rarely mentioned on bastions of arts programming such as Radio 4 or BBC2. Discussion of games focuses heavily on whether a particular title is worth buying.

Now, you might not find that surprising – because you think games are a niche pursuit or that they’re new. But you’d be wrong on both counts. In the US, 245.6 million video games were sold in 2011, according to the Entertainment Software Association. Microsoft says users have spent 3.3 billion hours playing its Halo series online. Read that again: 3.3 billion hours. As for being newfangled, how about this: a ten-year-old who played Pong when Atari first released it will have celebrated her 50th birthday this year.

Does this matter? It does if you think the unexamined hobby is not worth having. And it does if you wonder, like me, whether the lack of a serious cultural conversation about games is holding back innovation. The background of games in programming culture meant that for many years their development was seen purely in terms of what they could do. But while, say, improved graphical rendering means that modern titles look astonishing, I find myself thinking: is it really such an achievement for a sunset to look 96 per cent as good as a real one?

In 2004, Kieron Gillen wrote a much-referenced essay called “The New Games Journalism”, in which he eviscerated most of his contemporaries for being unimaginative drones, who churned out previews and reviews, and stopped writing about a game at the exact moment their readers started playing it.

He rejected the idea that “the worth of a video game lies in the video game, and by examining it like a twitching insect fixed on a slide, we can understand it” and instead urged writers to become “travel journalists to imaginary places”. The New Games Journalism would be interesting even to people who would never visit those places.

Gillen’s article prompted much soul-searching, and many sub-Tom Wolfe pieces in which people bored on for thousands of words about seeing a pixel and suddenly understanding what love was. But eight years later, the state of games writing is even more bleak. Metacritic, which I mentioned earlier, presents an obvious problem. The industry places enormous weight on the scores it aggregates; as Keza MacDonald of the gaming website IGN noted, “eager, inexperienced writers from smaller sites have been known to give very high scores knowing that their review will appear near the top of the listings and refer traffic”.

“As games have developed and there are more interesting things to talk about, like their narratives, their artistic statements, occasionally even their cultural significance, reviews are still often expected to be an overview of a game’s features with a numerical value on the end,” MacDonald tells me. “This is as much the audience’s problem as the outlets’. Readers expect scores and they expect ‘objective’ analyses of games, even as the games themselves have got to a point where that’s not possible any more.”

Gillen is surprisingly relaxed about the direction criticism has taken since his manifesto (and he has now “retired” from games journalism to write comics). “I’ve learned to be philosophical about this one,” he tells me. “The old has always feared and suspected the new. They’ll reject the new for failing to match the old on the old’s terms, failing to realise that its achievements are entirely separate . . . Fundamentally: eventually old people die.”

Elsewhere, however, others are continuing the fight he started. Naomi Alderman is a novelist, a games critic and a games writer, and she concurs that we need to find a way to write about games for people who don’t play them. “You need the vocabulary of an art critic to talk about the graphics, of a novel critic to talk about the storytelling, of a film critic to talk about the performances: not to mention music criticism, and gameplay criticism,” she says. “We need to find a way to talk about what’s interesting about a game – what makes the gameplay so enjoyable, what’s great about the aesthetics, how good the narrative is, and where it fits among other similar games.”

Playing Halo 4, Borderlands 2 and Dishonored side by side made me think of all the common features of first-person shooters: the tropes born of necessity, like slowly opening gates to disguise loading times, or journeys by boat or aeroplane to keep you still while expository dialogue is delivered. But there’s so little criticism out there that writes about games belonging to the same genre: in fact, the only sustained critique of the “narrator” character common to many shooters – because you need someone to tell you where to go and what to do – comes from 2007’s BioShock, where that control itself becomes an integral part of the story.

Perhaps that revolution in games criticism will never happen. Ed Stern, who was a writer on the 2011 shooter Brink, says: “It’s currently easy for the book-literate to find everything fascinating about games other than the games themselves. Culturally, sociologically, technologically, in terms of gender and race and sexual and generational politics, they’re a fascinating prism. They just tend not to mean very much in themselves – because it’s spectacularly, trudgingly hard to make games mean things, not least because the big ones are made by so many different pairs of hands.” For the sake of readers – and writers – I hope he’s wrong.

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.


No peace after progress

How the death of the industrial way of life gave us choice – and stoked resentment and fear.

Now that the making of useful and necessary things in Britain is only a shadow of what it once was, we can see more clearly the effects of the Manufacturing Age. The cost was high to the producers of prodigious wealth; a ten-year difference in life expectancy remains between people living in the richest areas and those in Glasgow. The (fleeting, it now seems) visitation of industrialism has made life more comfortable and its dismantling has liberated millions from choiceless occupations. The legacy is one of spectacular improvement, unequally shared.

Perhaps the most dramatic experience of the 20th century was the suddenness with which profligate plenty replaced a skinflint subsistence. Was it the speed of this that distracted us from wondering why, instead of the secure sustenance that generations of needy people had asked of an unyielding economic system, we were offered a promiscuous spillage of goods, promoted with quasi-religious zeal by the converts of a capitalism that had previously delivered to most of its captive workers a life of penury? Such a rapid reversal might have alerted us to changes beneath the surface that elided losses incurred.

The greatest of these was certainly not the extinction of the industrial way of life itself, release from which has been an unqualified blessing. But the transition from relentlessly work-driven lives (in the 1950s, two-thirds of Britain’s workers were still manual labourers) was marked by perfunctory obituaries for the disintegration of industrial communities, with no acknowledgement that, for a century and a half, they had represented the inescapable destiny of the people they sheltered.

Even less recognition was given to the fortitude with which they had borne a long, coercive labour. A way of life, buried without ceremony in the unmarked grave of progress, could not be mourned; and this has generated some social pathologies of our time: resentment over an arbitrary obliteration of industry, disengagement from a party of labour by those it called, like feudal lords, its “own people”, loss of memory of the economic migrants we also were, passing from the goad of industry into the pastures of consumption, and thence into the liberating servitude of technology.

Grief makes no judgement on the intrinsic value of what is lost. Absence of the known and familiar is the object of melancholy in its own right, even if replaced by something immeasurably better. Objectively, there was little to mourn in the vanished industrial way of life: insufficiency and humiliation, malice of overseer and manager, officiousness of poor-law administrator and means-test man. Male industrial workers exhausted in body and spirit, instead of protecting those for whom the power of their hands was the only shelter against destitution, visited similar punishment on their wives and children. There is nothing to be lamented in an end to the penitential life of women, scrubbing not only the red tiles of the kitchen floor, but even an arc of pavement outside the front door; their interception of men on payday before wages were wasted on beer and oblivion; the clenching against joyless invasion of their bodies in the boozy aftermath. But it was the only life they knew, and they adhered to it with grim stoicism and even pride.

There is much to be said for their resistance. The fragile lattice formed by women’s arms was often the only safety net against destitution. Trade unions and friendly and burial societies that shielded folk from economic violence foreshadowed the welfare state and the National Health Service.

The life of labouring people in Britain was strikingly homogeneous, despite diversity of occupation, dialect and local sensibility. There was the same collective experience: terraced house with parlour reserved for celebration or mourning; the three-piece suite, plaster figure on a stand behind the window, chenille curtain against the draught, engraving of The Stag at Bay on the wall; the deal table and Windsor chairs in the living room, the mantelpiece a domestic shrine with clock, candlesticks and pictures of soldiers smiling before they died; the music of cinders falling through the bars in the grate; cheerless bedrooms where husband and wife slept in high connubial state, more bier than bed, where sexual enjoyment was ritually sacrificed as flowers of frost formed on the inside of the window.

And everywhere photographs: wraithlike children with ringlets or in sailor suits, fated never to grow up; weddings in the back garden, a bouquet of lilies and a grandmother in boots and astrakhan hat; the smudged features of a kinsman no one can now identify. Identical memories, too: the shotgun wedding in the dingy finery of a Co-op hall; the funeral tableau around the grave, amid ominous inscriptions of “Sleeping where no shadows fall”; queues outside the ocean-going Savoy or Tivoli to watch Gone With the Wind; the pub where “Vilia” or “The Last Rose of Summer” was hammered out on a discordant piano.

The opening up of such sombre lives might have been expected to call forth cries of gratitude. Instead, a synthetic joy has emanated largely from the same sources that, until recently, offered people grudging survival only, the change of tune outsourced to producers of manufactured delight, purveyors of contrived euphoria to the people – a different order of industrial artefact from the shoes, utensils and textiles of another era.

***

A more authentic popular response exists beneath the official psalmody, a persistent murmur of discontent and powerlessness. Anger and aggression swirl around like dust and waste paper in the streets of our affluent, unequal society. As long-term recipients of the contempt of our betters, we know how to despise the vulnerable – people incapable of work, the poor, the timid and the fearful, those addicted to drugs and alcohol. Sullen resentment tarnishes the wealth of the world, a conviction that somebody else is getting the advantages that ought to be “ours” by right and by merit.

Rancour appears among those “left behind” in neighbourhoods besieged by unknown tongues and foreign accents: people who never voted for unchosen change, as all political options are locked up in a consensus of elites. “Give us back our country!” they cry; even though that country is not in the custody of those from whom they would reclaim it. There was no space for the working class to grieve over its own dissolution. If, as E P Thompson said, that class was present at its own making, it was certainly not complicit in its own undoing.

Grief denied in individuals leads to damaging psychological disorders. There is no reason to believe that this differs for those bereaved of a known way of living. The working class has been colonised, as was the peasantry in the early industrial era. When the values, beliefs and myths of indigenous peoples are laid waste, these lose meaning, and people go to grieve in city slums and die from alcohol, drugs and other forms of self-inflicted violence. Though the dominant culture’s erasure of the manufacturing way of life in Britain was less intense than the colonial ruin of ancient societies, this subculture was equally unceremoniously broken. It is a question of degree. The ravages of drugs and alcohol and self-harm in silent former pit villages and derelict factory towns show convergence with other ruined cultures elsewhere in the world.

Depression is a symptom of repressed grief: here is the connection between unfinished mourning and popular resentment at having been cheated out of our fair share, our due, our place in the world. If we are unable to discern our own possible fate in suffering people now, this is perhaps a result of estrangement from unresolved wrongs in our own past. Nothing was ever explained. Globalisation occurred under a kind of social laissez-faire: no political education made the world more comprehensible to the disaffected and disregarded, people of small account to those who take decisions on their behalf and in their name.

Anyone who protested against our passage into this changed world was criminalised, called “wrecker” and “extremist”. The miners’ strike of 1984 was the symbol of this: their doomed fight to preserve a dignity achieved in pain and violence was presented by the merchants of deliverance not only as retrograde, but also as an act of outlawry. Resistance to compulsory change was derided as a response of nostalgics protecting the indefensible, when the whole world was on the brink of a new life. Early in her tenure of Downing Street, Margaret Thatcher, that sibyl and prophet who knew about these things, warned that Britain would become “a less cosy, more abrasive” place: a vision confirmed by the Battle of Orgreave – redolent of civil war – and the anguish of Hillsborough.

It is too late to grieve now. Scar tissue has healed over the untreated wound. Though no one expects the ruling classes to understand the distress of perpetual “modernisation”, the leaders of labour might have been able to recognise capitalism’s realm of freedom and a gaudy consumerism that concealed hardening competitiveness and the growth of a crueller, more bitter society.

The ills of this best of all worlds, its excessive wealth and extreme inequality, are on show in hushed thoroughfares of London, shuttered sites of “inward investment”, where the only sound is the faint melody of assets appreciating; while elsewhere, people wait for charitable tins of denutrified substances to feed their family, or sit under a grubby duvet, a Styrofoam cup beseeching the pence of passers-by.

Unresolved feelings about industrialism, enforced with great harshness and abolished with equal contempt for those who served it, are certainly related to the stylish savagery of contemporary life. The alibi that present-day evils are an expression of “human nature” is a poor apology for what is clearly the nature – restless and opportunistic – of a social and economic system that has, so far at least, outwitted its opponents at every turn.

Jeremy Seabrook’s book “The Song of the Shirt” (C Hurst & Co) won the Bread and Roses Award for Radical Publishing 2016

This article first appeared in the 23 June 2016 issue of the New Statesman, Divided Britain