A still from Dishonored.

Why are we still so bad at talking about video games?

In the past 30 years, video games have become more beautiful, more intricate and more intense – but we still lack a critical language to evaluate them. Will we ever move beyond previews and reviews?

I can’t remember the first computer game I played. It might have been Killer Gorilla, which was written by a British 17-year-old called Adrian Stephens who had seen screenshots of Donkey Kong in a magazine and decided to make his own version in his bedroom.

Killer Gorilla was published in 1983, the year I was born, so it must have been hanging round in my brother’s collection for several years before I played it. In those days, games came on a cassette tape, which whined with static if you put it in a music player. The machine we had was an Acorn Electron – another knock-off, this time of the more expensive BBC Micro.

Looking at pictures of Killer Gorilla now, it’s hard to believe it kept me occupied for so long, furiously tapping away at the keyboard – Z for left, X for right, and “return” to jump. There was no story (save the jealous love of a primate for a princess), the graphics were basic and the sound consisted mostly of a sad “bingy bongy boo” whenever you died, which was often.

Compare that with the big-name releases in the run-up to Christmas 2012: the so-called triple-A titles that dominate games magazines and newspaper reviews. In the past few weeks, I’ve played three of the best: Bethesda’s steampunk stealth adventure Dishonored, Gearbox Software’s sarcastic space western Borderlands 2 and 343 Industries’ straight-faced military romp Halo 4. Each will have cost more than £15m to make, and several million more to market, and will have involved hundreds of people (Halo 4 had 300 in the game development team alone).

These games are gorgeous, delivering both sweeping vistas and fine-grained details, and Dishonored, in particular, has a voice-acting cast to rival a Hollywood film: Susan Sarandon, Chloë Grace Moretz and Mad Men’s John Slattery. They are all critically acclaimed, with each scoring around 90 per cent on the review aggregator site Metacritic.

And yet, I can’t help feeling that something is missing. Technically, video games have matured hugely since I was mashing the Electron’s keyboard in the 1980s, but I don’t have the conversations about them that I have about books or film or music. Having missed Channel 4’s GamesMaster, which ran from 1992 to 1998, I can think of only one recent television programme devoted to them: Charlie Brooker’s one-off Gameswipe. Most newspapers carry a single short review a week, if that, and games are rarely mentioned on bastions of arts programming such as Radio 4 or BBC2. Discussion of games focuses heavily on whether a particular title is worth buying.

Now, you might not find that surprising – because you think games are a niche pursuit or that they’re new. But you’d be wrong on both counts. In the US, 245.6 million video games were sold in 2011, according to the Entertainment Software Association. Microsoft says users have spent 3.3 billion hours playing its Halo series online. Read that again: 3.3 billion hours. As for being newfangled, how about this: a ten-year-old who played Pong when Atari first released it will have celebrated her 50th birthday this year.

Does this matter? It does if you think the unexamined hobby is not worth having. And it does if you wonder, like me, whether the lack of a serious cultural conversation about games is holding back innovation. Because games grew out of programming culture, for many years their development was judged purely in terms of technical capability – what they could do. But while, say, improved graphical rendering means that modern titles look astonishing, I find myself thinking: is it really such an achievement for a sunset to look 96 per cent as good as a real one?

In 2004, Kieron Gillen wrote a much-referenced essay called “The New Games Journalism”, in which he eviscerated most of his contemporaries for being unimaginative drones, who churned out previews and reviews, and stopped writing about a game at the exact moment their readers started playing it.

He rejected the idea that “the worth of a video game lies in the video game, and by examining it like a twitching insect fixed on a slide, we can understand it” and instead urged writers to become “travel journalists to imaginary places”. The New Games Journalism would be interesting even to people who would never visit those places.

Gillen’s article prompted much soul-searching, and many sub-Tom Wolfe pieces in which people bored on for thousands of words about seeing a pixel and suddenly understanding what love was. But eight years later, the state of games writing is bleaker still. Metacritic, which I mentioned earlier, presents an obvious problem. The industry places enormous weight on the scores it aggregates; as Keza MacDonald of the gaming website IGN noted, “eager, inexperienced writers from smaller sites have been known to give very high scores knowing that their review will appear near the top of the listings and refer traffic”.

“As games have developed and there are more interesting things to talk about, like their narratives, their artistic statements, occasionally even their cultural significance, reviews are still often expected to be an overview of a game’s features with a numerical value on the end,” MacDonald tells me. “This is as much the audience’s problem as the outlets’. Readers expect scores and they expect ‘objective’ analyses of games, even as the games themselves have got to a point where that’s not possible any more.”

Gillen is surprisingly relaxed about the direction criticism has taken since his manifesto (and he has now “retired” from games journalism to write comics). “I’ve learned to be philosophical about this one,” he tells me. “The old has always feared and suspected the new. They’ll reject the new for failing to match the old on the old’s terms, failing to realise that its achievements are entirely separate . . . Fundamentally: eventually old people die.”

Elsewhere, however, others are continuing the fight he started. Naomi Alderman is a novelist, a games critic and a games writer, and she concurs that we need to find a way to write about games for people who don’t play them. “You need the vocabulary of an art critic to talk about the graphics, of a novel critic to talk about the storytelling, of a film critic to talk about the performances: not to mention music criticism, and gameplay criticism,” she says. “We need to find a way to talk about what’s interesting about a game – what makes the gameplay so enjoyable, what’s great about the aesthetics, how good the narrative is, and where it fits among other similar games.”

Playing Halo 4, Borderlands 2 and Dishonored side by side made me think of all the common features of first-person shooters: the tropes born of necessity, like slowly opening gates to disguise loading times, or travels by boat or aeroplane to keep you still while expository dialogue is delivered. But there is so little criticism that examines games of the same genre together: in fact, the only sustained critique of the “narrator” character common to many shooters – because you need someone to tell you where to go and what to do – comes from 2007’s BioShock, where that control itself becomes an integral part of the story.

Perhaps that revolution in games criticism will never happen. Ed Stern, who was a writer on the 2011 shooter Brink, says: “It’s currently easy for the book-literate to find everything fascinating about games other than the games themselves. Culturally, sociologically, technologically, in terms of gender and race and sexual and generational politics, they’re a fascinating prism. They just tend not to mean very much in themselves – because it’s spectacularly, trudgingly hard to make games mean things, not least because the big ones are made by so many different pairs of hands.” For the sake of readers – and writers – I hope he’s wrong.

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it came to be treated as a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit: monbiot.com/music/

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood