Video games are making us too comfortable with the modern surveillance state

Games like Grand Theft Auto V pacify our worst anxieties about the evils of our culture, by turning those evils on their head and finding ways to repaint reality so that our impulses to pry can be seen as good.

It's hard to remember the first time I noticed a camera filming me in public. There was no genesis point, no camera zero that commenced the age of being conscious of having an unseen audience. They just appeared and quietly multiplied, tolerable when used in ATMs and intersections, slightly unnerving when placed overhead in offices and casinos. Today, we have implicitly accepted the fact that everything has some capacity to monitor us, from hackable baby monitors and laptop webcams to the stores of personal data tracked by Google, Facebook, and the National Security Agency.

While it's easy to identify the growth of state and corporate surveillance, it's far harder to see how to reverse it. The NSA, despite clearly overstepping its authority in recent years, still has the courts on its side, and then there are the massive data-harvesting operations run by banks, search engines, social networks, shopping sites and local governments. We can fight particular aspects, but surveillance has become so pervasive it's hard to see how any progress could be made against the whole. Most Americans are resigned to living under surveillance of one kind or another.

But today's mainstream video games offer a different narrative, one that's both more comforting and exciting. (Resignation, after all, doesn't make for a very exciting game.) From The Sims 3 to Grand Theft Auto V, which was released last week, mainstream games have turned voyeuristic surveillance into an active part of play, a mechanism through which players infer plot details, distinguish good guys from bad guys, and make assumptions about the emotional lives of the characters they're controlling. These games create a fictional pretext that allows the player, who would otherwise be subject to these incursions in the real world, to act them out against others in a virtual world — a sort of salve for, or at least temporary respite from, the everyday hum of modern surveillance.

Grand Theft Auto V, for instance, is peppered with subtle intrusions into the private lives of its three controllable characters, offering a kind of Platonic ideal for surveillance as a form of play — neither inherently good nor bad. Switching between the three characters is mostly tactical during story missions — say, jumping to a character in a rooftop sniper position while another character is pinned down in a shootout on the streets below. But when the player is not in a mission, the mechanics reward voyeuristic curiosity with a variety of short vignettes to remind the player that the three characters are autonomous individuals leading peculiar lives within the game world. You may switch to Franklin, an idealistic young man stuck living with his aunt in the game's fictional Compton, and see him trying to keep his hotheaded best friend out of a street fight; to Michael, an ageing bank robber forced into witness protection, as he gazes longingly at the sunset from the Hollywood Hills; or to Trevor, the game's amphetamine-addicted psychopath, just as he’s waking up from a blackout wearing only underwear and surrounded by dead bodies.

These stolen moments last only a few seconds before players gain control, but they reinforce the sense that the player is not living through each character, but instead watching them from above, omnisciently. Here, prying into another person's private being doesn’t come with any implicit agenda other than a curiosity to know them better, not just to know the details of a life but to get as close as possible to another person’s subjectivity. One watches not to judge, condemn, or steal, but to feel more intimately connected to these characters, and they in turn are surrounded by implicit and explicit signs of surveillance, from extended sequences of surveying banks, jewelry stores, and government buildings in anticipation of heists to the regular discovery of a particular radio station playing in a car that you’ve just commanded them to steal.

The game uses voyeurism and surveillance to heighten the entertainment, of course: to advance a storyline about an FBI-CIA rivalry, to make fun of a fictional Facebook called Lifeinvader, and for plenty of slapstick prurience, as when sneaking backstage to unmask a popular but lecherous game show host who's attempting to sleep with Michael's daughter, a contestant on the show. This scene prompts a long interactive car chase to catch and beat up the host — punishment not for being a creep, but for creeping on the wrong woman. More broadly, and problematically, Grand Theft Auto V portrays all of the benefits of peeping and spying but none of the dangers: A player's curiosity about others' private lives is rewarded with humor and drama and intimate detail, engendering sympathy even for those as overtly dislikable as the game's three main characters.

Other games use surveillance as a tool for enforcing morality. The forthcoming Watch Dogs is set in a near-future Chicago and organized around a massive, state-run network called the Central Operating System. The hero of the game is a hacker who has gained access to the network, and can use it to pry into the lives of everyone around him, but with the noble goal of protecting the innocent from the city's lurking evildoers. In one scenario, he aims his smartphone at a woman on the street and instantly pulls up the network's data about her, learning that she's recently broken up with a violent boyfriend who has a criminal record. You can then follow her and protect her from an eventual attack by her ex-boyfriend. Last year's Sleeping Dogs, an open world detective game set in Hong Kong, had a similar mechanic, in which players hack into CCTVs around the city to pick out drug dealers for the local police to arrest.

In these cases, the moral breach of surveillance is acceptable because the game worlds necessitate it, introducing villains that can only be stopped through surveillance — a mirror of the argument made in defense of current surveillance programs: that while the NSA and FBI have access to huge stores of private information, they promise only to pursue suspected terrorists and traitors. When President Barack Obama claims that Americans “can’t have 100 percent security and also then have 100 percent privacy and zero inconvenience,” he’s calling for tolerance of the general idea of surveillance as a necessary compromise for safety from terrorism. Yet, in the fantasy of the game world, we can accept these logical extremes because the game builds a space where there are imminent threats behind every closed door — sex traffickers, drug dealers, gunrunners, terrorists. But the threats in real life are less imminent, and certainly more opaque, which is what makes these games so seductive: There are no unknowns — known or otherwise.

Video games don't reflect our world so much as simplify it. They euphemize subjects, sometimes troubling ones, and allow us total control in situations that, in real life, are often beyond our control. In so doing, they can pacify our discomfort with the messy state of affairs. We see this with today's surveillance-driven games, and we saw it last decade with the rise in popularity of the military shooter, concomitant with the wars in Iraq and Afghanistan. For instance, the enemy of 2007's Call of Duty 4: Modern Warfare is a nuclear-armed despot in an unnamed Arabian country, against whom a group of American and British special forces fight to prevent a terrorist attack on America. The New York Times’ Seth Schiesel described the game's “Death From Above” level, in which players direct a camera-controlled gun on a Lockheed AC-130 gunship and kill small human figures on a black-and-white video feed, as “at once the most realistic scene and the mission that feels most like a video game, but only because for some modern soldiers, war really has come to resemble a video game.” And while this mission looked and sounded real, an eerie precursor to WikiLeaks’ infamous “Collateral Murder” video, it came with built-in safeguards that immediately failed players who shot at civilians.

These military and surveillance games are escapist fantasies, allowing us to imagine a world in which our government's actions here and abroad are assumed to be logical, morally and ethically defensible, and ultimately able to stop all of the bad guys without hurting any innocents. The games' design takes all the difficult real-world elements away — deciding who is bad, finding them, choosing how to treat them, and ensuring that treatment maintains legal and moral integrity. It reduces war to a problem of shooting people in as efficient a way as possible, and terrorism to a problem of snooping on as many people as possible. Moreover, these games reliably portray America as the victim of faceless aggression — among us or across the world — thus necessitating war or widespread snooping. If only our misadventures in Afghanistan and Iraq, or the NSA's misadventures everywhere, had such a clarity of purpose and target.

But this is what it means to have fun today. We have made games into simulations that pacify our worst anxieties about the evils of our culture, by turning those evils on their head and finding ways to repaint reality so that our impulses to pry — or, in the case of military shooters, to wage war — can be seen as good. As America moves from a state of war to a state of soft (and sometimes not-so-soft) surveillance, our newest ways of playing hinge on the taking of information from another person's life through privileged access instead of personal engagement. We find amusement by invading the private lives of characters in their fishbowl worlds even though — or perhaps because — we are moving around our own fishbowl lives, watched by some unseen player trying to construct a narrative about us that's very different from our own.

Michael Thomsen has written about games, technology, sex, and books for Slate, The Daily Beast, The Atlantic, Edge, Kill Screen, The Believer, Guernica, and The New Inquiry. He's also author of Levitate the Primate: Handjobs, Internet Dating, and Other Issues for Men. Follow him on Twitter @mike_thomsen.

The police helicopter from Grand Theft Auto V.

It's unfashionable to call someone a "genius" – but William Empson was one

Rather than denying the contradictoriness of being human, Empson revelled in it, as The Face of the Buddha reveals.

William Empson was a genius. Describing anyone in this way is distinctly unfashionable nowadays, because it suggests a level of achievement to which most of humanity cannot aspire. There is nothing you can do to acquire genius. Either you have it or, like the rest of us, you don’t – a state of affairs that cannot be remedied. The very idea smacks of elitism, one of the worst sins in the contemporary moral lexicon. But if talk of genius has come close to being banned in polite society, it is hard to know how else to describe Empson’s astonishing originality of mind.

One of the most influential 20th-century literary critics and the author of two seminal books on language, he was extremely receptive to new thinking and at the same time combative in defending his views. He was a poet of the first rank, whose spare and often cryptic verse was immediately understood and admired by Ludwig Wittgenstein. Incomparably more thoughtful than anything produced by the dull atheist prophets of our own day, his book Milton’s God (1961), in which he compares the Christian God to a commandant at Belsen, must be one of the fiercest assaults on monotheism ever published. And as a socialist who revered the British monarchy, he had a political outlook that was refreshingly non-standard.

Empson’s originality was not confined to his writing. He led a highly adventurous life. Expelled from his research fellowship, his name deleted from the records of his Cambridge college in 1929 after one of the porters found condoms in his rooms, he lost any prospect of a position in British academic life. For a time, he considered becoming a journalist or a civil servant. Instead his tutor I A Richards encouraged him to apply for posts in east Asia, and in 1931 he took up a position at a teacher training college in Japan. For some years he taught in China — mostly from memory, owing to a lack of books, and sleeping on a blackboard when his university was forced to move to Kunming during the Japanese siege of Beijing. By the late Thirties he was well known in London literary circles (written when he was only 22, his best-known book, Seven Types of Ambiguity, was published in 1930 and a collection of poems appeared in 1934) but just scraping a living from reviewing and a small private income. During the Second World War he worked at the BBC alongside George Orwell and Louis MacNeice.

He returned to China in 1947 to teach in Beijing, living through the stormy years just before and after Mao came to power and leaving only when the regime’s ideological demands became intolerably repressive. He continued his academic career, first at Kenyon College in Ohio, briefly at Gresham College in London, and finally at the University of Sheffield, where he was appointed head of the English department in 1953 and remained until his retirement in 1972, but always disdained academic jargon, writing in a light, glancing, conversational style.

Inordinately fond of drink and famously bohemian in appearance (T S Eliot, who admired his mind and enjoyed his company, commented on Empson’s scruffiness), he lived in a state of eccentric disorder that the poet Robert Lowell described as having “a weird, sordid nobility”. He was actively bisexual, marrying the South African-born sculptor Hetta Crouse, who was equally free-spirited and with whom he enjoyed an open relationship that was sometimes turbulent yet never without affection. His later years were less eventful, though rarely free from controversy. In 1979 he was knighted, and awarded an honorary fellowship by the college that half a century earlier had struck his name from the books. He died in 1984.

The publishing history of this book is as extraordinary as the work itself. “The real story of The Face of the Buddha,” the cultural historian Rupert Arrowsmith writes in his richly learned introduction, “began in the ancient Japanese city of Nara, where, in the spring of 1932, the beauty of a particular set of Japanese sculptures struck Empson with revelatory force.” He was “bowled over” by three statues, including the Kudara Kannon, a 7th-century piece in the Horyuji temple representing the Bodhisattva of Mercy, which fascinated him because the left and right profiles of the statue seemed to have asymmetrical expressions: “The puzzlement and good humour of the face are all on the left, also the maternity and the rueful but amiable smile. The right is the divinity; a birdlike innocence and wakefulness; unchanging in irony, unresting in good works; not interested in humanity, or for that matter in itself . . . a wonderfully subtle and tender work.” Gripped by what the art historian Partha Mitter describes as a “magnificent obsession”, Empson travelled far and wide in the years that followed, visiting south-east Asia, China, Ceylon, Burma and India and ending up in the Ajanta caves, the fountainhead of Mahayana Buddhist art. First begun in Japan in 1932, The Face of the Buddha was written and repeatedly revised during these wanderings.

Empson made no copy of the manuscript and in a succession of mishaps it was lost for nearly 60 years. The story of its disappearance is resonant of the boozy Fitzrovia portrayed in Anthony Powell’s novels. On leaving for his foreign travels in 1947, Empson gave the manuscript to John Davenport, a family friend and literary critic, for safekeeping. The hard-drinking Davenport mislaid it and in 1952 told Empson he had left it in a taxi. Davenport’s memory was befuddled. He had in fact given the text to the Tamil poet and editor M J T Tambimuttu, who must have shelved it among the piles of books that filled the rat-infested flat vividly described in the memoirs of Julian Maclaren-Ross. When Tambimuttu returned to Ceylon in 1949 he passed on Empson’s manuscript to Richard March, a fellow editor of Poetry London, which Tambimuttu had founded. March died soon afterwards and his papers mouldered in obscurity until 2003, when they were acquired by the British Museum. Two years later an enterprising curator at the museum, Jamie Anderson, spotted the manuscript and informed the author’s descendants of its rediscovery. Now Oxford University Press has brought out this beautifully illustrated volume, which will be of intense interest not only to devotees of Empson but to anyone interested in culture and religion.

Although a fragment of his analysis appeared in the article “Buddhas with double faces”, published in the Listener in 1936 and reprinted in the present volume, it is only now that we can fully appreciate Empson’s insight into Buddhist art. His deep interest in Buddhism was clear throughout his life. From the indispensable edition of his Complete Poems (Allen Lane, 2000) edited and annotated by his biographer John Haffenden, we learn that, while working in the Far Eastern department of the BBC, Empson wrote the outline of a ballet, The Elephant and the Birds, based on a story from Buddhist scriptures about Gautama in his incarnation as an elephant. His enduring fascination with the Buddha is evident in “The Fire Sermon”, a personal translation of the Buddha’s celebrated speech on the need to turn away from sensuous passions, which Empson used as the epigraph in successive editions of the collected poems. (A different translation is cited in the notes accompanying Eliot’s Waste Land, the longest section of which is also titled “The Fire Sermon”.)

Empson’s attitude to Buddhism, like the images of the Buddha that he so loved, was asymmetrical. He valued the Buddhist view as an alternative to the Western outlook, in which satisfying one’s desires by acting in the world was the principal or only goal in life. At the same time he thought that by asserting the unsatisfactoriness of existence as such – whether earthly or heavenly – Buddhism was more life-negating and, in this regard, even worse than Christianity, which he loathed. Yet he also believed Buddhism, in practice, had been more life-enhancing. Buddhism was a paradox: a seeming contradiction that contained a vital truth.

What Empson admired in Buddhist art was its ability to create an equilibrium from antagonistic human impulses. Writing here about Khmer art, he observes that cobras at Angkor are shown protecting the seated Buddha with their raised hoods. He goes on to speculate that the many-headed cobra is a metaphor for one of the Buddha’s canonical gestures – the raised hand with the palm forward, which means “do not fear”:

It has almost the same shape. To be sure, I have never had to do with a cobra, and perhaps after practical experience the paradox would seem an excessively monstrous one. But the high religions are devoted to contradictions of this sort . . . and the whole point of the snake is that the god has domesticated him as a protector.

It was this combination of opposite qualities that attracted Empson. “A good deal of the startling and compelling quality of the Far Eastern Buddha heads comes from combining things that seem incompatible,” he writes, “especially a complete repose or detachment with an active power to help the worshipper.” Art of this kind was not only beautiful, but also ethically valuable, because it was truer to human life. “The chief novelty of this Far Eastern Buddhist sculpture is the use of asymmetry to make the faces more human.”

Using 20th-century examples that illustrate such asymmetry, Empson elaborates in his Listener article:

It seems to be true that the marks of a person’s active experience tend to be stronger on the right, so that the left shows more of his inherent endowment or of the more passive experiences which have not involved the wilful use of facial muscles. All that is assumed here is that the muscles on the right generally respond more readily to the will and that the effects of old experiences pile up. The photograph of Mr Churchill will be enough to show that there is sometimes a contrast of this sort though it seems that in Baudelaire, who led a very different kind of life, the contrast was the other way round. In Mr Churchill the administrator is on the right, and on the left (by which of course I mean the left of the person or statue, which is on your right as you look) are the petulance, the romanticism, the gloomy moral strength and the range of imaginative power.

With such a prolific mind as Empson’s, it is risky to identify any ruling theme, but he returns repeatedly in his writings to the thought that the creativity of art and language comes from their irreducible open-endedness and susceptibility to conflicting interpretations. As he wrote in Seven Types of Ambiguity, “Good poetry is usually written from a background of conflict.” Rather than being an imperfection that must be overcome for the sake of clarity, ambiguity makes language inexhaustibly rich. In The Structure of Complex Words (1951) he showed how even the most straightforward-looking terms were “compacted with doctrines” that left their meaning equivocal. There was no ultimate simplicity concealed by the opacity of language. Thinking and speaking invoked deep structures of meaning which could be made more intelligible. But these structures could not be contained in any single body of ideas. Wittgenstein’s early ambition of reducing language to elementary propositions stating simple facts was impossible in principle. Inherently plural in meaning, words enabled different ways of seeing the world.

Empson’s message was not merely intellectual but, once again, ethical. “It may be,” he wrote in Complex Words, “that the human mind can recognise actually incommensurable values, and that the chief human value is to stand up between them.” The image of the Buddha that he discovered in Nara embodied this incommensurability. Rather than trying to smooth out these clashing values into an oppressive ideal of perfection, as Christianity had done, the Buddhist image fused their conflicts into a paradoxical whole. Instead of erecting a hierarchy of better and worse attitudes in the manner of the “neo-Christians”, as Empson described the pious humanists of his day, the asymmetrical face of the Buddha showed how discordant emotions could be reconciled.

Whether Empson’s account of asymmetry can be anything like a universal theory is doubtful. In support of his theory he cited Darwin’s The Expression of the Emotions in Man and Animals to show that human emotions were expressed in similar ways in different cultures, and invoked speculation by contemporary psychologists on the contrasting functions of the right and left sides of the brain. But the scientific pretensions of Empson’s observations are less important than the spirit in which he made them. Entering into an initially alien form of art, he found a point of balance between values and emotions whose conflicts are humanly universal. Rather than denying the contradictoriness of the human mind and heart, he gloried in it.

It takes genius to grasp the ambiguities of art and language and to use them as Empson did. But if we can’t emulate his astonishing fertility of mind, we can learn from his insights. Both in his life and in his work he resisted the lure of harmony, which offers to mitigate conflicts of value at the price of simplifying and impoverishing the human world. Instead, Empson searched for value in the ambiguities of life. He found what he was looking for in the double faces of the Buddha described in this lost masterpiece.


The Face of the Buddha by William Empson, edited by Rupert Arrowsmith with a preface by Partha Mitter, is published by Oxford University Press (224pp, £30)

John Gray is the New Statesman’s lead book reviewer. His latest book is The Soul of the Marionette: A Short Enquiry into Human Freedom.

This article first appeared in the 23 June 2016 issue of the New Statesman, Divided Britain