
Screen test

Video games dominate Britain’s entertainment industry, yet we lack the critical vocabulary to understand them.

Cultural realities tend to lag behind economic ones. How else to explain that the UK’s biggest (worth £4.5bn-plus in annual sales) and fastest-growing (at close to 20 per cent annually) entertainment medium still barely registers on the nation’s more rarefied intellectual radar? I am talking, of course, about video games – as the field of interactive entertainment still rather quaintly tends to be known. And the reason for its neglect is not so much snobbery as a gaping absence in our critical vocabulary and sensibilities.

When, today, we ask a question such as “Is it art?” we are no longer looking for a yes or no answer. The 20th century decided that urinals, cans of soup, recorded silence, heaps of bricks and fake human excrement could all be art, of a certain kind. Under these circumstances, it would be more than a little perverse to deny the idea of art to objects as lovingly crafted, as considered and as creative as video games. The question that’s really at stake is something more specific. If video games are art, what kind of art are they? What are their particular attributes and potential? And, perhaps most importantly, just how good are they?

I recently posed similar questions to someone who is very definitely both an artist and a gamer: the writer Naomi Alderman. Alderman’s first novel, Disobedience, appeared in 2006 and won her the Orange Award for New Writers. In parallel to her work as a literary writer, however, she also spent three years pursuing a very different kind of career: that of lead writer on the experimental “alternate reality” game Perplex City. To many authors, such a venture might have felt like a period of time away from “real” writing. Yet, Alderman explained, for her it was more a discovery that these two modes of writing were not only compatible, but symbiotic. I asked her whether she had preferred working on her novel or on the game. “I couldn’t choose,” she said. “I feel that if I were to give up either the novel or the game, I wouldn’t be able to do the other.”

It’s a creative interconnection Alderman traces back to her childhood. “My first memory of playing a game was around 1981, when my mum took me to the Puffin Club exhibition, a kind of roadshow for kids who read books published by Puffin. I remember they had a bank of computers at this one where you could queue up to get ten minutes playing a text-based adventure game. And I thought, ‘This is absolutely brilliant.’ I was fascinated.” These games were some of the first things it was possible to play on a computer in which plot and character meant more than a handful of pixels dashing across the screen. For Alderman, as for many others, the experience was closely associated “with stories and with the idea of being able to walk into a story”. And the dizzying kind of thought experiment that the best fiction can undertake – its gleeful defiance of the rules of time and nature – lies close to the heart of what video games do best.

As a modern example, Alderman describes a game called Katamari. In it, for want of a better description, you roll stuff up. You control, she tells me, “a little ball, which is effectively sticky, and you’re rolling it around a landscape picking stuff up. As you do so, your ball gets bigger and bigger. It’s almost impossible to explain how much fun this is, the pleasure of growing your little ball, which starts off just big enough to pick up pins and sweets from a tabletop and ends up picking envelopes, then televisions, then tables, then houses, then streets; until in the end you can roll it across the whole world picking up clouds and continents.”

Katamari may sound like an oddity, but its pleasures are typical of a central kind of video-game experience, in that they are in part architectural: something one inhabits and encounters incrementally; a space designed to be occupied and experienced rather than viewed simply as a whole. Players in a well-made game will relish not just its appearance but also the feel of exploring and gradually mastering its unreal space. Yet, in what sense is any of this art, or even artistic? Just as every word within a novel has to be written, of course, every single element of any video game has to be crafted from scratch. To talk about the “art” element of games is, I would argue, to talk about the point at which this fantastically intricate undertaking achieves a particular concentration, complexity and resonance.

It’s worth remembering, too, just how young a medium video games are. Commercial games have existed for barely 30 years; the analogy with film, now more than a century old, is an illuminating one. In December 1895, the Lumière brothers, Auguste and Louis, showed the first films of real-life images to a paying audience, in Paris. This, clearly, was a medium, but not yet an art form; and for its first decade, film remained largely a novelty, a technology that astounded viewers with images such as trains rushing into a station, sending early audiences running out of cinemas in terror. It took several decades for film to master its own, unique artistic language: cinematography. It took time, too, for audiences to expect more from it than raw wonder or exhilaration. Yet today you would be hard-pushed to find a single person who does not admire at least one film as a work of art.

If, however, you ask about video games, the chances are that you’ll find plenty of people who don’t play them at all, let alone consider them of any artistic interest. This is hardly surprising: at first glance it can seem that many games remain, in artistic terms, at the level of cinema’s train entering a station – occasions for technological shock and awe, rather than for the more densely refined emotions of art.

Yet the nature of games as a creative medium has changed profoundly in recent years – as I discovered when I spoke to Justin Villiers, an award-winning screenwriter and film-maker who since late 2007 has been plying his trade in the realm of video games. Even a few years ago, he explained, his career move would have been artistically unthinkable. “In the old days, the games industry fed on itself. You’d have designers who were brought up on video games writing games themselves, so they were entirely self-referential; all the characters sounded like refugees from weak Star Trek episodes or Lord of the Rings out-takes. But now there is new blood in the industry – people with backgrounds in cinema and theatre and comic books and television. In the area in which I work, writing and direction, games are just starting to offer genuine catharsis, or to bring about epiphanies; they’re becoming more than simple tools to sublimate our desires or our fight for survival.”

I suggest the film analogy, and wonder what stage of cinema games now correspond to. “It reminds me of the late 1960s and early 1970s, because there were no rules, or, as soon as there were some, someone would come along and break them. Kubrick needed a lens for 2001: a Space Odyssey that didn’t exist, so, together with the director of photography, he invented one.” How does this translate to the world of games? “It’s like that in the industry right now. Around a table you have the creative director, lead animator, game designer, sound designer and me, and we’re all trying to work out how to create a moment in a game or a sequence that has never been done before, ever.”

Villiers is, he admits, an unlikely evangelist: someone who was initially deeply sceptical of games’ claims as art. But it would be wrong, he concedes, simply to assume that the current explosion of talent within the gaming industry will allow it to overtake film or television as a storytelling medium. Today’s best games may be as good as some films in their scripts, performances, art direction and suchlike. But most are still much worse; and in any case, the most cinematic games are already splitting off into a hybrid subgenre that lies outside the mainstream of gaming. If we are to understand the future of games, as both a medium and an art form, we must look to what is unique about them. And that is their interactivity.

To explore this further, I spoke to a game designer who is responsible for some of the most visionary titles to appear in recent years – Jenova Chen. Chen is co-founder of the California-based games studio thatgamecompany, a young firm whose mission, as he explains it, is breathtakingly simple: to produce games that are “beneficial and relevant to adult life; that can touch you as books, films and music can”.

Chen’s latest game, Flower, is the partial fulfilment of these ambitions, a work whose genesis in many ways seems closer to that of a poem or painting than an interactive entertainment. “I grew up in Shanghai,” he explains. “A huge city, one of the world’s biggest and most polluted. Then I came to America and one day I was driving from Los Angeles to San Francisco and I saw endless fields of green grass, and rows and rows of windmill farms. And I was shocked, because up until then I had never seen a scene like this. So I started to think: wouldn’t it be nice for people living in a city to turn a games console into a portal, leading into these endless green fields?”

From this grew a game that is both incredibly simple and utterly compelling. You control a petal from a single flower, and must move it around a shimmering landscape of fields and a gradually approaching city by directing a wind to blow it along, gathering other petals from other flowers as you go. Touch a button on the control pad to make the wind blow harder; let go to soften it; gently shift the controller in the air to change directions. You can, as I did on my first play, simply trace eddies in the air, or gust between tens of thousands of blades of grass. Or you can press further into the world of the game and begin to learn how the landscape of both city and fields is altered by your touch, springing into light and life as you pass.

“We want the player to feel like they are healing,” Chen tells me, “that they are creating life and energy and spreading light and love.” If this sounds hopelessly naive, it is important to remember that the sophistication of a game experience depends not so much on its conceptual complexity as on the intricacy of its execution. In Flower, immense effort has gone into making something that appears simple and beautiful, but that is minutely reactive and adaptable. Here, the sensation of “flow” – of immersion in the task of illumination and exploration – connects to some of those fundamental emotions that are the basis of all enduring art: its ability to enthral and transport its audience, to stir in them a heightened sense of time and place.

Still, an important question remains. What can’t games do? On the one hand, work such as Chen’s points to a huge potential audience for whole new genres of game. On the other hand, there are certain limitations inherent in the very fabric of an interactive medium, perhaps the most important of which is also the most basic: its lack of inevitability. As the tech-savvy critic and author Steven Poole has argued, “great stories depend for their effect on irreversibility – and this is because life, too, is irreversible. The pity and terror that Aristotle says we feel as spectators to a tragedy are clearly dependent on our apprehension of circumstances that cannot be undone.” Games have only a limited, and often incidental, ability to convey such feelings.

Thus, the greatest pleasure of games is immersion: you move, explore and learn, sometimes in the company of thousands of other players. There is nothing inherently mindless about such an interaction; but nor should there be any question of games replacing books or films. Instead – just as the printed word, recorded music and moving images have already done – this interactive art will continue to develop along with its audience. It will, I believe, become one of the central ways in which we seek to understand (and distract, and delight) ourselves in the 21st century. And, for the coming generations – for whom the world before video games will seem as remote a past as one without cinema does to us – the best gift we can bequeath is a muscular and discerning critical engagement.

Tom Chatfield is the arts and books editor of Prospect magazine. His book on the culture of video games, “Gameland”, is forthcoming from Virgin Books (£18.99)


Pong (1972). The first commercially successful video game. Bounce a square white blob between two white bats. A software revolution.

Pac-Man (1980). A little yellow ball, in a maze, eating dots, being chased by ghosts. The beauty of interactive complexity arising from something simple and slightly crazy – and still fiendishly fun today.

Tetris (1989). This utterly abstract puzzle of falling blocks and vanishing lines was launched on the Nintendo Game Boy and single-handedly guaranteed the hand-held console’s triumph as a global phenomenon. Perhaps the purest logical play experience ever created.

Civilization (1991). View the world from the top down and guide a civilisation from hunter-gathering to landing on the moon. Hours, days and months of utterly absorbing micromanagement.

Doom (1993). Run around a scary maze, chased by demons, wielding a selection of big guns. Then chase your friends. Doom did it first and created a genre. For the first time, a computer had made grown men tremble.

Ultima Online (1997). Enter a living, breathing online world with thousands of other players; become a tradesman, buy your own house, chat, make and betray new friends. One of the first massively multiplayer online role-playing games, it is still, for many, the purest and greatest of them all.

The Sims (2000). Simulated daily activities for virtual people; help them and watch them live. For those who think games are all violent and mindless, note that this began the best-selling PC game series in history – more than 100 million copies sold, and counting.

Bejeweled (2001). A simple, pretty puzzle game that changed the games industry simply because it could be downloaded in minutes on any computer connected to the internet. Digital distribution is the future, and this title first proved it.

Guitar Hero (2005). Live out your dreams of rock deification with friends gathered round to watch you pummel a plastic guitar. A revolution in cross-media: cool, sociable fun, and a licence to print money for its creators.

Wii Sports (2006). Wave your arms around while holding a white controller. Now anyone could play tennis and go bowling with family and friends in the living room. Nintendo delivered another revolution in gaming with this debut title for its Wii console.

This article first appeared in the 04 May 2009 issue of the New Statesman, Flu: Everything you need to know

An artist's version of the Reichstag fire, which Hitler blamed on the communists. CREDIT: DEZAIN UNKIE/ALAMY

The art of the big lie: the history of fake news

From the Reichstag fire to Stalin’s show trials, the craft of disinformation is nothing new.

We live, we’re told, in a post-truth era. The internet has hyped up postmodern relativism, and created a kind of gullible cynicism – “nothing is true, and who cares anyway?” But the thing that exploits this mindset is what the Russians call dezinformatsiya. Disinformation – strategic deceit – isn’t new, of course. It has played a part in the battle that has raged between mass democracy and its enemies since at least the First World War.

Letting ordinary people pick governments depends on shared trust in information, and this is vulnerable to attack – not just by politicians who want to manipulate democracy, but by those on the extremes who want to destroy it. In 1924, the first Labour government faced an election. With four days to go, the Daily Mail published a secret letter in which the leading Bolshevik Grigory Zinoviev heralded the government’s treaties with the Soviets as a way to help recruit British workers for Leninism. Labour’s vote actually went up, but the Liberal share collapsed, and the Conservatives returned to power.

We still don’t know exactly who forged the “Zinoviev Letter”, even after exhaustive investigations of British and Soviet intelligence archives in the late 1990s by the then chief historian of the Foreign Office, Gill Bennett. She concluded that the most likely culprits were White Russian anti-Bolsheviks, outraged at Labour’s treaties with Moscow, probably abetted by sympathetic individuals in British intelligence. But whatever the precise provenance, the case demonstrates a principle that has been in use ever since: cultivate your lie from a germ of truth. Zinoviev and the Comintern were actively engaged in trying to stir revolution – in Germany, for example. Those who handled the letter on its journey from the forger’s desk to the front pages – MI6 officers, Foreign Office officials, Fleet Street editors – were all too ready to believe it, because it articulated their fear that mass democracy might open the door to Bolshevism.

Another phantom communist insurrection opened the way to a more ferocious use of disinformation against democracy. On the night of 27 February 1933, Germany’s new part-Nazi coalition was not yet secure in power when news started to hum around Berlin that the Reichstag was on fire. A lone left-wing Dutchman, Marinus van der Lubbe, was caught at the scene and said he was solely responsible. But Hitler assumed it was a communist plot, and seized the opportunity to do what he wanted to do anyway: destroy the communists. The suppression was successful, but the claim it was based on rapidly collapsed. When the Comintern agent Georgi Dimitrov was tried for organising the fire, alongside fellow communists, he mocked the charges against him, which were dismissed for lack of evidence.

Because it involves venturing far from the truth, disinformation can slip from its authors’ control. The Nazis failed to pin blame on the communists – and then the communists pinned blame on the Nazis. Dimitrov’s comrade Willi Münzenberg swiftly organised propaganda suggesting that the fire was too convenient to be Nazi good luck. A “counter-trial” was convened in London; a volume called The Brown Book of the Reichstag Fire and Hitler Terror was rushed into print, mixing real accounts of Nazi persecution of communists – the germ of truth again – with dubious documentary evidence that they had started the fire. Unlike the Nazis’ disinformation, this version stuck, for decades.

Historians such as Richard Evans have argued that both stories about the fire were false, and it really was one man’s doing. But this case demonstrates another disinformation technique still at work today: hide your involvement behind others, as Münzenberg did with the British great and good who campaigned for the Reichstag prisoners. In the Cold War, the real source of disinformation was disguised with the help of front groups, journalistic “agents of influence”, and the trick of planting a fake story in an obscure foreign newspaper, then watching as the news agencies picked it up. (Today, you just wait for retweets.)

In power, the Nazis made much use of a fictitious plot that did, abominably, have traction: The Protocols of the Elders of Zion, a forged text first published in Russia in 1903, claimed to be a record of a secret Jewish conspiracy to take over the world – not least by means of its supposed control of everyone from bankers to revolutionaries. As Richard Evans observes, “If you subject people to a barrage of lies, in the end they’ll begin to think well maybe they’re not all true, but there must be something in it.” In Mein Kampf, Hitler argued that the “big lie” always carries credibility – an approach some see at work not only in the Nazis’ constant promotion of the Protocols but in the pretence that their Kristallnacht pogrom in 1938 was spontaneous. (It is ironic that Hitler coined the “big lie” as part of an attack on the Jews’ supposed talent for falsehood.) Today, the daring of the big lie retains its force: even if no one believes it, it makes smaller untruths less objectionable in comparison. It stuns opponents into silence.

Unlike the Nazis, the Bolshevik leaders were shaped by decades as hunted revolutionaries, dodging the Tsarist secret police, who themselves had had a hand in the confection of the Protocols. They occupied the paranoid world of life underground, governed by deceit and counter-deceit, where any friend could be an informer. By the time they finally won power, disinformation was the Bolsheviks’ natural response to the enemies they saw everywhere. And that instinct endures in Russia even now.

In a competitive field, perhaps the show trial is the Soviet exercise in upending the truth that is most instructive today. These sinister theatricals involved the defendants “confessing” their crimes with great sincerity and detail, even if the charges were ludicrous. By 1936, Stalin felt emboldened to drag his most senior rivals through this process – starting with Grigory Zinoviev.

The show trial is disinformation at its cruellest: coercing someone falsely to condemn themselves to death, in so convincing a way that the world’s press writes it up as truth. One technique involved was perfected by the main prosecutor, Andrey Vyshinsky, who bombarded the defendants with insults such as “scum”, “mad dogs” and “excrement”. Besides intimidating the victim, this helped to distract attention from the absurdity of the charges. Barrages of invective on Twitter are still useful for smearing and silencing enemies.


The show trials were effective partly because they deftly reversed the truth. To conspire to destroy the defendants, Stalin accused them of conspiring to destroy him. He imposed impossible targets on straining Soviet factories; when accidents followed, the managers were forced to confess to “sabotage”. Like Hitler, Stalin made a point of saying the opposite of what he did. In 1936, the first year of the Great Terror, he had a rather liberal new Soviet constitution published. Many in the West chose to believe it. As with the Nazis’ “big lie”, shameless audacity is a disinformation strategy in itself. It must have been hard to accept that any regime could compel such convincing false confessions, or fake an entire constitution.

No one has quite attempted that scale of deceit in the post-truth era, but reversing the truth remains a potent trick. Just think of how Donald Trump countered the accusation that he was spreading “fake news” by making the term his own – turning the charge on his accusers, and even claiming he’d coined it.

Post-truth describes a new abandonment of the very idea of objective truth. But George Orwell was already concerned that this concept was under attack in 1946, helped along by the complacency of dictatorship-friendly Western intellectuals. “What is new in totalitarianism,” he warned in his essay “The Prevention of Literature”, “is that its doctrines are not only unchallengeable but also unstable. They have to be accepted on pain of damnation, but on the other hand they are always liable to be altered on a moment’s notice.”

A few years later, the political theorist Hannah Arendt argued that Nazis and Stalinists, each immersed in their grand conspiratorial fictions, had already reached this point in the 1930s – and that they had exploited a similar sense of alienation and confusion in ordinary people. As she wrote in her 1951 book, The Origins of Totalitarianism: “In an ever-changing, incomprehensible world the masses had reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and that nothing was true.” There is a reason that sales of Arendt’s masterwork – and Orwell’s Nineteen Eighty-Four – have spiked since November 2016.

During the Cold War, as the CIA got in on the act, disinformation became less dramatic, more surreptitious. But show trials and forced confessions continued. During the Korean War, the Chinese and North Koreans induced a series of captured US airmen to confess to dropping bacteriological weapons on North Korea. One lamented that he could barely face his family after what he’d done. The pilots were brought before an International Scientific Commission, led by the eminent Cambridge scientist Joseph Needham, which investigated the charges. A documentary film, Oppose Bacteriological Warfare, was made, showing the pilots confessing and Needham’s Commission peering at spiders in the snow. But the story was fake.

The germ warfare hoax was a brilliant exercise in turning democracy’s expectations against it. Scientists’ judgements, campaigning documentary, impassioned confession – if you couldn’t believe all that, what could you believe? For the genius of disinformation is that even exposure doesn’t disable it. All it really has to do is sow doubt and confusion. The story was finally shown to be fraudulent in 1998, through documents transcribed from Soviet archives. The transcripts were authenticated by the historian Kathryn Weathersby, an expert on the archives. But as Dr Weathersby laments, “People come back and say ‘Well, yeah, but, you know, they could have done it, it could have happened.’”

There’s an insidious problem here: the same language is used to express blanket cynicism as empirical scepticism. As Arendt argued, gullibility and cynicism can become one. If opponents of democracy can destroy the very idea of shared, trusted information, they can hope to destabilise democracy itself.

But there is a glimmer of hope here too. The fusion of cynicism and gullibility can also afflict the practitioners of disinformation. The most effective lie involves some self-deception. So the show trial victims seem to have internalised the accusations against them, at least for a while, but so did their tormentors. As the historian Robert Service has written, “Stalin frequently lied to the world when he was simultaneously lying to himself.”

Democracy might be vulnerable because of its reliance on the idea of shared truth – but authoritarianism has a way of undermining itself by getting lost in its own fictions. Disinformation is not only a danger to its targets. 

Phil Tinline’s documentary “Disinformation: A User’s Guide” will be broadcast on BBC Radio 4 at 8pm, 17 March
