
Digital revolution: how technology has changed what it means to be an artist

A new exhibition at the Barbican shows how the technology behind video games is turbocharging the human imagination. But is it art? (Yes.) 

Umbrellium at the Barbican.

At the Liverpool Biennial this month, I had an epiphany: the video installation is the last refuge of the scoundrel. In a world where almost everyone I know carries a video recorder around in their pocket – and where Google is trying to get us to wear them on our faces – the mere act of committing something to tape is no longer alienating enough to kick the mundane into that strange sphere we call “art”. In 1972, when Ana Mendieta made Untitled (Death of a Chicken), an art gallery was probably the only place you could see a naked woman, splattered in blood, holding some recently deceased poultry. Now we have Reddit.

At the biennial – alongside several great exhibitions, including a well-curated selection of Whistlers at the Bluecoat – there were some spectacularly dull video installations. One was even canny enough to make a feature of its dullness by asserting: “Uri Aran plays with the way endless repetition can strip the banal of its meaning.” Oh, does he now? I resent artists who make me feel like this – like I’m some great big square who doesn’t get the ineffable postmodern profundity of resting a box of raisins over a vitrine filled with snot and would probably just prefer an insipid watercolour of a bee.

Nonetheless, I maintain that the feeling of “Yeah, but I could have done that” is lethal to the enjoyment of modern and conceptual art – and without the obvious artistry evidenced by mastery of a skill such as painting, sculpture or ceramics, it is easily evoked. Artists, take note: you can’t just film an orange for seven minutes and pretend it’s a meditation on solitude.

A small cheer, then, for the Barbican’s new exhibition “Digital Revolution”, which is devoted to the effect that technology has had on art and design. There is a video installation here that is enough to make anyone a convert to the medium: Chris Milk’s The Treachery of Sanctuary (pictured below). It features a triptych of video screens in a dark room, mirrored in a pool of water, which project your silhouette on to a white background. You see yourself stomp, fidget and gesticulate.

In the first screen, your silhouette disintegrates into birds and flies away; in the second, the birds peck you to nothingness. The final screen allows you to become a bird: fling open your arms and your silhouette suddenly has gigantic eagle’s wings, unfurling with a satisfying “whoomph”. The artist’s notes reference religious fervour, self-doubt and the cave paintings of Lascaux but even without any of that knowledge, the experience is memorable. It is art of this precise moment in history. First, it uses Kinect – the motion-sensing hardware developed by Microsoft that lets you pretend to ten-pin bowl or jet-ski from the comfort of your front room. Second, it uses that technology to answer a need created by another: specifically, that CGI moment in a science-fiction film when an ordinary human being suddenly sprouts wings (the X-Men movies do this well, as does Kevin Smith’s Dogma). Watching those films, surely every viewer wants to know what that might feel like? It turns out that even a small approximation is immensely satisfying. Every adult who stood in front of Milk’s triptych looked as wonderstruck as a child.

The other huge opportunity that the advance of the digital realm offers to artists is the possibility of crowd-sourcing. Elsewhere in the show, Milk has collaborated with another artist, Aaron Koblin, on The Johnny Cash Project, which offers people around the world a chance to redraw a frame of their choice from the video for the song “Ain’t No Grave”. Some are faithful replicas, others wild interpretations in a pointillist or impressionist style. The finished video, updated regularly on YouTube, is both recognisable and eerily estranged from its source material. Again, this is a work that speaks to a specific cultural moment. Technology has created the shared reference, through the mass-media power of the music video, and supplied the means to mimic it, through Photoshop and other software.

The reason I got this reviewing gig is that the exhibition also features games. There are early classics, which show how the commercial power of titles such as Pong, Pac-Man and Super Mario Bros drove innovations in hardware, alongside examples of later experimental and indie games.

A breathless pause here to deal with Roger Ebert’s question, which has haunted the medium for the past decade: can a video game be art? The answer, on this evidence, is yes. Ebert, a film critic, suggested that the interactive element of games showed there was no guiding creative intent: “I believe art is created by an artist. If you change it, you become the artist. Would Romeo and Juliet have been better with a different ending? . . . If you can go through ‘every emotional journey available’, doesn’t that devalue each and every one of them?”

But what many of the best games do is show their workings – they uncover the straitjacket of context within which your choices are made. My favourite title, BioShock (not on show here), is a good example, because of its unreliable narrator. The Barbican gives you a chance to play last year’s bleak indie hit Papers, Please, which forces you to act as an immigration official in a crushingly monotonous Soviet bureaucracy. Every choice leads to misery; there is no way to “win”.

Cute Circuit.

The problem with Ebert’s question was that it prompted a retrenchment. As a journalist born in the 1940s and a representative of the more established medium of film, he unwittingly pushed the button in every gamer marked: “But grown-ups don’t understand me!” And so, for several years, games manufacturers decided the best way to gain the approval of the cultural gatekeepers was to mimic cinema. The 2000s became the decade of the interminable cut scene, in which you were forced to watch mediocre mo-capped actors plank their way through utter bibble about lost home worlds and tactical assault teams before you got to the good bits (that is, shooting people in the face).

Finally, though, games are shrugging off their inferiority complex and the irony is that only now is their artistic potential being realised. While it’s true that the mega-hits – the soldier-simulations and dystopian rampage fantasies – are often immensely dumb and cliché-ridden, the same could be said of Hollywood blockbusters. At least in the game world, indie titles have an easy route to consumers, through the Steam software for PC and Mac and the Xbox and PlayStation marketplaces.

Avant-garde games have gone in two distinct directions, both driven by the urge to move away from "winning" and "losing" and towards something more self-consciously "artistic". Some, such as Dear Esther, Dys4ia or Gone Home, try to tell small-scale stories through a game mechanic focused on discovery; they are perhaps better described as "interactive fiction". Others, such as Journey, Flower or Proteus, do away with any measure of progress and concentrate instead on creating a pure sensory experience.

What this exhibition delivers is a feeling of abundance, of creativity, of the potential of this palette of tools to turbocharge the human imagination. There’s more, too: a sense of connection unimaginable even 30 years ago. Inside the online game Minecraft, there are vast cities built by the mouse clicks of thousands. The indie title Fez has puzzles that are so difficult they can only be solved by an online horde comparing notes on a forum. But underneath it all, there is fragility. The Barbican’s first section, “Digital Archaeology”, reminds you that what is cheap to create is also easily discarded. In 2012, New York’s Museum of Modern Art started collecting the source code of old games, reasoning they could easily be lost for ever as the consoles needed to play them broke down or were thrown out as rubbish.

There’s the rub of artistic expression in our digital world: we can send data across the world at a keystroke, but in 100 years’ time, the highest achievements of our culture might be inaccessible to us, locked away in a digital space to which we no longer have the key. Mind you, when it comes to video installations of solitary oranges, I’m not sure that’s any great loss.

“Digital Revolution” is at the Barbican Centre, London EC2, until 14 September

The Liverpool Biennial runs until 26 October at various locations across the city

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.

This article first appeared in the 08 July 2014 issue of the New Statesman, The end of the red-top era?


Leader: The unresolved Eurozone crisis

The continent that once aspired to be a rival superpower to the US is now a byword for decline, and ethnic nationalism and right-wing populism are thriving.

The eurozone crisis was never resolved. It was merely conveniently forgotten. The vote for Brexit, the terrible war in Syria and Donald Trump’s election as US president all distracted from the single currency’s woes. Yet its contradictions endure, a permanent threat to continental European stability and the future cohesion of the European Union.

The resignation of the Italian prime minister Matteo Renzi, following defeat in a constitutional referendum on 4 December, was the moment at which some believed that Europe would be overwhelmed. Among the champions of the No campaign were the anti-euro Five Star Movement (which has led in some recent opinion polls) and the separatist Lega Nord. Opponents of the EU, such as Nigel Farage, hailed the result as a rejection of the single currency.

An Italian exit, if not unthinkable, is far from inevitable, however. The No campaign comprised not only Eurosceptics but pro-Europeans such as the former prime minister Mario Monti and members of Mr Renzi’s liberal-centrist Democratic Party. Few voters treated the referendum as a judgement on the monetary union.

To achieve withdrawal from the euro, the populist Five Star Movement would need first to form a government (no easy task under Italy’s complex multiparty system), then amend the constitution to allow a public vote on Italy’s membership of the currency. Opinion polls continue to show a majority opposed to the return of the lira.

But Europe faces far more immediate dangers. Italy’s fragile banking system has been imperilled by the referendum result and the accompanying fall in investor confidence. In the absence of state aid, the Banca Monte dei Paschi di Siena, the world’s oldest bank, could soon face ruin. Italy’s national debt stands at 132 per cent of GDP, severely limiting its firepower, and its financial sector has amassed $360bn of bad loans. The risk is of a new financial crisis that spreads across the eurozone.

EU leaders’ record to date does not encourage optimism. Seven years after the Greek crisis began, the German government is continuing to advocate the failed path of austerity. On 4 December, Germany’s finance minister, Wolfgang Schäuble, declared that Greece must choose between unpopular “structural reforms” (a euphemism for austerity) or withdrawal from the euro. He insisted that debt relief “would not help” the immiserated country.

Yet the argument that austerity is unsustainable is now heard far beyond the Syriza government. The International Monetary Fund is among those that have demanded “unconditional” debt relief. Under the current bailout terms, Greece’s interest payments on its debt (roughly €330bn) will continually rise, consuming 60 per cent of its budget by 2060. The IMF has rightly proposed an extended repayment period and a fixed interest rate of 1.5 per cent. Faced with German intransigence, it is refusing to provide further funding.

Ever since the European Central Bank president, Mario Draghi, declared in 2012 that he was prepared to do “whatever it takes” to preserve the single currency, EU member states have relied on monetary policy to contain the crisis. This complacent approach could unravel. From the euro’s inception, economists have warned of the dangers of a monetary union that is unmatched by fiscal and political union. The UK, partly for these reasons, wisely rejected membership, but other states have been condemned to stagnation. As Felix Martin writes on page 15, “Italy today is worse off than it was not just in 2007, but in 1997. National output per head has stagnated for 20 years – an astonishing . . . statistic.”

Germany’s refusal to support demand (having benefited from a fixed exchange rate) undermined the principles of European solidarity and shared prosperity. German unemployment has fallen to 4.1 per cent, the lowest level since 1981, but joblessness is at 23.4 per cent in Greece, 19 per cent in Spain and 11.6 per cent in Italy. The youngest have suffered most. Youth unemployment is 46.5 per cent in Greece, 42.6 per cent in Spain and 36.4 per cent in Italy. No social model should tolerate such waste.

“If the euro fails, then Europe fails,” the German chancellor, Angela Merkel, has often asserted. Yet it does not follow that Europe will succeed if the euro survives. The continent that once aspired to be a rival superpower to the US is now a byword for decline, and ethnic nationalism and right-wing populism are thriving. In these circumstances, the surprise has been not voters’ intemperance, but their patience.

This article first appeared in the 08 December 2016 issue of the New Statesman, Brexit to Trump