
Screen test

Video games dominate Britain’s entertainment industry, yet we lack the critical vocabulary to understand them.

Cultural realities tend to lag behind economic ones. How else to explain that the UK’s biggest (worth £4.5bn-plus in annual sales) and fastest-growing (at close to 20 per cent annually) entertainment medium still barely registers on the nation’s more rarefied intellectual radar? I am talking, of course, about video games – as the field of interactive entertainment still rather quaintly tends to be known. And the reason for its neglect is not so much snobbery as a gaping absence in our critical vocabulary and sensibilities.

When, today, we ask a question such as “Is it art?” we are no longer looking for a yes or no answer. The 20th century decided that urinals, cans of soup, recorded silence, heaps of bricks and fake human excrement could all be art, of a certain kind. Under these circumstances, it would be more than a little perverse to deny the idea of art to objects as lovingly crafted, as considered and as creative as video games. The question that’s really at stake is something more specific. If video games are art, what kind of art are they? What are their particular attributes and potential? And, perhaps most importantly, just how good are they?

I recently posed similar questions to someone who is very definitely both an artist and a gamer: the writer Naomi Alderman. Alderman’s first novel, Disobedience, appeared in 2006 and won her the Orange Award for New Writers. In parallel to her work as a literary writer, however, she also spent three years pursuing a very different kind of career: that of lead writer on the experimental “alternate reality” game Perplex City. To many authors, such a venture might have felt like a period of time away from “real” writing. Yet, Alderman explained, for her it was more a discovery that these two modes of writing were not only compatible, but symbiotic. I asked her whether she had preferred working on her novel or on the game. “I couldn’t choose,” she said. “I feel that if I were to give up either the novel or the game, I wouldn’t be able to do the other.”

It’s a creative interconnection Alderman traces back to her childhood. “My first memory of playing a game was around 1981, when my mum took me to the Puffin Club exhibition, a kind of roadshow for kids who read books published by Puffin. I remember they had a bank of computers at this one where you could queue up to get ten minutes playing a text-based adventure game. And I thought, ‘This is absolutely brilliant.’ I was fascinated.” These games were some of the first things it was possible to play on a computer in which plot and character meant more than a handful of pixels dashing across the screen. For Alderman, as for many others, the experience was closely associated “with stories and with the idea of being able to walk into a story”. And the dizzying kind of thought experiment that the best fiction can undertake – its gleeful defiance of the rules of time and nature – lies close to the heart of what video games do best.

As a modern example, Alderman describes a game called Katamari. In it, for want of a better description, you roll stuff up. You control, she tells me, “a little ball, which is effectively sticky, and you’re rolling it around a landscape picking stuff up. As you do so, your ball gets bigger and bigger. It’s almost impossible to explain how much fun this is, the pleasure of growing your little ball, which starts off just big enough to pick up pins and sweets from a tabletop and ends up picking envelopes, then televisions, then tables, then houses, then streets; until in the end you can roll it across the whole world picking up clouds and continents.”

Katamari may sound like an oddity, but its pleasures are typical of a central kind of video-game experience, in that they are in part architectural: something one inhabits and encounters incrementally; a space designed to be occupied and experienced rather than viewed simply as a whole. Players in a well-made game will relish not just its appearance but also the feel of exploring and gradually mastering its unreal space. Yet, in what sense is any of this art, or even artistic? Just as every word within a novel has to be written, of course, every single element of any video game has to be crafted from scratch. To talk about the “art” element of games is, I would argue, to talk about the point at which this fantastically intricate undertaking achieves a particular concentration, complexity and resonance.

It’s worth remembering, too, just how young a medium video games are. Commercial games have existed for barely 30 years; the analogy with film, now well over a century old, is an illuminating one. In December 1895, the Lumière brothers, Auguste and Louis, showed the first films of real-life images to a paying audience, in Paris. This, clearly, was a medium, but not yet an art form; and for its first decade, film remained largely a novelty, a technology that astounded viewers with images such as trains rushing into a station, sending early audiences running out of cinemas in terror. It took several decades for film to master its own, unique artistic language: cinematography. It took time, too, for audiences to expect more from it than raw wonder or exhilaration. Yet today you would be hard-pushed to find a single person who does not admire at least one film as a work of art.

If, however, you ask about video games, the chances are that you’ll find plenty of people who don’t play them at all, let alone consider them of any artistic interest. This is hardly surprising: at first glance it can seem that many games remain, in artistic terms, at the level of cinema’s train entering a station – occasions for technological shock and awe, rather than for the more densely refined emotions of art.

Yet the nature of games as a creative medium has changed profoundly in recent years – as I discovered when I spoke to Justin Villiers, an award-winning screenwriter and film-maker who since late 2007 has been plying his trade in the realm of video games. Even a few years ago, he explained, his career move would have been artistically unthinkable. “In the old days, the games industry fed on itself. You’d have designers who were brought up on video games writing games themselves, so they were entirely self-referential; all the characters sounded like refugees from weak Star Trek episodes or Lord of the Rings out-takes. But now there is new blood in the industry – people with backgrounds in cinema and theatre and comic books and television. In the area in which I work, writing and direction, games are just starting to offer genuine catharsis, or to bring about epiphanies; they’re becoming more than simple tools to sublimate our desires or our fight for survival.”

I suggest the film analogy, and wonder what stage of cinema games now correspond to. “It reminds me of the late 1960s and early 1970s, because there were no rules, or, as soon as there were some, someone would come along and break them. Kubrick needed a lens for 2001: a Space Odyssey that didn’t exist, so, together with the director of photography, he invented one.” How does this translate to the world of games? “It’s like that in the industry right now. Around a table you have the creative director, lead animator, game designer, sound designer and me, and we’re all trying to work out how to create a moment in a game or a sequence that has never been done before, ever.”

Villiers is, he admits, an unlikely evangelist: someone who was initially deeply sceptical of games’ claims as art. But it would be wrong, he concedes, simply to assume that the current explosion of talent within the gaming industry will allow it to overtake film or television as a storytelling medium. Today’s best games may be as good as some films in their scripts, performances, art direction and suchlike. But most are still much worse; and in any case, the most cinematic games are already splitting off into a hybrid subgenre that lies outside the mainstream of gaming. If we are to understand the future of games, as both a medium and an art form, we must look to what is unique about them. And that is their interactivity.

To explore this further, I spoke to a game designer who is responsible for some of the most visionary titles to appear in recent years – Jenova Chen. Chen is co-founder of the California-based games studio thatgamecompany, a young firm whose mission, as he explains it, is breathtakingly simple: to produce games that are “beneficial and relevant to adult life; that can touch you as books, films and music can”.

Chen’s latest game, Flower, is the partial fulfilment of these ambitions, a work whose genesis in many ways seems closer to that of a poem or painting than an interactive entertainment. “I grew up in Shanghai,” he explains. “A huge city, one of the world’s biggest and most polluted. Then I came to America and one day I was driving from Los Angeles to San Francisco and I saw endless fields of green grass, and rows and rows of windmill farms. And I was shocked, because up until then I had never seen a scene like this. So I started to think: wouldn’t it be nice for people living in a city to turn a games console into a portal, leading into these endless green fields?”

From this grew a game that is both incredibly simple and utterly compelling. You control a petal from a single flower, and must move it around a shimmering landscape of fields and a gradually approaching city by directing a wind to blow it along, gathering other petals from other flowers as you go. Touch a button on the control pad to make the wind blow harder; let go to soften it; gently shift the controller in the air to change directions. You can, as I did on my first play, simply trace eddies in the air, or gust between tens of thousands of blades of grass. Or you can press further into the world of the game and begin to learn how the landscape of both city and fields is altered by your touch, springing into light and life as you pass.

“We want the player to feel like they are healing,” Chen tells me, “that they are creating life and energy and spreading light and love.” If this sounds hopelessly naive, it is important to remember that the sophistication of a game experience depends not so much on its conceptual complexity as on the intricacy of its execution. In Flower, immense effort has gone into making something that appears simple and beautiful, but that is minutely reactive and adaptable. Here, the sensation of “flow” – of immersion in the task of illumination and exploration – connects to some of those fundamental emotions that are the basis of all enduring art: its ability to enthral and transport its audience, to stir in them a heightened sense of time and place.

Still, an important question remains. What can’t games do? On the one hand, work such as Chen’s points to a huge potential audience for whole new genres of game. On the other hand, there are certain limitations inherent in the very fabric of an interactive medium, perhaps the most important of which is also the most basic: its lack of inevitability. As the tech-savvy critic and author Steven Poole has argued, “great stories depend for their effect on irreversibility – and this is because life, too, is irreversible. The pity and terror that Aristotle says we feel as spectators to a tragedy are clearly dependent on our apprehension of circumstances that cannot be undone.” Games have only a limited, and often incidental, ability to convey such feelings.

Thus, the greatest pleasure of games is immersion: you move, explore and learn, sometimes in the company of thousands of other players. There is nothing inherently mindless about such an interaction; but nor should there be any question of games replacing books or films. Instead – just as the printed word, recorded music and moving images have already done – this interactive art will continue to develop along with its audience. It will, I believe, become one of the central ways in which we seek to understand (and distract, and delight) ourselves in the 21st century. And, for the coming generations – for whom the world before video games will seem as remote a past as one without cinema does to us – the best gift we can bequeath is a muscular and discerning critical engagement.

Tom Chatfield is the arts and books editor of Prospect magazine. His book on the culture of video games, “Gameland”, is forthcoming from Virgin Books (£18.99)

VIDEO GAMES: THE CANON

Pong (1972). The first commercially successful video game. Bounce a square white blob between two white bats. The start of an entire industry.

Pac-Man (1980). A little yellow ball, in a maze, eating dots, being chased by ghosts. The beauty of interactive complexity arising from something simple and slightly crazy – and still fiendishly fun today.

Tetris (1989). This utterly abstract puzzle of falling blocks and vanishing lines was launched on the Nintendo Game Boy and single-handedly guaranteed the hand-held console’s triumph as a global phenomenon. Perhaps the purest logical play experience ever created.

Civilization (1991). View the world from the top down and guide a civilisation from hunter-gathering to landing on the moon. Hours, days and months of utterly absorbing micromanagement.

Doom (1993). Run around a scary maze wielding a selection of big guns while being chased by demons. Then chase your friends. Doom popularised the first-person shooter and defined the genre. For the first time, a computer had made grown men tremble.

Ultima Online (1997). Enter a living, breathing online world with thousands of other players; become a tradesman, buy your own house, chat, make and betray new friends. One of the first massively multiplayer online role-playing games, it is still, for many, the purest and greatest of them all.

The Sims (2000). Simulated daily activities for virtual people; help them and watch them live. For those who think games are all violent and mindless, note that this began the best-selling PC game series in history – more than 100 million copies sold, and counting.

Bejeweled (2001). A simple, pretty puzzle game that changed the games industry because it could be downloaded in minutes on any computer attached to the internet. Digital distribution is the future, and this title first proved it.

Guitar Hero (2005). Live out your dreams of rock deification with friends gathered round to watch you pummel a plastic guitar. A revolution in cross-media: cool, sociable fun, and a licence to print money for its creators.

Wii Sports (2006). Wave your arms around while holding a white controller. Now anyone could play tennis and go bowling with family and friends in the living room. Nintendo delivered another revolution in gaming with this debut title for its Wii console.

This article first appeared in the 04 May 2009 issue of the New Statesman, Flu: Everything you need to know


“I felt so frantic I couldn’t see my screen”: why aren’t we taking mental health sick days?

Some employees with mental health problems fake reasons for taking days off, or struggle in regardless. What should companies be doing differently?

“I would go to the loo and just cry my eyes out. And sometimes colleagues could hear me. Then I would just go back to my desk as if nothing had happened. And, of course, no one would say anything because I would hide it as well as I could.”

How many times have you heard sobbing through a work toilet door – or been the person in the cubicle?

Jaabir Ramlugon is a 31-year-old living in north London. He worked in IT for four years, and began having to take time off for depressive episodes after starting at his company in 2012. He was eventually diagnosed with borderline personality disorder last January.

At first, he would not tell his employers or colleagues why he was taking time off.

“I was at the point where I was in tears going to work on the train, and in tears coming back,” he recalls. “Some days, I just felt such a feeling of dread about going into work that I just physically couldn’t get up ... I wouldn’t mention my mental health; I would just say that my asthma was flaring up initially.”

It wasn’t until Ramlugon was signed off for a couple of months after a suicide attempt that he told his company what he was going through. Before that, a “culture of presenteeism” at his work – and his feeling that he was “bunking off” because there was “nothing physically wrong” – made him reluctant to tell the truth about his condition.


Eventually, he was dismissed by his company via a letter describing him as a “huge burden” and accusing him of “affecting” its business. He was given a dismissal package, but feels an alternative role or working hours – a plan for a gradual return to work – would have been more supportive.

“I already felt pretty low in my self-esteem. The way they treated me definitely amplified that, especially with the language that they used. The letter was quite nasty because it talked about me being a huge burden to the company.”

Ramlugon is not alone. Over three in ten employees say they have experienced mental health problems while in employment, according to the Chartered Institute of Personnel and Development. Fewer than half (43 per cent) disclose their problem to their employer, and only 46 per cent say their organisation supports staff with mental health problems well.

I’ve spoken to a number of employees in different workplaces who have had varying experiences of suffering from mental ill health at work.

***

Taking mental health days off sick hit the headlines after an encouraging message from a CEO to his employee went viral. Madalyn Parker, a web developer, informed her colleagues in an out-of-office message that she would be taking “today and tomorrow to focus on my mental health – hopefully I’ll be back next week refreshed and back to 100 per cent”.

Her boss Ben Congleton’s reply, which was shared tens of thousands of times, personally thanked her – saying it’s “an example to us all” to “cut through the stigma so we can bring our whole selves to work”.

“Thank you for sending emails like this,” he wrote. “Every time you do, I use it as a reminder of the importance of using sick days for mental health – I can’t believe this is not standard practice at all organisations.”


Congleton went on to write an article entitled “It’s 2017 and Mental Health is still an issue in the workplace”, arguing that organisations need to catch up:

“It’s 2017. We are in a knowledge economy. Our jobs require us to execute at peak mental performance. When an athlete is injured they sit on the bench and recover. Let’s get rid of the idea that somehow the brain is different.”

But not all companies are as understanding.

In an investigation published last week, Channel 5 News found that the number of police officers taking sick days for poor mental health has doubled in six years. “When I did disclose that I was unwell, I had some dreadful experiences,” one retired detective constable said in the report. “On one occasion, I was told, ‘When you’re feeling down, just think of your daughters’. My colleagues were brilliant; the force was not.”


One twenty-something who works at a newspaper echoes this frustration at the lack of support from the top. “There is absolutely no mental health provision here,” they tell me. “HR are worse than useless. It all depends on your personal relationships with colleagues.”

“I was friends with my boss so I felt I could tell him,” they add. “I took a day off because of anxiety and explained what it was to my boss afterwards. But that wouldn’t be my blanket approach to it – I don’t think I’d tell my new boss [at the same company], for instance. I have definitely been to work feeling awful because if I didn’t, it wouldn’t get done.”

Presenteeism is a rising problem in the UK. Last year, British workers took an average of 4.3 days off work due to illness – the lowest number since records began. I hear from many interviewees that they feel guilty taking a day off for a physical illness, which makes it much harder to take a mental health day off.

“I felt a definite pressure to be always keen as a young high-flyer and there were a lot of big personalities and a lot of bitchiness about colleagues,” one woman in her twenties who works in media tells me. “We were only a small team and my colleague was always being reprimanded for being workshy and late, so I didn’t want to drag the side down.”

Diagnosed with borderline personality disorder, which was then changed to anxiety and depression, she didn’t tell her work about her illness. “Sometimes I struggled to go to work when I was really sick. And my performance was fine. I remember constantly sitting there sort of eyeballing everyone in mild amusement that I was hiding in plain sight. This was, at the time, vaguely funny for me. Not much else was.

“One day I just felt so frantic I couldn’t see my screen so I locked myself in the bathroom for a bit then went home, telling everyone I had a stomach bug so had to miss half the day,” she tells me. “I didn’t go in the next day either and concocted some elaborate story when I came back.”

Although she has had treatment and moved jobs successfully since, she has never told her work the real reason for her time off.


“We want employers to treat physical and mental health problems as equally valid reasons for time off sick,” says Emma Mamo, head of workplace wellbeing at the mental health charity Mind. “Staff who need to take time off work because of stress and depression should be treated the same as those who take days off for physical health problems, such as back or neck pain.”

She says that categorising a day off as a “mental health sick day” is unhelpful, because it could “undermine the severity and impact a mental health problem can have on someone’s day-to-day activities, and creates an artificial separation between mental and physical health.”

Instead, employers should take advice from charities like Mind on how to make the mental health of their employees an organisational priority. They can offer workplace initiatives like Employee Assistance Programmes (which help staff with personal and work-related problems affecting their wellbeing), flexible working hours, and clear and supportive line management.

“I returned to work gradually, under the guidance of my head of department, doctors and HR,” one journalist from Hertfordshire, who had to take three months off for her second anorexia inpatient admission, tells me. “I was immensely lucky in that my line manager, head of department and HR department were extremely understanding and told me to take as much time as I needed.”


“They knew that mental health – along with my anorexia I had severe depression – was the real reason I was off work ... I felt that my workplace handled my case in an exemplary manner. It was organised and professional and I wasn’t made to feel embarrassed or ashamed from them – such feelings came from myself.”

But she still at times felt “flaky”, “pathetic” and “inefficient”, despite her organisation’s good attitude. Indeed, many I speak to say general attitudes have to change in order for people to feel comfortable about disclosing conditions to even the closest friends and family, let alone a boss.

“There are levels of pride,” says one man in his thirties who hid his addiction while at work. “You know you’re a mess, but society dictates you should be functioning.” He says this makes it hard to have “the mental courage” to broach this with your employer. “Especially in a small company – you don’t have a confidential person to turn to. Everyone knows everyone.”

“But you can’t expect companies to deal with it properly when it’s dealt with so poorly in society as it is,” he adds. “It’s massively stigmatised, so of course it’s going to be within companies as well. I think there has to be a lot more done generally to make it not seem like it’s such a big personal failing to become mentally ill. Companies need direction; it’s not an easy thing to deal with.”

Until we live in a society where taking a day off for feeling mentally unwell feels as natural as taking one for the flu, companies will have to step up. It is, after all, in their interest to have their staff performing well. When around one in four people in Britain experience mental ill health each year, it’s not a problem they can afford to ignore.

If your manager doesn’t create the space for you to be able to talk about wellbeing, it can be more difficult to start this dialogue. It depends on the relationship you have with your manager, but if you have a good relationship and trust them, then you could meet them one-to-one to discuss what’s going on.

Having someone from HR present will make the meeting more formal, and normally wouldn’t be necessary in the first instance. But if you didn’t get anywhere with the first meeting then it might be a sensible next step.

If you still feel as though you’re not getting the support you need, contact Acas or Mind's legal line on 0300 466 6463.

Anoosh Chakelian is senior writer at the New Statesman.
