Turning point

The announcement of the 2008 Turner Prize shortlist has prompted the usual carping. But let’s not forget how much the prize, and Channel 4’s coverage of it, has changed British attitudes to modern art.

Do you remember how people used to hate modern art? I do. Because it wasn't very long ago. Actually, I can be more precise than that. People hated modern art until about 1991. Which was also the year that Channel 4 began broadcasting the Turner Prize. I know. I was there.

Last November, as the corks popped on Horseferry Road and Channel 4 celebrated 25 years of broadcasting, I had something to celebrate as well. Last year was the 30th anniversary of my becoming an art critic. My first review appeared in the Guardian on 1 April 1977. April Fool's Day. I mention it here not because I, too, want to have my back slapped - parties cloud your judgement - but because those 30 years of incessant art criticism qualify me perfectly to write about the impact of Channel 4 and the Turner Prize. I was around before either of them. I know what effect both of them have had. I remember vividly the situation before the two of them got together.

These days, of course, it's all so different. Not only do we take modern art in our stride, but we appear to have developed an unquenchable thirst for it. Queues of excited kidults wind their way around Tate Modern waiting for a go on the slides. Newspaper headlines blare out virtually every day how much this hedge-fundist has paid for that Damien Hirst. It's a favourite national topic. Yes, the odd grumpelstiltskin from Somerset can still be heard at Turner Prize time posing that tedious annual question: Is it art? But no one takes that kind of complaint too seriously any more. It's part of the theatre of the Turner Prize. It's not serious. It's not vicious. It's not like it used to be.

In the old days, I kept a box in which I collected all the rude letters I received referring me to the story of the emperor's new clothes. I called it my Emperor Box. It's one of Hans Christian Andersen's most quoted offerings. A king gets conned into believing that he's wearing a beautiful suit of clothing when, actually, there's nothing there - he's naked. But the king believes the conmen because he doesn't want to appear a fool. The same goes for his queen, his court and everyone else in the land. Everyone except for a little boy, who comes along to the procession, sees immediately that the king is bollock-naked, and begins shouting it out.

So many readers of my articles in support of modern art felt the need to remind me of this story that my Emperor Box quickly overflowed, and I ended up chucking it away in about 1985. Had I kept up the collection, there would now be no room in my house for me. What amused me most about this correspondence was the way that everyone who wrote seemed to believe that only they were clever or truthful enough to make the comparison between Andersen's story and the modern art con. Their hero was the little boy, with whom they identified fiercely. And whenever they encountered modern art that they did not like, or did not understand, they began frantically casting themselves in his role and insisting that all the other inhabitants of the land were being fooled.

When the rude letters first started arriving, I used to write back dutifully to their senders, pointing out a crucial flaw in their thinking. In Andersen's fairy tale, the people who believed that the king was clothed made up the majority, and the little boy was the exception. In the case of modern art, it was the other way around. In England, in 1985, the vast majority of people seemed convinced that modern art was a con. The newspapers agreed and kept up an incessant attack on any and all new art. Remember the Tate Gallery bricks? Critics like me, who were trying to write supportively about it, and who didn't believe that anyone was trying to con anyone else, were branded fools and charlatans, too, and subjected to a nasty barrage of mockery.

As I look back now on those days, it is hard to believe how much has changed. Today, Tate Modern is nothing less than the most visited modern-art museum in the world. People love going there. And the Turner Prize exhibition is usually the best-attended show of the year at Tate Britain. What has been forgotten is how much smaller and more local an event it used to be before Channel 4 took it on.

Frankly, the early history of the Turner Prize is embarrassing. In the first few years after its inception in 1984, even I could have won, because it was open to anyone involved in art - critics, curators, museum directors, the lot. You didn't even need to be an artist. No one was sure what the rules were. Or who was eligible. And although a few pictures were sometimes displayed in the rotunda of the Tate, there was no proper exhibition of shortlisted artists for visitors to see or judge. The winners just seemed to emerge in that mysterious British way you also find with knighthoods, or membership applications to the Garrick.

All this changed when Channel 4 got involved. I was at the channel at the time, and vividly remember the debate with the Tate over what the prize should be. Clearly it had to be a prize for artists, but what sort of artist, and how many of them? In previous years, there had been shortlists of five, six, seven and even eight. It was Channel 4 which insisted that the shortlist be kept at the manageable number of four. And set a younger age limit so that the prize could become an encouragement for artists in the first half of their career, rather than a good-service gong slipped to them just before the end.

The other big change was the exhibition. The Tate, which had struggled hopelessly for so many years to attract audiences to its displays of modern art, was reluctant to give over any space to an annual display of shortlisted artists. It was afraid no one would come and that the galleries would remain empty. That was what it was used to. These days, the Turner Prize show can be relied upon to pull in up to 100,000 punters. Back in 1991, when Channel 4 first got involved, if you put up a sign outside a gallery saying "Modern Art Inside", everyone would have gone the other way.

As it happens, the first year of Channel 4's involvement was worryingly quiet. Having been reinvented from scratch, the prize was finding its way. So quiet was the reaction that I remember getting called in to a meeting with the director of programmes at Channel 4, John Willis, and being told that we should drop it and sponsor something at the British Film Institute instead. I disagreed, and was granted another go. Then came 1992. Everything changed. Damien Hirst was invited on to the shortlist for the first time, and through some potent chemical reaction caused by the fusion of his pushy personality with the rightness of the moment, everyone suddenly noticed what was going on at the Tate. From a story that was buried somewhere after the obituaries in the newspapers, the Turner Prize turned abruptly into a front-pager.

The following year - when Rachel Whiteread's sad and iconic concrete cast of a Victorian house in the East End was included, and won - was even crazier. One moment no one was interested. The next moment the whole world seemed to be. Looking back now on this extraordinary sea change in mood and pace, I can see, of course, that it wasn't the Turner Prize alone, or Channel 4's coverage of it, that was responsible. Various forces were at work here. A rare generation of talented artists, the YBAs, had emerged in unison, producing work that appeared to capture a new national optimism. And the revamped Turner Prize, with its younger rules, became a brilliant shop window for them.

After all those years of Margaret Thatcher and the regressive Britishness that she embodied, the country was sick of grumpelstiltskin-thinking. A change in aesthetics was as desirable as a new prime minister. All those designer lofts that began springing up in Docklands didn't need frilly paintings in frilly gold frames to decorate them, either. They needed art that was fresh, modern and of its time. Basically, Britain had finally learned to accept modernity. It had taken a century, but, finally, it had happened.

Without a pixel of doubt, it was the biggest cultural turnaround of my critical life. And although you can argue for ever about the exact ratio of responsibility for this change that should be allotted to the Turner Prize and to Channel 4, what is unarguable is that both of them were involved in it, up to their necks.

A version of this essay appears in "25 x 4: Channel 4 at 25", edited by Rosie Boycott and Meredith Etherington-Smith, published by Cultureshock Media (£25). Info: http://www.cultureshockmedia.co.uk


For the first time in ten years, women outnumber men on the Turner Prize shortlist. Bangladeshi-born Runa Islam is now based in London and works mainly in film. Her 2004 work Be the First to See What You See As You See It follows a woman wandering through a gallery filled with fine china, as she gently starts to tip the pieces surrounding her to the floor. Islam’s influences include Ingmar Bergman.

Goshka Macuga is a “cultural archaeologist”, and produces sculptural arrangements that often include work by other artists.

The Belfast-born, Glasgow-based sculptor Cathy Wilkes explores issues of femininity and sexuality. She often uses mannequins, as in Non-Verbal, which was exhibited at the 2005 Scotland and Venice Biennale.

Mark Leckey, the only male nominee, is the favourite to win. His exhibition “Industrial Light and Magic” combined disparate media and featured pop-cultural icons such as Felix the Cat. Ladbrokes has put his odds at 5/6.

Natasha Periyan

This article first appeared in the 26 May 2008 issue of the New Statesman, Moral crisis?


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.



My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit: monbiot.com/music/

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood