Damon Albarn’s band Blur and their fans felt London belonged to them. Photo: Rex

Britpop: an insider’s tale of music’s last great gold rush

Twenty years ago, it felt like John Niven and his fellow indie kids had won pop’s cold war. But then the madness set in.

It’s such an awful term, isn’t it? A genuinely dreary expression – Britpop. So bovine and literal, containing none of the wit or musicality of “punk rock” or “acid house”. Let’s face it, even “skiffle” – with all its onomatopoeic bounce and shuffle – was a better word to describe a genre than Britpop. Still, we’d best call it something if we’re to remain on the same page.

Exactly twenty years ago this month, in the spring of 1994, I moved from Scotland to London, renting a room from my friend John Kellett in a Georgian maisonette in Notting Hill Gate. John was the head of legal and business affairs at Go! Discs, which was enjoying huge success with Paul Weller and the Beautiful South and was getting ready to release the first Portishead album. I was moving from working at a tiny independent label in Glasgow to my first major job, at London Records, then part of the PolyGram group. Go! Discs was based in Chiswick, west London. We were in nearby Hammersmith. Most mornings that summer, John and I would race each other to work in our company cars, speeding along the Westway.

I wasn’t the only indie kid graduating up from the bush leagues that year. In the weeks and months following my move south, Blur released Parklife and Oasis put out Definitely Maybe: the two records that heralded the Imperial Phase of what would come to be known as Britpop, a movement that had been birthed a year earlier – albeit in a crude, forced, C-section kind of way – by a Select magazine cover featuring the Auteurs, Pulp, Suede, Denim and Saint Etienne. (Note to readers much under 30: Select was a kind of Q or Mojo for rave and indie kids whose existence exactly spanned the Nineties.)

Bliss was it in that dawn to be alive, but to be young, overpaid and living in London was very – well, heaven might be stretching it, but you certainly felt glad you weren’t in the Shetland Isles, or out in Hackensack, New Jersey.

Indie London of the Eighties had been a grim old place, a sad wasteland where you stared through your fringe at the June Brides or the Shop Assistants as they played in a brightly lit room above a pub, the carpet crunching beneath you as you frugged shambolically under the powerful spell of three Hofmeisters. In our world in 1988, to see a band like Primal Scream filling the big hall at Ulu (capacity: 700) was like seeing the Stones at Madison Square Garden in 1975. A few short years later this kind of gig would be a warm-up show . . .

By all means go ahead and cock your snook in the cold light of 2014, but it’s hard to overstate how exciting the early Oasis shows were, or the thrill of hearing Blur’s “For Tomorrow” in a speeding car on the Westway. Of hearing records you loved coming out of radios in offices and factories all over the country, rather than from the stereo in a sordid bedroom containing you and five of your mates. Suddenly the bands you liked were in the charts and you and your friends were working at major labels, and it felt like we had won the indie cold war of the Eighties. Suddenly you were in the VIP box at Maine Road, lurid with drugs and icy champagne. Suddenly watching Death by Milkfloat at the Camden Falcon felt a long, long way away as the capital came alive for us.

The street names I learned for the first time during that hot summer of 1994 are as sweet to me today as a litany: Westbourne Park Road, Ladbroke Grove, Camden Parkway and Old Compton Street. Of course, we were just doing what generation after generation before us had done – finding our feet in London and deciding it belonged to us and no one else. We painted it in our own colours: the gold of dawn, the chalky white of Ecstasy and cocaine and the bold red of New Labour.

We were in from the cold. And very soon we created an environment where Cast could have a double platinum debut album, where Blur and Oasis were discussed on the national news, where Leon from Northern Uproar could talk openly of buying a casino, and yet still aliens did not come and destroy our planet.

As you get older, you realise that every generation has its moment where impotence becomes prepotency. Where it gets its shot in office. The hippies of the Sixties swapped tie-and-dye and four-skin joints for velvet suits and gold coke spoons and ran CBS and Warner Brothers in the Seventies. The punk rockers of the Seventies wore Yohji Yamamoto suits and turned rebellion into money as they presided over the cold stream of synthetic pop music that we indie kids waged war against in the Eighties. And in our turn, in the Nineties, we untucked our Ralph Lauren shirts and talked about “having it” and “larging it” and we thought Audioweb not altogether a bad thing, and we dumbed it down and watched the cash pour in.

It was to be the last great gold rush of the music industry, when having a decent hit meant you were selling over a million albums at 13 quid a pop. As opposed to today, when you’re celebrating doing 100,000 at £7 per unit. We were selling ten times the volume at twice the price. It did not lead to reasonable behaviour or sane decisions. And, again like every generation before us, we eventually came to realise that our moment of dominance was hollow and riven with compromise. Cocaine destroyed you. We went to war in Iraq. Cast broke up. And, as John Harris sagely noted in his superlative study of the period, The Last Party, Leon from Northern Uproar did not get that casino.

As the decade drew to a close it all changed. Noel went into the kitchen at Supernova Heights one morning in 1998 to start the day with a lager and a chunky line of bugle and thought, “What the fuck am I doing?” In four short years we went from “you might as well do the white line” to Jarvis desolately singing “bye-bye” at the end of This Is Hardcore.

Britpop. Look upon its works, ye mighty, and, what? Sigh? Laugh? Shrug? Do not judge us too harshly. Like Francis Ford Coppola making Apocalypse Now – if you can picture Coppola snapping his fingers Manc-style in an untucked Ralph Lauren shirt and crocodile-effect Patrick Cox loafers – we were young, we had too much money and we had access to too much “equipment”.

And, little by little, we went insane.

John Niven is the author of “Kill Your Friends”, “The Amateurs” and “Second Coming” (all published by Vintage)


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit: monbiot.com/music/
