
The lost herd

When Gordon Brown became Prime Minister in 2007, he made great play of appointing figures from outside politics to ministerial office. Two years on, his "government of all the talents" is losing them.

On 11 May 2007, in a speech at the Imagination Gallery in the West End of London during which he announced his candidacy for the leadership of the Labour Party, Gordon Brown promised a "new politics" of openness, reform and change. He pledged to govern "in a different way", with a fresh style and new personnel. "I will reach out to put national interest before sectional interest," he said, "and I will form a government of all the talents, bringing people together to listen, to learn and solve problems, building on a broad sense of national purpose."

Within 48 hours of entering Downing Street as Prime Minister, on 27 June, Brown announced that the former United Nations deputy secretary general Mark Malloch Brown, the former first sea lord Admiral Sir Alan West, the former secretary general of the Confederation of British Industry Sir Digby Jones and Ara Darzi, one of the country's leading surgeons, would be ennobled and made ministers in government. Over the past two years, other non-politicians have joined Brown's ministerial ranks, including his former chief of staff and ex-head of the television regulator Ofcom, Stephen Carter, and the former City fund manager and multimillionaire Paul Myners.

Today, the Prime Minister's big tent is slowly being folded away, its frame dismantled, as one after another of the chief recruits to his "government of all the talents", called "goats" by Whitehall insiders, slips the ministerial tethers to graze in pastures new. Of the original quartet, only Lord West remains in office.

Should we be surprised? The Prime Minister is by reputation both a party-political tribalist and a keen centraliser of power - his former permanent secretary Andrew Turnbull described him as "Stalinist" and his former cabinet colleague Charles Clarke called him a "control freak". He always seemed an unlikely goatherd. Here was an opportunity for him to show the country his pluralist intentions and bipartisan credentials.

Tony Blair had been a strong advocate of big-tent politics: think of the late Roy Jenkins's report on proportional representation and Chris Patten's commission on policing in Ulster. Brown went beyond Blair, who deployed the great and the good from across the political spectrum only to advise, review and report, by bringing political outsiders directly into government.

Goats, however, are notoriously stubborn creatures, unpredictable and difficult to control. Malloch Brown became Lord Malloch-Brown of St Leonard's Forest in the County of West Sussex and was appointed minister of state for Africa, Asia and the UN at the Foreign Office. Within a fortnight of taking office, he had announced, much to the annoyance of Washington, that Brown and George W Bush would not be "joined at the hip" in the manner of Bush and Blair, a remark that seemed to suggest the end of the "special relationship".

When Malloch Brown resigned this month for "personal and family reasons", he said he remained "completely loyal to the Prime Minister". Yet reports since have suggested that the former international diplomat could no longer tolerate working in chaotic Whitehall, and had told colleagues that he had been party to better "strategic thinking" in Latin America and south-east Asia than in Downing Street. In a farewell salvo on Wednesday, Lord Malloch-Brown became the first senior minister to admit that British troops need more helicopters in Afghanistan - contradicting the Prime Minister and the Foreign Secretary - and he conceded that Brown's future looked "bleak". So much for loyalty.

His resignation was followed on 14 July by that of the Iraqi-born Ara Darzi - who, as Lord Darzi of Denham, was appointed by Brown as under-secretary of state at the Department for Health. Known as Robo-Doc for his pioneering work in the advancement of minimally invasive surgery and his use of surgical robots, Darzi fuelled speculation about an early election in October 2007 by publishing an unexpected interim report on his plans for NHS reform. He also angered campaigners, and Labour backbenchers, in a speech to the Lords in January 2008, by abandoning Labour's historic commitment to eliminate mixed-sex wards from NHS hospitals.

Darzi said he was resigning to focus on his medical work and academic research, but one has to ask: is this the time for a health minister to quit, as the Department of Health grapples with a swine flu epidemic? He leaves the government having failed to see through the "once-in-a-generation" reforms he announced the government would be making to the NHS. Perhaps his only memorable contribution to political life is the time he leapt across the red benches in the Lords to save the life of a fellow Labour peer, Lord Brennan, who had collapsed after a heart attack.

Arguably the most controversial resignation - and appointment - among the goats was that of Digby Jones. The corpulent, conservative recent head of the CBI took the title Digby, Lord Jones of Birmingham, and became minister for UK trade and investment in the (then) Department for Business, Enterprise and Regulatory Reform. He quit the government after just 18 months in the post following a series of disagreements with Brown over spending and taxation, rows with civil servants, and a stream of gaffes - including some embarrassing remarks at a forum of Middle Eastern entrepreneurs. "We don't care what colour you are," he said. "We don't care if we can't pronounce your names and we don't care where your money comes from. We just want you to invest in our country." Jones then said: "I'm a goat, not a professional politician."

Since leaving government, Jones has spent his time criticising both Brown and civil servants, telling a Commons select committee in January this year that the job of junior minister was "one of the most dehumanising and depersonalising experiences a human being can have".

So who is left? The sole remaining goat from the original herd is the former first sea lord, Admiral Sir Alan West, who became Lord West of Spithead and was appointed under-secretary of state for security and counterterrorism at the Home Office by the Prime Minister in June 2007. Home Office press officers have since described him as "gaffe-prone", a "liability" and a "nightmare to manage". In November 2007, he questioned the government's plans to hold terror suspects for up to 42 days without charge, stating in a live BBC radio interview that he was not "totally convinced" of the case for change - only to perform a U-turn less than two hours later, after a hurried meeting with Brown.

His explanation: "Being a simple sailor, not a politician, maybe I didn't choose my words well." (The PM's spokesman issued his own memorable clarification: "I think he thought it was necessary to make sure his position was properly understood. I'm not sure he has changed his mind. Lord West made his position quite clear. Lord West gave his views quite clearly in his second statement.")

West is known for his bravery. In 1982, as the 34-year-old officer in command of the frigate HMS Ardent when it was sunk by Argentinian bombers during the Falklands conflict, he was the last to leave the sinking ship. His action earned him the Distinguished Service Cross. Nearly three decades on, the "simple sailor" remains the last man standing on the sinking ship of government. One source close to West says he has no plans to quit and that he is committed to his Home Office role - but adds "for the foreseeable future".

Brown's aides are curiously unwilling to lay any blows on the fleeing goats. One Downing Street aide told me each of them had "enriched government" and that their contributions to public life "remain a genuinely positive story". What about Digby Jones? "Digby is Digby," I was told. "We knew he would be outspoken from the moment he was appointed."

But is this a genuinely positive story? One could argue that it was foolhardy to tread down this path in the first place. Political outsiders are, almost by definition, either ignorant of political rules, regulations, conventions and customs, or unwilling to conform to them. This was an accident waiting to happen.

Then there is the issue of ideology. As James Purnell (who resigned from the cabinet in June) has been busy pointing out, ideas matter, and constructing big tents in politics, welcoming as they may be, risks losing sight of this. New Labour was built on the assumption that modern politics is no longer ideological, substantive or divisive, that what matters is what works, and that there are bureaucratic, technical and pragmatic fixes to every political problem. This has proved to be a fiction. Bringing in outsiders to add expertise and experience to government is not new: Clement Attlee succeeded with the trade union leader Ernest Bevin, and Margaret Thatcher with the businessman David Young. Brown's mistake was to pretend that he could defy the laws of politics by appointing people who neither owed him party loyalty nor necessarily shared his political values. Jones, for example, is said to have once discussed becoming a Conservative MP with the then Tory leader, Michael Howard. As head of the CBI, he had long opposed a range of Labour economic and social policies, chief among them the minimum wage. Why make him a Labour minister?

But, above all else, this is a story of a government of all the talents that could not keep those talents for long. On the one hand, we had a prime minister who thought he wanted independent goats in his administration but really needed loyal sheep; on the other hand, we had non-politicians who thought they could adapt to politics simply by virtue of their experience or expertise.

The shortsightedness identified by Lord Malloch-Brown and the bureaucracy singled out by Lord Jones are now hallmarks of modern British governance. The end result is a group of outsiders who have returned to the outside world, disillusioned, disappointed and depressed. That Lord Myners has announced he is leaving the Treasury to become a student of theology speaks volumes about life as a minister today. Whether we like it or not, politics will continue to be dominated by professionals.

Mehdi Hasan is senior editor (politics) of the New Statesman


This article first appeared in the 27 July 2009 issue of the New Statesman, On tour with the far right


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.



My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood