Andrew Marr presenting "History of the World". Photograph: BBC

Popular history has been conquered by a complacent liberalism

Television history, in particular, has changed – and not always for the better.

At the end of September, the BBC screened the first part of its eight-part History of the World, written and presented by Andrew Marr. Within days of the show’s first episode, another world historian, Eric Hobsbawm, died at the age of 94. Although the proximity of the two events was coincidental, it did seem as if the baton was being passed from one public historian, keen to paint the “big picture” and with a taste for the grand sweep, to another.

However, a closer comparison of the two men reveals how far popular history has changed – and not always for the better. For Marr’s series shows the extent to which the struggle to interpret our history has been won by a complacent liberalism. And victory has been rather easy, as many historians have simply refused to join the fight.

This is a triumph with serious consequences – especially for anyone trying to make sense of the aftermath of the 2008 financial crisis. Our view of history shapes our attitude towards contemporary politics. Consciously or not, we are perpetually judging the present by the measure of the past. Once a particular “history” that supports the status quo becomes all-powerful, it is very difficult to make alternative political and economic solutions seem either plausible or necessary.

Hobsbawm and his generation of intellectuals were keenly aware of how damaging liberal complacency could be. In the 1920s, the postwar victors were convinced that they could impose the pre-1914 laissez-faire order without fully incorporating the working class – as if the First World War and the Russian Revolution had never happened. Hobsbawm, a teenager in 1930s Berlin, witnessed for himself the disastrous consequences of such conventional thinking – soaring unemployment, social breakdown and the rise of Nazism.

Yet after the Second World War the tables were turned. Now the western elite accepted that states had to control markets and improve workers’ lives. This analysis lay behind the grand social-democratic projects of the era – welfare states, the Bretton Woods system and the Marshall Plan. It also underpinned the “Marxist” history that dominated the postwar era: socio-economic forces were central; and educated experts, who understood those forces, could use reason and science to improve society and help the working classes.

However, just as Marxist-influenced history reached its high tide in the 1960s and 1970s, serious signs of decay had set in. For the convulsions of 1968 triggered a powerful – and in many ways justified – critique that left it looking deeply old-fashioned. The main target of the ’68-ers was the technocrat-worker alliance that characterised the era of social democracy. Scientists, the new radicals argued, far from being progressive, were apparatchiks of the military-industrial complex. As for the supposedly heroic working class, they had sold their souls to consumerism. History’s real heroes were the victims of cultural discrimination – whether ethnic minorities, colonised peoples, women or homosexuals.

This political sea change required an entirely new view of the past. With cultural identity now central, historians scrutinised subjective perceptions, not objective economic conditions. In the vanguard were Ranajit Guha and the “subaltern” historical school, who sought to rescue the cultures of the Indian poor for posterity; meanwhile, gender historians surveyed the many ways in which patriarchy shaped everything from progressive politics to the micro-power relations of everyday life.

This cultural turn was accompanied by an attack on notions of historical “progress”. Marxists were now lumped together with liberal “Whig” historians, such as Macaulay, as the false sirens of an Enlightenment that claimed “rationality” would make society more fair and free, when it often did the exact opposite. Newly fashionable “postmodernist” thinkers now saw progress as the highway to the Gulag and the death camp.

Postmodernists insisted that historians themselves, with their simple-minded “grand narratives” – whether “Whig” stories of progress towards liberal democracy and capitalism, or Marxist fables of progress towards communism – were contributing to this oppressive way of thinking. Historians had to avoid all grand theorising and concentrate on the marginalised and the powerless.

One casualty of this approach was economic history. Once the aristocracy of the profession, economic historians were now regarded as servants of hegemony. As free markets swept across the globe after 1989, there were even stronger reasons for historians – like most humanities academics, generally people of the left – to concentrate on the cultural sphere.

Hobsbawm represented everything the postmodernists hated: the mandarin surveying the world from Olympian heights, uninterested in everyday life. Surely this was the attitude that led to grandiose projects of social engineering that caused the death of millions (not least in the Soviet Union, of which Hobsbawm was an unrepentant supporter)?

In somewhat diluted form, postmodernist ideas have reshaped both academic and popular history, and with many positive effects. In their insistence on the value of the experiences of ordinary people, such histories fit with a more democratic age. The recent BBC series presented by Pamela Cox, Servants: The True Story of Life Below Stairs, is an excellent example – giving us the real voices of the domestic servant class and driving a coach and horses through the Tory romanticism of Downton Abbey. These histories have also had a broader cultural effect, contributing to a growing intolerance of the abuse of power in ordinary life.

Necessary and important as these gains have been, the rejection of the most influential grand narratives has brought serious losses. In their abandonment of the big picture in favour of the fragment, academic historians have ceded the political high ground. And this crucial strategic space has been occupied by popular historians from the liberal centre and the right, such as Niall Ferguson and Andrew Roberts.

Until the 1980s, it was the right that was most suspicious of grand narratives, whether Marxist or Whig. For them, history was one damn thing after another, a long series of accidents and of great men making decisions; or, for a more “new-age” right, of randomly significant butterflies fluttering their wings in some corner of the globe or other.

Yet after the fall of the Berlin Wall in 1989, it suddenly seemed that history was going their way; the right embraced grand narratives with relish. In his history of the cold war, The Atlantic and its Enemies (2010), the Thatcherite Norman Stone shamefacedly admitted that he had ended up writing a Whig history of progress, even though he had spent much of his youth condemning the very idea.

And it is various forms of Whiggery that dominate our history today – whether propagated by those on the centre-right, such as Stone, or on the centre-left, such as Marr. Much of this is ideological rather than economic Whiggery: history is seen as a battle between liberalism and totalitarianism, and liberalism has won. The political theorist Francis Fukuyama put the argument most eloquently in his 1989 essay “The End of History?”. But it also underpinned some of the most popular histories of the early 21st century, including Simon Schama’s A History of Britain (2000-02).

We also see it in the widespread notion that Nazism and Stalinism were essentially the same – violently utopian ideologies created by dangerous anti-liberal intellectuals. We are still battling with their heirs, we are told, but the struggle will inevitably be won. This, in essence, was the history that George W Bush used to justify the invasion of Iraq.

Economic Whiggery is equally influential, though it is more frequently peddled by science writers and economists than by historians. Steven Pinker’s The Better Angels of Our Nature (2011) tells an optimistic story of the defeat of warrior values by a peaceable liberalism. More forthright is Matt Ridley’s The Rational Optimist (2010). Yoking Darwinism to free market economics, it casts merchants as the main agents of progress in human history.

Ridley’s analysis is a cruder version of a fashionable “big history” that combines the insights of evolutionary science with praise for economic globalisation (though it tends to be much more pessimistic about its environmental consequences than Ridley is). In his History of the World, Marr has whipped up all these trends into a tasty dish, mixing evolutionary history with Whiggish enthusiasms about global trade and warnings about “utopian” ideologies, though marinated in a conservative pessimism about human nature and political improvement.

Marr’s series is expertly crafted and stimulating; but it rests on an unexamined assumption common to much popular history today: that, as Margaret Thatcher once put it, “there is no such thing as society”. Marr gives us evolutionary imperatives, economic forces, ideologies and great (or evil) men. But the social groups Hobsbawm saw as central to history are in the background.

The Marxists certainly saw these social groups in crude terms – the postmodernists rightly argued that history was not just driven by economic “classes” but by a range of different groups, in part founded on culture and identity. Even so, occupation is enormously important in creating those identities. To ignore that is to deprive ourselves of a powerful tool for understanding.

The violence of the 20th century was not primarily caused by the pursuit of illiberal utopias, nor by evil dictators. It was largely caused by struggles between groups – social, national and ethnic – over questions of hierarchy and equality. These were the battles that brought Hobsbawm and his fellow Berliners on to the streets.

We saw the consequences of this analytical failure in the incomprehension of commentators and policy-makers when confronted with the turmoil of the Arab spring. Having seen the Middle East as the site of a struggle between liberals and “totalitarian” Islamists, they were bewildered as conflict exploded between competing social and ethno-religious groups – poor Islamists, their more business-orientated co-religionists, leftist workers, cosmopolitan liberals, Shias, Sunnis and Christians.

It is no surprise that the left is not flourishing in this intellectual environment. For the left is primarily concerned with equality. And if social hierarchies and the struggles of social and ethnic groups to flatten or bolster them are airbrushed from the historical record, the left’s agenda appears wholly irrelevant.

But even more serious, perhaps, is the effect of Whiggish ideas of gradual progress on our understanding of the financial crisis. We are so used to thinking of history as a process of gradual improvement that we find it difficult to remember how suddenly world orders break down – as they did in 1918, the 1930s or the 1970s – and how radically our ideas have to change in response. Whig gradualism simply cannot prepare us for the very serious challenges ahead.

The Danish philosopher Søren Kierkegaard wrote that “life must be lived forward, but it can only be understood backward”. The British are right to value their historians, and the BBC should be investing in grand histories. Yet they have to choose the right ones. For bad history may be worse than no history at all.

David Priestland is the author of “Merchant, Soldier, Sage: a New History of Power” (Allen Lane, £20)

This article first appeared in the 05 November 2012 issue of the New Statesman, What if Romney wins?


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have neither fixed workplaces nor regular colleagues and hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood