
The paradox of fairness

Is the world a better place if the vicious suffer for their viciousness? And what exactly are just deserts?

For as far back as I can remember language, and uttered the very last time I saw her, one of my mother’s most repeated sentences was: “Every dog has its day.” She said it aloud to herself and to the knowing, listening universe, though, when I was in the room, her eyes might be pointing in my direction. It was an incantation, voiced in a low growl. There was something of a spell about it, but it was mainly an assertion of a fundamental and reassuring truth, a statement to vibrate and stand in the air against whatever injustice she had just suffered or remembered suffering. It was, I understood, a reiterated form of self-comfort to announce that justice, while taking its time, was inevitably to come; perhaps, too, a bit of a nudge for the lackadaisical force responsible for giving every dog its day.

Hers was an other-worldly view, of justice meted out from beyond the human sphere, held in this case by an uneducated non-observant Jewish woman with parents from the shtetl, but it is a foundational promise made by all three religions of the Book, and surely their most effective selling point. My mother’s recitation of her truth belonged with another, more impassioned phrase, which I recall her saying only when sitting in a chair, rocking back and forth, or in bed, rolling her head from side to side. “God! God! What have I done to deserve this?” Generally, unlike the harsh confidence of the first phrase, it was wept, sometimes screamed, mumbled madly, wailed, moaned, and usually repeated over and over again, whereas “Every dog has its day” needed saying only once, whenever the situation merited it. Both phrases were occasioned by my repeatedly philandering, disappearing, money-withholding conman father, and each marked opposite ends of the continuum of disappointment on which my mother lived.

I learned from this, in the first place, obviously, to sneak away so that I wouldn’t get dragged into the conversation and end up (perhaps not unjustly) as a substitute accusee for my father’s failure to care. But I learned also that she had certain expectations of the world: that the world properly consisted of a normality, and that the world had peculiarly failed her in respect of it.

From a very early age I already knew about the norms of the world, what it was supposed to be like, from nursery rhymes, fairy tales, books, films, television and radio. I knew that the most basic of all the norms was that fairness was to be expected. I doubt that I needed to be taught that; it was inward to me, never unknown, and I would guess that I knew it in some way even before I got the hang of language. I would also guess that it was the same for you.

I suppose what I importantly learned from my cursing and keening mother was that grown-ups still knew it, too. That fairness was not just one of those always suspect childish expectations – like money in return for a tooth, or a man coming down the chimney – that one grew out of.

At the same time, I learned from her that fairness was not an infallible fact of the world and that the most apparently fundamental essentials failed, yet the idea I got from my mother about this was that she (and sometimes I was included) was the only person on the planet whom the arranger of fairness had let down. All other husbands and fathers were true and trustworthy, everyone else had enough money to pay the rent and buy food, everyone else had relatives or friends who rallied round, so my mother often explicitly said. Everyone except her (and me, as the appendage inside her circle of misfortune).

It did rather astonish me that we should be so unfortunate to have been singled out, but I was also impressed that we should have received such special treatment from the universe. The stories and nursery rhymes had told me that bad things were supposed to happen to people who had done something to merit it. But my mother had no doubt she had done nothing, had been a helpless victim, yet the world was bad to her, and therefore bafflingly unfair. When she wailed to her personal inattentive God, “What have I done to deserve this?” she meant that she had done nothing.

It seemed that there was a crack in the heart of fairness, and she had fallen into it. She was innocent and deserving of better in her adulthood than her emotionally and economically impoverished childhood had provided, and yet she was receiving punishment and unhappiness in the form of my father and his bad behaviour. He, not she, deserved her misery, and yet, having disappeared and left us behind, he was living an apparently untroubled, unpunished life.

So I understood that on the one hand there was a rule of universal fairness, and every dog had to have its day, even if it was late in coming, and on the other hand that it was possible for some people to be delivered into unhappiness for no reason at all (as I grew older I understood it wasn’t just her and me). What was odd was the way my mother kept calling, albeit reproachfully, on this God who had so let her down.

I grew up to ditch the notion of a structural fairness, of a god or a nature that rewarded and punished on a moral basis. What occurred in people’s lives was consequent on their choices, their lack of choice, and the interrelation between the two, as well as high-or-low-risk-taking or simple arbitrary happenstance. I settled for a universe where narratives and meanings were fractured rather than based on moral cause and effect.

Lives were fragmented, subject to chance, not a continuing stream of moral repercussions, and although chance did have consequences, those consequences, too, were subject to chance. I recognised it in the devil’s distorting mirror from The Snow Queen which accidentally fell and broke into millions of splinters – a random shard falling into Kay’s eye but not into the eye of his friend Gerda. The story starts and takes its shape from a shrug of fate that knows nothing of you or what you deserve, but quite by accident or because of how our story-craving minds work, life could look as if it was conforming to moral judgement. I built a way to pass and describe my time around the rejection of the expectation of fairness, playing with the sharp edges of deconstructed fairy stories and tales children and adults are easily told. And I shook my head against those I came across who echoed my mother, such as, 30 years later, my mother-in-law, who contracted breast cancer at the age of 75 and asked, over and over, whenever I visited her: “How could this have happened to me? I’ve never done anything to deserve cancer.”

My attempt to grow up and away from the childishness of just deserts was, it goes without saying, no more than a position I took. It was necessary and useful, and allowed me to construct narratives that were more interesting to me than the most expected ones, but naturally I never did manage to do away with the sense of outrage against unfairness that I conclude is at the heart of self- and other-conscious life. I have to acknowledge the fundamental human desire for fairness, which, turned inwards, hampered my mother and which, turned outwards, causes people to work in danger and discomfort in places of war and hunger to improve imbalances of fortune.

Desert, the noun deriving from the verb “to deserve”, appears to be an essential human dynamic. It is at least a central anxiety that provides the plot for so many novels and films that depend on our sense that there is or should be such a thing. Like Kafka and Poe, Hitchcock repeatedly returns to the individual who is singled out, wrongly accused, an innocent suffering an injustice. Yet consider Montgomery Clift’s priest in I Confess, Henry Fonda in The Wrong Man, Blaney, the real killer’s friend played by Jon Finch in Frenzy, James Stewart in The Man Who Knew Too Much and Cary Grant in North by Northwest; none of them is – or could be according to Hitchcock’s Catholic upbringing – truly innocent of everything, and often their moral failings give some cause for the suspicion that falls on them. There is always a faint tang of consequence about their troubles.

We worry about people not getting what they deserve, but, due to religion or some essential guilt we carry with us, we are also concerned that there might be a deeper, less obvious basis for guilt that our everyday, human sense of justice doesn’t take into account. In Victorian fiction, Dickens and Hardy are masters of just and unjust deserts, as innocents such as Oliver Twist, David Copperfield, Tess of the D’Urbervilles and Jude the Obscure become engulfed by persecutory institutions and struggle, only sometimes with success, to find the life they ought, in a fair world, to have.

In Dickens, readers get a joyful reassurance after evil intent almost overcomes goodness but justice finally, though at the last moment, wins out by decency and coincidence. Hardy, in his covert modernism, offers no reassurance at all that his innocents’ day will come; his victims’ hopes and lives are snuffed out by forces such as nature and class that have no concern at all with the worth of individual lives and hopes. For both writers, however, the morally just or unjust result is usually an accident that works in or against the protagonist’s favour.

Every child ever born has at some time or other wailed, “It’s not fair.” To which the adults answer, “Life isn’t fair,” and always, surely, with a sense of sorrow and a vague feeling of betrayal, but also an understanding that a vital lesson is being imparted.

Fairness and desert are not exactly the same, I suppose; we might have a basic requirement for a generalised fairness – equality of opportunity, say – that has nothing to do with what anyone deserves, but our strangely inbuilt earliest sense of fairness provides our first encounter with the complexity of justice and injustice. Perhaps it arose even earlier than human consciousness. There are those who, like the primatologist Frans de Waal, suggest that a sense of fairness is an inherent emotion in monkeys:

An experiment with capuchin monkeys by Sarah Brosnan, of Georgia State University’s CEBUS Lab, and myself illuminated this emotional basis. These monkeys will happily perform a task for cucumber slices until they see others getting grapes, which taste so much better. They become agitated, throw down their measly cucumbers, and go on strike.

I’m not sure if this is exactly a sense of fairness. If so, it is a limited, unidirectional sense. Perhaps a sense of unfairness precedes the more general idea. I imagine a full sense of fairness would be demonstrated by a capuchin throwing her grapes down when she sees her fellow worker receiving cucumber. All for one and one for all. I couldn’t find any experiment that showed this.

A sense of personal unfairness may be all that is experienced by small children, too. It is always easy enough to come up with the idea that we have been morally mistreated. We manage to do it from a very young age and, like my mother-in-law, continue to the end of our lives. That others might deserve something is a more sophisticated thought. Usually, before any egalitarian fervour has a chance to emerge on its own, we have introduced the children, if not the monkeys, to the concept of desert. You get the grape for good behaviour, or helping with the washing-up, or not hitting your baby brother when he hits you, and you don’t get a grape if you throw a tantrum, or refuse to put on your socks. In this way, you and your brother get different amounts of goodness according to some very general rule that you are not much in a position to question, and the inherent problems of universal fairness are put into abeyance, except in the deepest dungeon of our consciousness.

There’s a revival of the childish sense of unfairness in adolescence when they/we cry, “I didn’t ask to be born.” To which we/they reply, again with an implication of just deserts: “The world doesn’t owe you a living.” But neither party explains how either statement asks or answers the difficulty of unfairness in the world.

I dare say all this harks back to our earliest desperation – the battle for the breast – with the helpless infant demanding that her hunger be assuaged and demanding comfort for her discomfort, the formerly helpless infant now in charge and having the capacity to deny it. It starts in a milky muddle and goes on to just des(s)erts. It is astonishing, actually, that the word for pudding in English is not, as it plainly ought to be, related to the desert that is getting what you deserve.

Nevertheless, eventually the hard-learned reward and punishment system becomes social glue and enters into the world as law and civic organisation, as a clumsy attempt to solve the insoluble. The legislation starts early, in the family, and is a necessity in the community and the state, because, in any unlegislated situation, goodness and altruism are not necessarily rewarded on an individual level. Payback, positive and negative, is rarely found in the wild, and only sometimes in what we call civilisation. Cheats very often prosper and an eye for an eye is a brutal, primitive formulation that advanced cultures (us, as we like to think of ourselves) reject as a kind of exact justice that lacks all mature consideration of circumstances. Yahweh hardly applied fairness when he repaid Job’s devotion with vastly incommensurate loss just to win a bet with Satan. And certainly the knotted family romance that is the basis for Judaism, Christianity and Islam, involving Abram, Sara, Isaac, Hagar, Ishmael and Yahweh, is resolved only by Abram’s adultery with Hagar, then Hagar’s expulsion with her son, nearly ending in their death, and the near-filicide of Isaac. All the victims are as completely innocent as human beings and God can be.

In an attempt properly to get to grips with the idea of fairness, justice and desert, I have recently been struggling with the story of Amos, Boris, Claire and Zoey. They are the protagonists in a drama plotted by Shelly Kagan in his new book, The Geometry of Desert. To simplify, but only slightly, all four of them are injured by an explosion at work. A fifth person, You, comes along with a syringe containing a single dose of painkiller, while they wait in agony for the ambulance.

There is no point in giving everybody a little bit; it won’t help any of them enough. Whom do you give the single useful dose to? At this point, the devastation fades into the background and we learn that Amos was hurt as he happened to walk past an explosion from a device planted by the disgruntled or revolutionary Boris, who failed to get away in time, and that Claire, who instigated the bomb attack and set off the detonator, stood too close and was also injured by the blast, while Zoey came on the horrible scene and was wounded by a second blast as she was trying to go to the aid of the other three. The carnage can now return to the forefront of your mind and you have to choose whom to help with your exiguous morphine supply.

The first thing that should become clear before you start mulling over whom to assist is that you are, in fact, in the middle of a philosophical thought experiment. If you are, like me, a novelist with a resistance (as well as a – probably related – hopelessly inept attraction) to this kind of theoretical reasoning, you might reject the entire scenario, because it never happened and your plot is as good as anyone else’s. No, you think, as if you were back in school rebelling against the insistence that all lines meeting at infinity is a given, I don’t have to make any choice at all.

The bomb at the factory didn’t go off. It was never set in the first place. Boris and Claire are gentle vegans who have no animus that would impel them to set a bomb, and no one is hurt. Amos, Boris, Claire and Zoey can continue their ordinary daily business, perhaps never even meeting, or, if they do, knowing nothing about the drama that never happened and which they all failed to be involved in. Or perhaps each of them becomes the protagonist of his or her own novel of which You, and not Shelly Kagan, are the author – the A, B, C, Z Quartet.

In my version, You’s choices are broadened infinitely, there is no given, and I can simply refuse the parameters of the thought experiment because I am not a philosopher, I do not wish to be restricted to the terms set by someone else for their own didactic purposes, and likely I’ve got several deadlines that don’t depend on figuring out how much or how little guilt deserves the morphine and why. And so, once again, I fail to get to grips with academic philosophy.

The Geometry of Desert considers both the fundamental and the complex nature of deserving. Kagan poses familiar questions initially (what makes one person more deserving than another?; what is it that the more deserving deserve more of?; does anyone deserve anything at all?) and then puts them aside in order to examine the underlying complexity of desert by means of graphs that represent his elaborately anatomised notion of desert and all the possible implications and interactions between its teased-apart elements. This graphical representation of desert is, he says, the most important and original part, and the point, of his book.

Which I dare say it is, but I got no further than my enjoyment and childish rejection of the initial elementary narrative. If I were Alice, the Wonderland in which I find myself wandering, enchanted but fearful and utterly baffled, would be geometry, algebra and (as Alice also encounters) formal logic. I am, if it is possible, spatially challenged. Maps and reality completely fail to come together in my brain. My eyes tear up and the trauma of school maths lessons returns to me as Kagan translates away from situation to number and algebraic representation to devise graphs whose plotted lines meander across their grid in a, to me, mysterious arithmetical relation to each other. I’m rubbish at all things numerical and graphical and, with all the will in the world, which I started off with, I could no more have read the greater part of Kagan’s book with comprehension than I could read the Bhagavadgita in Sanskrit.

And yet and yet, I can’t get away from the foothills of desert. I can’t shake off the elementary problems that Amos, Boris, Claire and Zoey create, lying there, waiting for that ambulance, me with a hypodermic full of morphine still in my pocket. Amos and Zoey innocent as lambs, but perhaps Zoey more innocent, having put herself in harm’s way in order to help the others? Boris and Claire guilty, for sure, but is Claire, the initiator of the harmful event, more guilty than Boris the foot soldier? Does it go without saying that I should perform a moral triage in order to decide which sufferer to give the morphine, based on the hierarchy of guilt and innocence? Kagan calls it “Fault Forfeits First”, so that Zoey would be first in line for the morphine and Claire and Boris at the back of the queue. But he points out a basic division in Fault Forfeits First, between the “retributionists” and “moderates” who subscribe to that belief.

The retributionists would not give Claire or Boris any morphine even if some were left over after soothing Zoey’s pain, because they deserve to suffer, having caused suffering. The moderates believe that no one should suffer, but that the innocent Amos and Zoey should be helped first if a choice has to be made. The world, the moderates believe, I believe and perhaps you believe, is improved by an improvement in everyone’s well-being. The retributionists think that the world is a better place if the vicious suffer for their viciousness.

But, as John Rawls claimed early in his career, unless you completely accept free will in people’s behaviour, unclouded by fortune or misfortune in birth, education or life experience, it is possible that no one deserves anything as a result of his actions, good or bad. The first instinct is to give Zoey the painkiller, other things being equal. Other things being equal is the problem. Why, when you come to think of it, does Zoey deserve less pain or more well-being on account of her good will? Did she have a particularly fortunate upbringing or, indeed, an unfortunate one that inclined her to acts of benevolence? No one is culturally, genetically free of influence. In any case, she had no intention of being injured when she went to help. And who knows why Claire, who conceived a bomb and detonated it, became the person she did?

How do we know (butterfly wings beating in the rainforest, and all that) if there might not be something we are not aware of that would make it more beneficial to give Claire the morphine? What if she has information about other bombs that have been planted? And what if, given an “undeserved” benefit, she came to rethink her viciousness? There may be more purely angelic joy in heaven over such a change of heart, but there are also very good practical reasons to rejoice far more, here on earth, over the redemption of one sinner than over 99 people who do not need to repent.

The retributionists and the moderates believe as they do for the same complicated reasons as the good and the vicious. In the practical world, getting just deserts is enshrined in legislation, and justice is separated from fairness, precisely to avoid the endless entailments of the philosophy of desert. It isn’t so surprising that there have been 20 seasons of Law and Order, which in every episode neatly segments and plays out the uncertainties of policing wrongdoing and providing justice. Finally, I suppose, we have to settle for the muddle of “good enough” fairness, while thinking and trying for something better. But don’t try telling that to my mother.

Jenny Diski’s most recent book is “What I Don’t Know About Animals” (Virago, £9.99)

This article first appeared in the 01 April 2013 issue of the New Statesman, Easter Special Issue


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour ­attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have neither fixed workplaces nor regular colleagues and regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood