Artwork by Ryan Schude.

The paradox of fairness

Is the world a better place if the vicious suffer for their viciousness? And what exactly are just deserts?

For as far back as I can remember language, and uttered the very last time I saw her, one of my mother’s most repeated sentences was: “Every dog has its day.” She said it aloud to herself and to the knowing, listening universe, though, when I was in the room, her eyes might be pointing in my direction. It was an incantation, voiced in a low growl. There was something of a spell about it, but it was mainly an assertion of a fundamental and reassuring truth, a statement to vibrate and stand in the air against whatever injustice she had just suffered or remembered suffering. It was, I understood, a reiterated form of self-comfort to announce that justice, while taking its time, was inevitably to come; perhaps, too, a bit of a nudge for the lackadaisical force responsible for giving every dog its day.

Hers was an other-worldly view, of justice meted out from beyond the human sphere, held in this case by an uneducated non-observant Jewish woman with parents from the shtetl, but it is a foundational promise made by all three religions of the Book, and surely their most effective selling point. My mother’s recitation of her truth belonged with another, more impassioned phrase, which I recall her saying only when sitting in a chair, rocking back and forth, or in bed, rolling her head from side to side. “God! God! What have I done to deserve this?” Generally, unlike the harsh confidence of the first phrase, it was wept, sometimes screamed, mumbled madly, wailed, moaned, and usually repeated over and over again, whereas “Every dog has its day” needed saying only once, whenever the situation merited it. Both phrases were occasioned by my repeatedly philandering, disappearing, money-withholding conman father, and each marked opposite ends of the continuum of disappointment on which my mother lived.

I learned from this, in the first place, obviously, to sneak away so that I wouldn’t get dragged into the conversation and end up (perhaps not unjustly) as a substitute accusee for my father’s failure to care. But I learned also that she had certain expectations of the world: that the world properly consisted of a normality, and that the world had peculiarly failed her in respect of it.

From a very early age I already knew about the norms of the world, what it was supposed to be like, from nursery rhymes, fairy tales, books, films, television and radio. I knew that the most basic of all the norms was that fairness was to be expected. I doubt that I needed to be taught that; it was inward to me, never unknown, and I would guess that I knew it in some way even before I got the hang of language. I would also guess that it was the same for you.

I suppose what I importantly learned from my cursing and keening mother was that grown-ups still knew it, too. That fairness was not just one of those always suspect childish expectations – like money in return for a tooth, or a man coming down the chimney – that one grew out of.

At the same time, I learned from her that fairness was not an infallible fact of the world and that the most apparently fundamental essentials failed, yet the idea I got from my mother about this was that she (and sometimes I was included) was the only person on the planet whom the arranger of fairness had let down. All other husbands and fathers were true and trustworthy, everyone else had enough money to pay the rent and buy food, everyone else had relatives or friends who rallied round, so my mother often explicitly said. Everyone except her (and me, as the appendage inside her circle of misfortune).

It did rather astonish me that we should be so unfortunate to have been singled out, but I was also impressed that we should have received such special treatment from the universe. The stories and nursery rhymes had told me that bad things were supposed to happen to people who had done something to merit it. But my mother had no doubt she had done nothing, had been a helpless victim, yet the world was bad to her, and therefore bafflingly unfair. When she wailed to her personal inattentive God, “What have I done to deserve this?” she meant that she had done nothing.

It seemed that there was a crack in the heart of fairness, and she had fallen into it. She was innocent and deserving of better in her adulthood than her emotionally and economically impoverished childhood had provided, and yet she was receiving punishment and unhappiness in the form of my father and his bad behaviour. He, not she, deserved her misery, and yet, having disappeared and left us behind, he was living an apparently untroubled, unpunished life.

So I understood that on the one hand there was a rule of universal fairness, and every dog had to have its day, even if it was late in coming, and on the other hand that it was possible for some people to be delivered into unhappiness for no reason at all (as I grew older I understood it wasn’t just her and me). What was odd was the way my mother kept calling, albeit reproachfully, on this God who had so let her down.

I grew up to ditch the notion of a structural fairness, of a god or a nature that rewarded and punished on a moral basis. What occurred in people’s lives was consequent on their choices, their lack of choice, and the interrelation between the two, as well as high-or-low-risk-taking or simple arbitrary happenstance. I settled for a universe where narratives and meanings were fractured rather than based on moral cause and effect.

Lives were fragmented, subject to chance, not a continuing stream of moral repercussions, and although chance did have consequences, those consequences, too, were subject to chance. I recognised it in the devil’s distorting mirror from The Snow Queen which accidentally fell and broke into millions of splinters – a random shard falling into Kay’s eye but not into the eye of his friend Gerda. The story starts and takes its shape from a shrug of fate that knows nothing of you or what you deserve, but quite by accident or because of how our story-craving minds work, life could look as if it was conforming to moral judgement. I built a way to pass and describe my time around the rejection of the expectation of fairness, playing with the sharp edges of deconstructed fairy stories and tales children and adults are easily told. And I shook my head against those I came across who echoed my mother, such as, 30 years later, my mother-in-law, who contracted breast cancer at the age of 75 and asked, over and over, whenever I visited her: “How could this have happened to me? I’ve never done anything to deserve cancer.”

My attempt to grow up and away from the childishness of just deserts was, it goes without saying, no more than a position I took. It was necessary and useful, and allowed me to construct narratives that were more interesting to me than the most expected ones, but naturally I never did manage to do away with the sense of outrage against unfairness that I conclude is at the heart of self- and other-conscious life. I have to acknowledge the fundamental human desire for fairness, which, turned inwards, hampered my mother and which, turned outwards, causes people to work in danger and discomfort in places of war and hunger to improve imbalances of fortune.

Desert, the noun deriving from the verb “to deserve”, appears to be an essential human dynamic. It is at least a central anxiety that provides the plot for so many novels and films that depend on our sense that there is or should be such a thing. Like Kafka and Poe, Hitchcock repeatedly returns to the individual who is singled out, wrongly accused, an innocent suffering an injustice. Yet consider Montgomery Clift’s priest in I Confess, Henry Fonda in The Wrong Man, Blaney, the real killer’s friend played by Jon Finch in Frenzy, James Stewart in The Man Who Knew Too Much and Cary Grant in North by Northwest; none of them is – or could be according to Hitchcock’s Catholic upbringing – truly innocent of everything, and often their moral failings give some cause for the suspicion that falls on them. There is always a faint tang of consequence about their troubles.

We worry about people not getting what they deserve, but, due to religion or some essential guilt we carry with us, we are also concerned that there might be a deeper, less obvious basis for guilt that our everyday, human sense of justice doesn’t take into account. In Victorian fiction, Dickens and Hardy are masters of just and unjust deserts, as innocents such as Oliver Twist, David Copperfield, Tess of the D’Urbervilles and Jude the Obscure become engulfed by persecutory institutions and struggle, only sometimes with success, to find the life they ought, in a fair world, to have.

In Dickens, readers get a joyful reassurance after evil intent almost overcomes goodness but justice finally, though at the last moment, wins out by decency and coincidence. Hardy, in his covert modernism, offers no reassurance at all that his innocents’ day will come; his victims’ hopes and lives are snuffed out by forces such as nature and class that have no concern at all with the worth of individual lives and hopes. For both writers, however, the morally just or unjust result is usually an accident that works in or against the protagonist’s favour.

Every child ever born has at some time or other wailed, “It’s not fair.” To which the adults answer, “Life isn’t fair,” and always, surely, with a sense of sorrow and a vague feeling of betrayal, but also an understanding that a vital lesson is being imparted.

Fairness and desert are not exactly the same, I suppose; we might have a basic requirement for a generalised fairness – equality of opportunity, say – that has nothing to do with what anyone deserves, but our strangely inbuilt earliest sense of fairness provides our first encounter with the complexity of justice and injustice. Perhaps it arose even earlier than human consciousness. There are those who, like the primatologist Frans de Waal, suggest that a sense of fairness is an inherent emotion in monkeys:

An experiment with capuchin monkeys by Sarah Brosnan, of Georgia State University’s CEBUS Lab, and myself illuminated this emotional basis. These monkeys will happily perform a task for cucumber slices until they see others getting grapes, which taste so much better. They become agitated, throw down their measly cucumbers, and go on strike.

I’m not sure if this is exactly a sense of fairness. If so, it is a limited, unidirectional sense. Perhaps a sense of unfairness precedes the more general idea. I imagine a full sense of fairness would be demonstrated by a capuchin throwing her grapes down when she sees her fellow worker receiving cucumber. All for one and one for all. I couldn’t find any experiment that showed this.

A sense of personal unfairness may be all that is experienced by small children, too. It is always easy enough to come up with the idea that we have been morally mistreated. We manage to do it from a very young age and, like my mother-in-law, continue to the end of our lives. That others might deserve something is a more sophisticated thought. Usually, before any egalitarian fervour has a chance to emerge on its own, we have introduced the children, if not the monkeys, to the concept of desert. You get the grape for good behaviour, or helping with the washing-up, or not hitting your baby brother when he hits you, and you don’t get a grape if you throw a tantrum, or refuse to put on your socks. In this way, you and your brother get different amounts of goodness according to some very general rule that you are not much in a position to question, and the inherent problems of universal fairness are put into abeyance, except in the deepest dungeon of our consciousness.

There’s a revival of the childish sense of unfairness in adolescence when they/we cry, “I didn’t ask to be born.” To which we/they reply, again with an implication of just deserts: “The world doesn’t owe you a living.” But neither party explains how either statement asks or answers the difficulty of unfairness in the world.

I dare say all this harks back to our earliest desperation – the battle for the breast – with the helpless infant demanding that her hunger be assuaged and demanding comfort for her discomfort, the formerly helpless infant now in charge and having the capacity to deny it. It starts in a milky muddle and goes on to just des(s)erts. It is astonishing, actually, that the word for pudding in English is not, as it plainly ought to be, related to the desert that is getting what you deserve.

Nevertheless, eventually the hard-learned reward and punishment system becomes social glue and enters into the world as law and civic organisation, as a clumsy attempt to solve the insoluble. The legislation starts early, in the family, and is a necessity in the community and the state, because, in any unlegislated situation, goodness and altruism are not necessarily rewarded on an individual level. Payback, positive and negative, is rarely found in the wild, and only sometimes in what we call civilisation. Cheats very often prosper and an eye for an eye is a brutal, primitive formulation that advanced cultures (us, as we like to think of ourselves) reject as a kind of exact justice that lacks all mature consideration of circumstances. Yahweh hardly applied fairness when he repaid Job’s devotion with vastly incommensurate loss just to win a bet with Satan. And certainly the knotted family romance that is the basis for Judaism, Christianity and Islam, involving Abram, Sara, Isaac, Hagar, Ishmael and Yahweh, is resolved only by Abram’s adultery with Hagar, then Hagar’s expulsion with her son, nearly ending in their death, and the near-filicide of Isaac. All the victims are as completely innocent as human beings and God can be.

In an attempt properly to get to grips with the idea of fairness, justice and desert, I have recently been struggling with the story of Amos, Boris, Claire and Zoey. They are the protagonists in a drama plotted by Shelly Kagan in his new book, The Geometry of Desert. To simplify, but only slightly, all four of them are injured by an explosion at work. A fifth person, You, comes along with a syringe containing a single dose of painkiller, while they wait in agony for the ambulance.

There is no point in giving everybody a little bit; it won’t help any of them enough. Whom do you give the single useful dose to? At this point, the devastation fades into the background and we learn that Amos was hurt as he happened to walk past an explosion from a device planted by the disgruntled or revolutionary Boris, who failed to get away in time, and that Claire, who instigated the bomb attack and set off the detonator, stood too close and was also injured by the blast, while Zoey came on the horrible scene and was wounded by a second blast as she was trying to go to the aid of the other three. The carnage can now return to the forefront of your mind and you have to choose whom to help with your exiguous morphine supply.

The first thing that should become clear before you start mulling over whom to assist is that you are, in fact, in the middle of a philosophical thought experiment. If you are, like me, a novelist with a resistance (as well as a – probably related – hopelessly inept attraction) to this kind of theoretical reasoning, you might reject the entire scenario, because it never happened and your plot is as good as anyone else’s. No, you think, as if you were back in school rebelling against the insistence that all lines meeting at infinity is a given, I don’t have to make any choice at all.

The bomb at the factory didn’t go off. It was never set in the first place. Boris and Claire are gentle vegans who have no animus that would impel them to set a bomb, and no one is hurt. Amos, Boris, Claire and Zoey can continue their ordinary daily business, perhaps never even meeting, or, if they do, knowing nothing about the drama that never happened and which they all failed to be involved in. Or perhaps each of them becomes the protagonist of his or her own novel of which You, and not Shelly Kagan, are the author – the A, B, C, Z Quartet.

In my version, You’s choices are broadened infinitely, there is no given, and I can simply refuse the parameters of the thought experiment because I am not a philosopher, I do not wish to be restricted to the terms set by someone else for their own didactic purposes, and likely I’ve got several deadlines that don’t depend on figuring out how much or how little guilt deserves the morphine and why. And so, once again, I fail to get to grips with academic philosophy.

The Geometry of Desert considers both the fundamental and the complex nature of deserving. Kagan poses familiar questions initially (what makes one person more deserving than another?; what is it that the more deserving deserve more of?; does anyone deserve anything at all?) and then puts them aside in order to examine the underlying complexity of desert by means of graphs that represent his elaborately anatomised notion of desert and all the possible implications and interactions between its teased-apart elements. This graphical representation of desert is, he says, the most important and original part, and the point, of his book.

Which I dare say it is, but I got no further than my enjoyment and childish rejection of the initial elementary narrative. If I were Alice, the Wonderland in which I find myself wandering, enchanted but fearful and utterly baffled, would be geometry, algebra and (as Alice also encounters) formal logic. I am, if it is possible, spatially challenged. Maps and reality completely fail to come together in my brain. My eyes tear up and the trauma of school maths lessons returns to me as Kagan translates away from situation to number and algebraic representation to devise graphs whose plotted lines meander across their grid in a, to me, mysterious arithmetical relation to each other. I’m rubbish at all things numerical and graphical and, with all the will in the world, which I started off with, I could no more have read the greater part of Kagan’s book with comprehension than I could read the Bhagavadgita in Sanskrit.

And yet and yet, I can’t get away from the foothills of desert. I can’t shake off the elementary problems that Amos, Boris, Claire and Zoey create, lying there, waiting for that ambulance, me with a hypodermic full of morphine still in my pocket. Amos and Zoey innocent as lambs, but perhaps Zoey more innocent, having put herself in harm’s way in order to help the others? Boris and Claire guilty, for sure, but is Claire, the initiator of the harmful event, more guilty than Boris the foot soldier? Does it go without saying that I should perform a moral triage in order to decide which sufferer to give the morphine, based on the hierarchy of guilt and innocence? Kagan calls it “Fault Forfeits First”, so that Zoey would be first in line for the morphine and Claire and Boris at the back of the queue. But he points out a basic division in Fault Forfeits First, between the “retributionists” and “moderates” who subscribe to that belief.

The retributionists would not give Claire or Boris any morphine even if some were left over after soothing Zoey’s pain, because they deserve to suffer, having caused suffering. The moderates believe that no one should suffer, but that the innocent Amos and Zoey should be helped first if a choice has to be made. The world, the moderates believe, I believe and perhaps you believe, is improved by an improvement in everyone’s well-being. The retributionists think that the world is a better place if the vicious suffer for their viciousness.

But, as John Rawls claimed early in his career, unless you completely accept free will in people’s behaviour, unclouded by fortune or misfortune in birth, education or life experience, it is possible that no one deserves anything as a result of his actions, good or bad. The first instinct is to give Zoey the painkiller, other things being equal. Other things being equal is the problem. Why, when you come to think of it, does Zoey deserve less pain or more well-being on account of her good will? Did she have a particularly fortunate upbringing or, indeed, an unfortunate one that inclined her to acts of benevolence? No one is culturally, genetically free of influence. In any case, she had no intention of being injured when she went to help. And who knows why Claire, who conceived a bomb and detonated it, became the person she did?

How do we know (butterfly wings beating in the rainforest, and all that) if there might not be something we are not aware of that would make it more beneficial to give Claire the morphine? What if she has information about other bombs that have been planted? And what if, given an “undeserved” benefit, she came to rethink her viciousness? There may be more purely angelic joy in heaven over such a change of heart, but there are also very good practical reasons to rejoice far more, here on earth, over the redemption of one sinner than over 99 people who do not need to repent.

The retributionists and the moderates believe as they do for the same complicated reasons as the good and the vicious. In the practical world, getting just deserts is enshrined in legislation, and justice is separated from fairness, precisely to avoid the endless entailments of the philosophy of desert. It isn’t so surprising that there have been 20 seasons of Law and Order, which in every episode neatly segments and plays out the uncertainties of policing wrongdoing and providing justice. Finally, I suppose, we have to settle for the muddle of “good enough” fairness, while thinking and trying for something better. But don’t try telling that to my mother.

Jenny Diski’s most recent book is “What I Don’t Know About Animals” (Virago, £9.99)

This article first appeared in the 01 April 2013 issue of the New Statesman, Easter Special Issue


The English Revolt

Brexit, Euroscepticism and the future of the United Kingdom.

English voters have led – some would say forced – the United Kingdom towards exit from the European Union. Was this an English revolt, the result of an upsurge over decades of a more assertive, perhaps resentful, sense of English identity? At one level, clearly so. Surveys indicate that individuals who most often describe themselves as “English”, and regions where this is common, were more inclined to vote Leave on 23 June. Some of these are poorer regions where marginalised people think that their voices are more likely to be heard in a national democracy than in an international trading bloc, and for whom patriotism is a source of self-respect. But it would only make sense to regard Leave as essentially an English reaction if discontent with the EU were confined to England, or specifically linked with feelings of Englishness.

In fact, negative opinions about the EU, and especially about its economic policy, are now more widespread in other countries than they are in England. Polls by the Pew Research Centre last month showed that disapproval of the EU was as high in Germany and the Netherlands as in Britain, and higher in France, Greece and Spain. Though aggravated by the 2007-2008 crash and enforced policies of austerity, a decline in support was clear earlier. France’s referendum of May 2005 gave a 55 per cent No to the proposed EU constitution after thorough debate, and a now familiar pattern emerged: enthusiastic Europeanism was confined to the wealthiest suburbs and quarters of Paris, and the only professional groups that strongly voted Yes were big business, the liberal professions and academics.

Going far beyond the atavistic and incoherent English revolt that some think they discern, our referendum result is partly a consequence of transnational political phenomena across the democratic world: the disaffection of citizens from conventional politics, shown by falling turnouts for elections, shrinking party membership and the rise of new, sometimes extreme political movements; as well as the simultaneous detachment of a professional political class from civil society, and its consequent retreat into a closed world of institutions.

The EU embodies these phenomena in uniquely acute form. In several cases its central bodies have opposed – or, if one prefers, have been forced to deny – democratically expressed wishes. In Greece and Italy, the EU has enforced changes of government and policy, and in Denmark, Ireland and the Netherlands it has pressed countries to ignore or reverse popular referendums. Its own representative body, the European Parliament, has gained neither power nor legitimacy. Crucial decisions are taken in secret, making the EU a hiding place for beleaguered politicians as well as a source of lavish financial reward for insiders. In the words of the historian John Gillingham, Europe is now being governed by neither its peoples nor its ideals, but by a bank board. This is not the “superstate” of Eurosceptic mythology. Though it drains power and legitimacy away from national governments, it is incapable of exercising power effectively itself, whether to cope with short-term emergencies such as an inflow of refugees, or to solve chronic failings such as the creation of mass unemployment in southern Europe. The result is paralysis, the inability either to extricate itself from failing institutions or to make them work.

If popular discontent with the EU continues to increase (and it is hard to see how it could not) sooner or later there will be some unmanageable political or social crisis. The response of too many supporters of the EU is to screw the lid down tighter, including now by promising to make life difficult for the United Kingdom, pour décourager les autres. This is the organisation – unpopular, unaccountable, secretive, often corrupt, and economically failing – from which our decision to depart apparently causes people to weep in the streets.

***

Why this decision? Why in Britain? The simplest and perhaps the best answer is that we have had a referendum. If France, Greece, Italy and some other countries had been given the same choice, they might well have made the same decision. But of course they have not been and will not be given such a choice, barring severe political crisis. This is most obviously because countries that have adopted the euro – even those such as Greece, for which the IMF has predicted high unemployment at least until the 2040s – have no clear way out.

I make this obvious point to emphasise that the immediate explanation of what has happened lies not only and not mainly in different feelings about the EU in Britain, but in different political opportunities and levels of fear. The contrasting votes in Scotland and Northern Ireland have particular explanations. Scottish nationalists – like their counterparts in Catalonia – see the EU as an indispensable support for independence. Northern Ireland sees the matter primarily as one affecting its own, still tense domestic politics and its relations with the Republic. In a European perspective, Scotland and Northern Ireland are the outliers, not England and Wales. Indeed, Scotland’s vote makes it stand out as one of the most pro-EU countries in Europe. If ever there is another referendum to see whether Scots prefer the EU to the UK, it will show whether this level of support for the EU is solid.

If England is exceptional, it is not in its disaffection from the EU, nor in the political divisions the referendum vote has exposed (if France, for instance, had such a vote, one could expect blood in the streets). Rather, its exceptional characteristic is its long-standing and settled scepticism about the European project in principle, greater than in any other EU country. Every member has a specific history that shapes its attitude to the theoretical idea of European integration. As John Gillingham, one of the most perceptive historians of the EU, describes its beginnings: “to the French [supranationalism was] a flag of convenience, to the Italians it was preferable (by definition) to government by Rome, to the Germans a welcome escape route, and to the Benelux nations a better choice than being dominated by powerful neighbours”.

Subsequently, for the eastern European states, it was a decisive step away from communist dictatorship, and for southern Europe a line drawn under a traumatic history of civil conflict. There is also a widespread belief, powerful though fanciful, that the EU prevents war between the European states. All these are important reasons why there remains considerable support for unification as an aspiration. But all these reasons are weaker, and some of them non-existent, in Britain, and especially in England. The simple reason for this is that Britain’s experience of the 20th century was far less traumatic. Moreover, during that time loyalty to the nation was not tarnished with fascism, but was rather the buttress of freedom and democracy. Conversely, the vision of a European “superstate” is seen less as a guarantee of peace and freedom, and rather as the latest in a five-century succession of would-be continental hegemons.

Given all this, an obvious question is why the United Kingdom ever joined in the European project in the first place. The answer helps to explain the country’s subsequent lack of enthusiasm. Its first response to the creation of the European Economic Community in 1957 was not to join, but to agree to establish a separate European Free Trade Association (Efta) in 1959 with Austria, Denmark, Norway, Portugal, Sweden and Switzerland; over the next three decades the seven founder members were joined by Finland, Iceland and Liechtenstein. This worked efficiently, cheaply and amicably, and, in time, Efta and the EEC would doubtless have created trading arrangements and systems of co-operation. But then the historic mistake was made. Efta was considered too small to provide the diplomatic clout craved by Whitehall at a time of severe post-imperial jitters. A cabinet committee warned in 1960 that “if we try to remain aloof from [the EEC] – bearing in mind that this will be happening simultaneously with the contraction of our overseas possessions – we shall run the risk of losing political influence and of ceasing to be able to exercise any real claim to be a world Power”.

Besides, Washington disliked Efta as a barrier to its aim of a federal Europe, and the Americans put heavy pressure on London to apply to accede to the Treaty of Rome, which it duly did in August 1961. “It is only full membership, with the possibility of controlling and dominating Europe,” wrote an optimistic British cabinet official, “that is really attractive.”

As the former US secretary of state Dean Acheson (one of the early backers of European integration) put it, in a now celebrated comment in December 1962: “Great Britain has lost an empire, and has not yet found a role. The attempt to play a separate power role . . . apart from Europe . . . based on a ‘special relationship’ with the United States [or] on being the head of a ‘Commonwealth’ . . . – this role is about played out.”

Acheson’s words long haunted British policymakers; perhaps they still do. And yet Britain remains one of the half-dozen strongest and most assertive states anywhere in the world, just as it has been for the past three centuries.

To fear of diplomatic marginalisation was added fear of economic decline. A government report in 1953 warned of “relegation of the UK to the second division”. Over the next 30 years there was a chorus of dismay about “the sick man of Europe”. Belief that EEC membership at any price was the only cure for Britain’s perceived economic ills became the orthodoxy in official circles: Britain was “the sinking Titanic”, and “Europe” the lifeboat.

So, on 1 January 1973 Britain formally entered the EEC with Denmark and Ireland. Other Efta members remained outside the Community – Switzerland and Norway for good. Harold Wilson’s 1975 referendum on whether to stay in the EEC in effect turned on Europe’s superior economic performance – which, though no one realised it at the time, had just ended.

This memory of apparent British economic weakness half a century ago still seems to weigh with older Remainers. Yet it was based on a fundamental misconception: that European growth rates were permanently higher than in a supposedly outdated and declining Britain. In reality, faster growth on the mainland in the 1950s and 1960s was due to one-off structural modernisation: the large agricultural workforce shifted into more productive industrial employment. From the mid-1940s to the early 1970s this gave several European countries “windfall growth” at a higher rate than was possible in Britain, which since the 19th century had had no large agricultural sector to convert. By the early 1970s, once that catching up was finished, European growth rates became the same as, or slightly lower than, Britain’s. When measured over the whole half-century from 1950 to 2000, Britain’s economic performance was no different from the European norm. By the mid-1980s, growth was faster than in France and Germany, and today Britain’s economic fundamentals remain strong.

Slower European growth lessened the perceived attractiveness of EU integration. In 1992, on Black Wednesday (16 September), hesitant participation in the European Exchange Rate Mechanism led to forced devaluations in Finland, Sweden, Italy, Spain and, finally, Britain. This was a huge political shock, though an economic boost.

Black Wednesday subsequently made it politically difficult for Britain to join the eurozone – allowing us a narrow escape, attributable more to circumstance than to policy, as vocal political and economic lobbies urged joining.

Moreover, Britain’s trade with the rest of the EU was declining as a proportion of its global activity: as Gordon Brown observed in 2005, 80 per cent of the UK’s potential trade lay outside the EU. The EU’s single market proved not very effective at increasing trade between its members even before the crash of 2007-2008, and prolonged austerity thereafter made it stagnant. Consequently, in the 2016 referendum campaign, more emphasis was placed on the dangers of leaving the single market than on the precise benefits of being in it.

But the days when Britain seemed the Titanic and Europe the lifeboat were long gone. On the contrary, Britain, with its fluid and largely unregulated labour market, had become the employer of last resort for the depressed countries of the eurozone. The sustained importation of workers since the 1990s had become, for a large part of Britain’s working class, the thing that most obviously outweighed whatever legal or economic advantages the EU might theoretically offer.

***

What galvanised the vote for Brexit, I think, was a core attachment to national democracy: the only sort of democracy that exists in Europe. That is what “getting our country back” essentially means. Granted, the slogan covers a multitude of concerns and wishes, some of them irreconcilable; but that is what pluralist democracy involves. Britain has long been the country most resistant to ceding greater powers to the EU: opinion polls in the lead-up to the referendum showed that only 6 per cent of people in the UK (compared to 34 per cent in France, for instance, and 26 per cent in Germany) favoured increased centralisation – a measure of the feebleness of Euro-federalism in Britain.

In contrast, two-thirds wanted powers returned from the EU to the British government, with a majority even among the relatively Europhile young. This suggests a much greater opposition to EU centralisation than shown by the 52 per cent vote for Brexit. The difference may be accounted for by the huge pressure put on the electorate during the campaign. Indeed, arithmetic suggests that half even of Remain voters oppose greater powers being given to the EU. Yet its supporters regard an increase of EU control over economic and financial decisions – the basics of politics – as indispensable if the EU is to survive, because of the strains inherent in the eurozone system. This stark contradiction between the decentralisation that many of the peoples of Europe – and above all the British – want to see and the greater centralisation that the EU as an institution needs is wilfully ignored by Remain supporters. Those who deplore the British electorate’s excessive attachment to self-government as some sort of impertinence should be clear (not least with themselves) about whether they believe that the age of democracy in Europe is over, and that great decisions should be left to professional politicians, bureaucracies and large corporations.

Some have dismissed the Leave vote as an incoherent and anarchic protest against “the establishment”, or as a xenophobic reaction against immigrants. Some of the media in Britain and abroad have been doing their best to propagate this view. Yet xenophobia has not been a significant feature of British politics since the 1960s, and certainly far less so than in many other EU member states, including France, Germany, Greece and the Netherlands. As for the anti-establishment “revolt”, this emerged when parts of the establishment began to put organised pressure on the electorate to vote Remain. Would-be opinion-formers have hardly covered themselves in glory in recent weeks. They have been out of touch and out of sympathy with opinion in the country, unwilling or unable to engage in reasoned debate, and resorting to collective proclamations of institutional authority which proved embarrassingly ineffective.

Worst of all, their main argument – whether they were artists, actors, film-makers, university vice-chancellors or prestigious learned societies – was one of unabashed self-interest: the EU is our milch-cow, and hence you must feed it. This was a lamentable trahison des clercs. The reaction to the referendum result by some Remain partisans has been a monumental fit of pique that includes talking up economic crisis (which, as Keynes showed, is often self-fulfilling) and smearing 17 million Leave voters as xenophobes. This is both irresponsible and futile, and paves the way to political marginalisation.

The Queen’s call for “deeper, cooler consideration” is much needed. I recall Victor Hugo’s crushing invective against French elitists who rejected the verdict of democracy, when in 1850 he scorned “your ignorance of the country today, the antipathy that you feel for it and that it feels for you”.

This antipathy has reduced English politics to a temporary shambles. It is too early to say whether there will be some realignment of the fragments: One-Nation Toryism, Conservative neoliberalism, “new” and “old” Labour, the hibernating Liberal Democrats and Greens, the various nationalists and, of course, the unpredictable Ukip. When in the past there were similar crises – such as Labour’s rift over the national government in 1931, the Liberals’ split over Irish home rule in 1886, or the Tory fragmentation over the repeal of the Corn Laws in 1846 – the political balance was permanently changed.

***

Many Europeans fear that a breakdown of the EU could slide into a return to the horrors of the mid-20th century. Most people in Britain do not. The fundamental feature of the referendum campaign was that the majority was not frightened out of voting for Leave, either by political or by economic warnings. This is testimony to a significant change since the last referendum in 1975: most people no longer see Britain as a declining country dependent on the EU.

A Eurobarometer poll in 2013 showed that Britain was the only EU member state in which most citizens felt that they could face the future better outside the Union. Last month’s referendum reflected this view, which was not reversed by reiterated predictions of doom.

In retrospect, joining the Common Market in 1973 has proved an immense historic error. It is surely evident that we would not have been applying to join the EU in 2016 had we, like Norway or Switzerland, remained outside it. Yet the political and possibly economic costs of leaving it now are considerable. Even though discontent with the EU across much of Europe has recently overtaken sentiment in Britain, Britain is unique, in that, ever since the 1970s, its public has been consistently far less favourable to the idea of European integration than the electorate in any other country. Hence the various “opt-outs” and the critically important decision to remain outside the euro.

Now, by a great historic irony, we are heading towards the sort of associate status with the EU that we had in the late 1960s as the leading member of Efta, and which we could have kept. Instead, this country was led by its political elite, for reasons of prestige and because of exaggerated fears of national decline and marginalisation, into a vain attempt to be “at the heart of Europe”. It has been a dangerous illusion, born of the postwar declinist obsession, that Britain must “punch above its weight” both by following in the footsteps of the United States and by attaching itself to the EU.

For some, money, blood and control over our own policy were sacrifices worth making for a “seat at the top table”. This dual strategy has collapsed. In future we shall have to decide what is the appropriate and desirable role for Britain to play in the world, and we shall have to decide it for ourselves.

Robert Tombs is Professor of French History at Cambridge University. His most recent book is “The English and Their History” (Penguin)

This article first appeared in the 21 July 2016 issue of the New Statesman, The English Revolt