A rounded image: but modern culture is solipsistic, fixed on looking inward at our own preoccupations. Photo: Fleur van Dodewaard, part of the ‘Sun Set Series’ (2011)

On narcissism: the mirror and the self

People from Tiger Woods to the Obamas are routinely denounced for their narcissism. But what does the word really mean, and are there good as well as bad types of self-love?
Sylvia Plath said that writers are “the most narcissistic people”: whatever the truth of that statement, one can assume at least that her use of the term was correct. Freud bequeathed the modern era a tangled concept in narcissism, and literary culture has shown itself as apt as any other to misappropriate it. Yet it is the fashion to see people increasingly as one of two types – a narcissist, or the victim of one – so perhaps it is worth asking precisely what is meant by the word, which has come to encapsulate a cultural malaise.
 
Alongside the struggle in the modern era to define and enshrine narcissism as a psychiatric condition, the term has been appropriated as a shorthand for the general idea of self-obsession. This is a diverse concept with a large vocabulary of its own, but that vocabulary is increasingly abandoned in favour of a word whose ill-defined connotations of mental illness give it a strange force. What we think narcissism is, and how much of what we see seems to answer to it, depends in reality on the moral status we accord the self: the very forcefulness of “narcissism” lies in the fact that it illuminates the person saying it as much as the person against whom it is said. And indeed narcissism, classically, is a business of echo and reflection that can give rise to a narrative of maddening circularity, of repetition and counter-repetition, in which self and other struggle to separate and define themselves.
 
 
Surface intention: Obama allows himself to be used as a channel for reflecting on the American story.
Photo: Pete Souza/The White House/Polaris/eyevine
 
“Narcissism describes a culturally induced kind of subjectivity,” writes the psychoanalyst Sergio Benvenuto, “a new way in which modern subjects secularise ideals, sex objects and knowledge, a culture in which people believe less and less in psychoanalysis.” A narcissistic culture, in other words, will pillory what it calls narcissists and disown certain cultural products as narcissistic in order to avoid self-revelation and obstruct the pursuit of personal truth.
 
In US politics, where “narcissism” has come to signify the very elision of power and personality that has been fundamental to the nation’s ascendant culture of self, the effect is of a hall of mirrors: “The authors blame John Edwards’s narcissism for his downfall and describe Bill Clinton as a ‘narcissist on an epic scale’,” a book reviewer recently wrote in the New York Times. “Do a Google search on ‘Tiger Woods’ and ‘narcissist’ and you get tens of thousands of references . . . Rush Limbaugh calls President Obama a narcissist, it seems, every 24 hours.” Mitt Romney, himself a known narcissist, also favours this analysis of Obama, and avidly posts evidence for it on his website. The book Malignant Self-Love: Narcissism Revisited by Sam Vaknin is often cited in support of these diagnoses. Unfortunately it appears that Mr Vaknin, too, is a narcissist.
 
Narcissism, in the case of Obama and other political leaders, is a catch-all term for nearly every quality a person might require in order to become, for instance, president of the United States: ambition, determination, vision, self-belief. But Obama, in the eyes of his critics, also qualifies as another kind of narcissist. “Perhaps not surprisingly for a man whose principal accomplishment before becoming president was to write two autobiographies,” writes one journalist, “Obama has seemed to spend an inordinate amount of time talking about himself. And it’s not just Obama, but the first lady, too.”
 
Michelle Obama’s narcissism is illustrated thus, in a long quotation from a talk she gave to students at the University of Mumbai: 
 
I didn’t grow up with a lot of money. I mean, my parents – I had two parents. I was lucky to have two parents, and they always had a job, but we didn’t have a lot of money. But it was because of working hard, and studying, and learning how to write and read. And then I got a chance to go to college. And then college opened up the world to me. I started seeing all these things that I could be or do – and I never even imagined being the first lady of the United States. But because I had an education, when the time came to do this, I was ready. So just remember there is nothing that you guys can’t do. You know, you have everything it takes to be successful and smart and to raise a family, right? What do you say?
 
The writer continues: “The poor students in Mumbai might have had something to say, but the first lady never let them say a word. Instead, she continued on with her monologue before permitting a question. She then answered that question by referring to her favourite subject: herself and Barack Obama.”
 
This second narcissist, who spends all his time “talking about himself”, is in a way a more complex figure, and one that is harder to isolate, particularly in a culture (America) where fame and autobiography are so intertwined. As in Michelle Obama’s telling of it, fame (or power, or success) is the happy ending in the American story of life; that story is usually a narrative of ascent. Generally speaking, the Obamas have been lauded for talking about themselves – they have demonstrated an impeccable grasp of autobiographical form. They have in many ways revived and reshaped it by salting the ascent with just enough reality (or “honesty”) to make the American story seem true again. It’s a delicate illusion to manage, and one that is threatened by the notion that the autobiographer isn’t advancing the common story of life after all but is simply talking about his “favourite subject”, himself.
 
George J Marlin, the author of Narcissist Nation: Reflections of a Blue-State Conservative (“reflections” seems to be an unintended pun), claims that Obama “uses the ‘I’ word more than all the presidents have used it collectively in the 200-and-some-odd years of our nation”. The conservative, it seems, more readily than the democrat, sees autobiography as a form of bad manners (“the ‘I’ word”); and indeed, one reading of the myth of Narcissus itself is as a story of unmannerliness and its consequences.
 
Christopher Lasch, in his celebrated book The Culture of Narcissism: American Life in an Age of Diminishing Expectations (1979), wrote: “The new narcissist is haunted not by guilt but by anxiety. He seeks not to inflict his own certainties on others but to find a meaning in life.” The guilt of the “old” narcissist might constitute nothing more than this conservative aversion to self-disclosure. The old narcissist processed his self-obsession by inflicting his certainties in a way that nonetheless left his “self” concealed: the “new” narcissist, by contrast, presents an agonised face to the world; his “self” is confessed and given over to others, leaving him free to ignore the social contract and do as he likes.
 
Malignant Self-Love is written in the “survivor” mode of American letters, the author having survived both his own confessed narcissism and that of his parents and gone on to found Narcissus Publications, an outlet for his own works. Yet Vaknin’s definition of narcissism is accurate enough: the narcissistic personality “is rigid to the point of being unable to change in reaction to changing circumstances . . . Such a person takes behavioural, emotional and cognitive cues exclusively from others. His inner world is, so to speak, vacated. His True Self is dilapidated and dysfunctional. Instead he has a tyrannical and delusional False Self. Such a person is incapable of loving and of living. He cannot love others because he cannot love himself. He loves his reflection, his surrogate self. And he is incapable of living because life is a struggle towards, a striving, a drive at something. In other words: life is change. He who cannot change cannot live. The narcissist is an actor in a monodrama, yet forced to remain behind the scenes. The scenes take centre stage, instead. The narcissist does not cater at all to his own needs. Contrary to his reputation, the narcissist does not ‘love’ himself in any true sense of the word.” 
 
What is compelling here is the notion that the narcissist’s “inner world is, so to speak, vacated”. D W Winnicott’s interjection of the maternal figure into the theory of primary narcissism attributes that vacated inner world to an initial absence of recognition: “The mother gazes at the baby in her arms, and the baby gazes at his mother’s face and finds himself therein . . . provided that the mother is really looking at the unique, small, helpless being and not projecting her own expectations, fears and plans for the child. In that case, the child would find not himself in his mother’s face, but rather the mother’s own projections. This child would remain without a mirror, and for the rest of his life would be seeking this mirror in vain.”
 
The widespread notion of a “healthy” degree of narcissism, according to this definition, is not quite the essential dose of vanity or self-regard we’re so often told to allow ourselves; perhaps, rather, there is an extent to which a person needs to be another person’s projection, their construction, an inner space that is and ought to remain vacated in order for the social dynamic to function.
 
“Liberated from the superstitions of the past”, Christopher Lasch continues, the new narcissist “doubts even the reality of his own existence. Superficially relaxed and tolerant . . . his sexual attitudes are permissive rather than puritanical, even though his emancipation from ancient taboos brings him no sexual peace. Fiercely competitive in his demand for approval and acclaim, he distrusts competition because he associates it unconsciously with an unbridled urge to destroy. He extols co-operation and teamwork while harbouring deeply antisocial impulses. He praises respect for rules and regulations in the secret belief that they do not apply to himself. Acquisitive in the sense that his cravings have no limits, he does not accumulate goods and provisions against the future, but demands immediate gratification and lives in a state of restless, perpetually unsatisfied desire.”
 
Lasch’s “new” narcissist isn’t so new any longer: he has become a parent. It might be said that social media such as Twitter and Facebook – those shrines to the self – are among the new narcissist’s offspring, and they are often seized on as evidence of our own “culture of narcissism”. The notion of networking as a façade for “antisocial impulses” is compelling, but in fact the most striking thing about the representation of self in these forums is its triviality. This may be one consequence of parental over-approval, the outpourings of a generation whose parents abstained from criticising them and instead hung on their every word and deed. The belief that you are very important, in other words, could be genuine – of course the world wants to know what you had for lunch.
 
Talking about your “favourite subject”, in this context, is not just permissible but mandatory: displaying the culturally approved degree of self-love is a sign of narcissistic “health”. In its “healthy” guise, narcissism bears no relation to Vaknin’s vacated inner space, for the defining characteristic of contemporary “healthy” narcissism is banality. The psychoanalytic literature concurs in finding mental activity in itself to be narcissistic: thinking is an act of libidinal appropriation, in which the self removes its attention from the object. The “I” word, in fact, is as dirty as it ever was, when caught in the act of pursuing its own truth. Instead, the duty of the contemporary “I” is to confess itself in public, to dismiss itself by surrendering to an agreed social narrative as rigid in its permissiveness as it once was in its conservatism. According to that narrative, if you’re not your own “favourite subject”, there is something wrong with you. “Health” requires it, and thinking is unhealthy. Hence what looks like a series of consequences – that in a culture of relentless disclosure we have become obsessed with rights of privacy – is in fact a set of concurrently held and contradictory beliefs. Self-disclosure is one thing; self-examination quite another.
 
When Tracey Emin’s Everyone I Have Ever Slept With 1963-1995 went up in smoke in the 2004 Momart warehouse fire, there was unseemly jubilation in the right-wing press: Tracey’s tent represented the cardinal sin of “confessional” art. Emin is an artist who is often called narcissistic, and there are many ways in which she – and more specifically, the tent – illustrates the contemporary misappropriation of the term. The problem with confessional art, in the eyes of its critics, is that it conflates the trivial and the serious; the more the self is trivialised, the more abhorrent to culture this conflation will seem. In other words, the tent was shocking not because it disclosed what was private and personal, but because it was called “art”. More than that, its disclosures were not “healthy”. And finally, the tent was not tragic. This was very annoying, and made its incineration seem like a piece of poetic justice. Had the tent been self-loathing, it might have fitted into the narrative of ascent: a girl regrets her chequered past and goes on to become a famous artist, selling her work for vast sums. But like Louise Bourgeois, Emin reprised feminine skills of needlework in order to represent a subjection in which self-discipline and self-care survived; a female art signifying not tragedy, but dignity. The tent is a piece of storytelling – it is commemorative, for keeping. A “confession”, on the other hand, is something to be thrown away in the hope of absolution. The “confessional” work, strictly speaking, is an admission of abnormality made out of the desire to become normal.
 
Emin has had great play in and on the contemporary obsession with narcissism, outwitting it at every turn. Her exertions demonstrate how hard it has become to serve the autobiographical impulse and raise the question of why the “I” word is such a locus of contradiction. Paradoxically, in a climate of unfettered disclosure, the artist is abhorred for examining herself. 
 
Recently I participated in a literary event at which a number of memoirists read from their work. It was striking how many of them assured the audience that “this is not about me”. It seemed that the only legitimate excuse for writing autobiography was to present it as a kind of war report – I was there, I witnessed it, but this is not about me. And there was nothing shamefaced about it: what these writers were saying, in fact, was that their work was “serious”, that although it looked like autobiography (triviality) it was actually diligent documentary (art). Tracey Emin’s statement is the reverse: “This is all about me.” What Emin has understood, as the Obamas have understood, is the notion of autobiographical occasion, whereby the self is not merely declarative but representative; is, in other words, the best example of what it is trying to say.
 
There are places in the social narrative where the form has to become autobiographical in order to advance itself; history passes to the individual for a while, as when a black man becomes president of the United States of America, or a working-class woman becomes one of the most powerful artists of her era. The story of how this came to be is not the story of one exceptional person: rather, that person is able to express and illustrate change through their own being. Self-portraiture was the best way for Rembrandt to describe the ascent of the self and the new relationship with worldliness and death it betokened. At the other end of history, Emin’s tent documents not just the changed status of the female body, but the contemporary problem of “the personal” itself, a representation she has pushed to the limit by making it co-extensive with “Tracey Emin”.
 
To return to Sylvia Plath . . . Literary culture has a far less comfortable relationship with self-analysis and self-portraiture than the visual arts, which is the mark of its conservatism. The openly self-analysing writer will be pilloried for talking about his or her “favourite subject”, for bad manners in using the “I” word. The literary reverence for the idea of “imagination”, as well as for history and for tales of “otherness”, is perhaps another iteration of what Virginia Woolf observed to be the culturally sanctioned “important” (male) subjects for the novel. The more “other” a text, the less it can be believed to be narcissistic; if the personal is trivial, the impersonal is “important”. 
 
Freud described narcissism as a tactic, a libidinal position, as Sergio Benvenuto puts it, “taken, for example, when a human being is in physical pain. Classical neurotic suffering drags narcissism along, because being neurotic in Freudian terms means not knowing what one desires. This uncertainty, or puzzling state of gaping desire, hauls along narcissistic constellations.”
 
A writer may indeed be someone driven by “classical neurotic suffering”, but, to quote Benvenuto again, “The symptom of narcissism is fascination . . . for Freud, our narcissistic love for ourselves is never natural, or primary.” Personal truth – the self-portrait – is in fact the opposite of narcissistic. Rather, the narcissistic artist is tactically seductive and charming, and imagination can be one such tactic. Writers may be narcissists, dragging along evolving literary constructions, but the central preoccupation of the narcissist is the avoidance of self-exposure while garnering attention and praise.
 
Whether or not “narcissism” is misunderstood and misused, its usage is puritanical: it is intended to inflict shame. Often the so-called narcissist’s self-exposure – the very thing that makes him vulnerable – has already been rewarded, if only by the attention of his critics; hence their anger. The accusation becomes an echo chamber, as in Ovid’s telling of the myth, where Narcissus and Echo can only say, “Who are you?” to one another, in a conversation that can never progress. This reflexive relationship leads both parties into upset and madness: Echo runs away, and Narcissus, driven by thirst, goes to the waters wherein his mother was once trapped and seduced, and where he was conceived. He becomes fixated with this source of self, on whose surface his own image floats: he doesn’t recognise the image and mistakes it for a being that might reciprocate. Yet he is thrilled at last to feel something, to feel love. When he tries to approach the image, it disappears. When he retreats, it comes back again.
 
It’s a pretty concept, and one that does indeed describe the struggle of creativity. The eventual result of this impasse is transformation. What is human in Narcissus dies and distils itself: his self-absorption bears fruit and is bequeathed to the world, becoming a flower that grows at the water’s edge, where he himself began.
 
Rachel Cusk’s most recent book is “Aftermath: On Marriage and Separation” (Faber & Faber, £8.99)

The English Revolt

Brexit, Euroscepticism and the future of the United Kingdom.

English voters have led – some would say forced – the United Kingdom towards exit from the European Union. Was this an English revolt, the result of an upsurge over decades of a more assertive, perhaps resentful, sense of English identity? At one level, clearly so. Surveys indicate that individuals who most often describe themselves as “English”, and regions where this is common, were more inclined to vote Leave on 23 June. Some of these are poorer regions where marginalised people think that their voices are more likely to be heard in a national democracy than in an international trading bloc, and for whom patriotism is a source of self-respect. But it would only make sense to regard Leave as essentially an English reaction if discontent with the EU were confined to England, or specifically linked with feelings of Englishness.

In fact, negative opinions about the EU, and especially about its economic policy, are now more widespread in other countries than they are in England. Polls by the Pew Research Centre last month showed that disapproval of the EU was as high in Germany and the Netherlands as in Britain, and higher in France, Greece and Spain. Though aggravated by the 2007-2008 crash and enforced policies of austerity, a decline in support was clear earlier. France’s referendum of May 2005 gave a 55 per cent No to the proposed EU constitution after thorough debate, and a now familiar pattern emerged: enthusiastic Europeanism was confined to the wealthiest suburbs and quarters of Paris, and the only professional groups that strongly voted Yes were big business, the liberal professions and academics.

Going far beyond the atavistic and incoherent English revolt that some think they discern, our referendum result is partly a consequence of transnational political phenomena across the democratic world: the disaffection of citizens from conventional politics, shown by falling turnouts for elections, shrinking party membership and the rise of new, sometimes extreme political movements; as well as the simultaneous detachment of a professional political class from civil society, and its consequent retreat into a closed world of institutions.

The EU embodies these phenomena in uniquely acute form. In several cases its central bodies have opposed – or, if one prefers, have been forced to deny – democratically expressed wishes. In Greece and Italy, the EU has enforced changes of government and policy, and in Denmark, Ireland and the Netherlands it has pressed countries to ignore or reverse popular referendums. Its own representative body, the European Parliament, has gained neither power nor legitimacy. Crucial decisions are taken in secret, making the EU a hiding place for beleaguered politicians as well as a source of lavish financial reward for insiders. In the words of the historian John Gillingham, Europe is now being governed by neither its peoples nor its ideals, but by a bank board. This is not the “superstate” of Eurosceptic mythology. Though it drains power and legitimacy away from national governments, it is incapable of exercising power effectively itself, whether to cope with short-term emergencies such as an inflow of refugees, or to solve chronic failings such as the creation of mass unemployment in southern Europe. The result is paralysis, the inability either to extricate itself from failing institutions or to make them work.

If popular discontent with the EU continues to increase (and it is hard to see how it could not) sooner or later there will be some unmanageable political or social crisis. The response of too many supporters of the EU is to screw the lid down tighter, including now by promising to make life difficult for the United Kingdom, pour décourager les autres. This is the organisation – unpopular, unaccountable, secretive, often corrupt, and economically failing – from which our decision to depart apparently causes people to weep in the streets.

***

Why this decision? Why in Britain? The simplest and perhaps the best answer is that we have had a referendum. If France, Greece, Italy and some other countries had been given the same choice, they might well have made the same decision. But of course they have not been and will not be given such a choice, barring severe political crisis. This is most obviously because countries that have adopted the euro – even those such as Greece, for which the IMF has predicted high unemployment at least until the 2040s – have no clear way out.

I make this obvious point to emphasise that the immediate explanation of what has happened lies not only and not mainly in different feelings about the EU in Britain, but in different political opportunities and levels of fear. The contrasting votes in Scotland and Northern Ireland have particular explanations. Scottish nationalists – like their counterparts in Catalonia – see the EU as an indispensable support for independence. Northern Ireland sees the matter primarily as one affecting its own, still tense domestic politics and its relations with the Republic. In a European perspective, Scotland and Northern Ireland are the outliers, not England and Wales. Indeed, Scotland’s vote makes it stand out as one of the most pro-EU countries in Europe. If ever there is another referendum to see whether Scots prefer the EU to the UK, it will show whether this level of support for the EU is solid.

If England is exceptional, it is not in its disaffection from the EU, nor in the political divisions the referendum vote has exposed (if France, for instance, had such a vote, one could expect blood in the streets). Rather, its exceptional characteristic is its long-standing and settled scepticism about the European project in principle, greater than in any other EU country. Every member has a specific history that shapes its attitude to the theoretical idea of European integration. As John Gillingham, one of the most perceptive historians of the EU, describes its beginnings: “to the French [supranationalism was] a flag of convenience, to the Italians it was preferable (by definition) to government by Rome, to the Germans a welcome escape route, and to the Benelux nations a better choice than being dominated by powerful neighbours”.

Subsequently, for the eastern European states, it was a decisive step away from communist dictatorship, and for southern Europe a line drawn under a traumatic history of civil conflict. There is also a widespread belief, powerful though fanciful, that the EU prevents war between the European states. All these are important reasons why there remains considerable support for unification as an aspiration. But all these reasons are weaker, and some of them non-existent, in Britain, and especially in England. The simple reason for this is that Britain’s experience of the 20th century was far less traumatic. Moreover, during that time loyalty to the nation was not tarnished with fascism, but was rather the buttress of freedom and democracy. Conversely, the vision of a European “superstate” is seen less as a guarantee of peace and freedom, and rather as the latest in a five-century succession of would-be continental hegemons.

Given all this, an obvious question is why the United Kingdom ever joined in the European project in the first place. The answer helps to explain the country’s subsequent lack of enthusiasm. Its first response to the creation of the European Economic Community in 1957 was not to join, but to agree to establish a separate European Free Trade Association (Efta) in 1959 with Austria, Denmark, Norway, Portugal, Sweden and Switzerland; over the next three decades the seven founder members were joined by Finland, Iceland and Liechtenstein. This worked efficiently, cheaply and amicably, and, in time, Efta and the EEC would doubtless have created trading arrangements and systems of co-operation. But then the historic mistake was made. Efta was considered too small to provide the diplomatic clout craved by Whitehall at a time of severe post-imperial jitters. A cabinet committee warned in 1960 that “if we try to remain aloof from [the EEC] – bearing in mind that this will be happening simultaneously with the contraction of our overseas possessions – we shall run the risk of losing political influence and of ceasing to be able to exercise any real claim to be a world Power”.

Besides, Washington disliked Efta as a barrier to its aim of a federal Europe, and the Americans put heavy pressure on London to apply to accede to the Treaty of Rome, which it duly did in August 1961. “It is only full membership, with the possibility of controlling and dominating Europe,” wrote an optimistic British cabinet official, “that is really attractive.”

As the former US secretary of state Dean Acheson (one of the early backers of European integration) put it, in a now celebrated comment in December 1962: “Great Britain has lost an empire, and has not yet found a role. The attempt to play a separate power role . . . apart from Europe . . . based on a ‘special relationship’ with the United States [or] on being the head of a ‘Commonwealth’ . . . – this role is about played out.”

Acheson’s words long haunted British policymakers; perhaps they still do. And yet Britain remains one of the half-dozen strongest and most assertive states anywhere in the world, just as it has been for the past three centuries.

To fear of diplomatic marginalisation was added fear of economic decline. A government report in 1953 warned of “relegation of the UK to the second division”. Over the next 30 years there was a chorus of dismay about “the sick man of Europe”. Belief that EEC membership at any price was the only cure for Britain’s perceived economic ills became the orthodoxy in official circles: Britain was “the sinking Titanic”, and “Europe” the lifeboat.

So, on 1 January 1973 Britain formally entered the EEC with Denmark and Ireland. Other Efta members remained outside the Community – Switzerland and Norway for good. Harold Wilson’s 1975 referendum on whether to stay in the EEC in effect turned on Europe’s superior economic performance – which, though no one realised it at the time, had just ended.

This memory of apparent British economic weakness half a century ago still seems to weigh with older Remainers. Yet it was based on a fundamental misconception: that European growth rates were permanently higher than in a supposedly outdated and declining Britain. In reality, faster growth on the mainland in the 1950s and 1960s was due to one-off structural modernisation: the large agricultural workforce shifted into more productive industrial employment. From the mid-1940s to the early 1970s this gave several European countries “windfall growth” at a higher rate than was possible in Britain, which since the 19th century had had no large agricultural sector to convert. By the early 1970s, once that catching up was finished, European growth rates became the same as, or slightly lower than, Britain’s. When measured over the whole half-century from 1950 to 2000, Britain’s economic performance was no different from the European norm. By the mid-1980s, growth was faster than in France and Germany, and today Britain’s economic fundamentals remain strong.

Slower European growth lessened the perceived attractiveness of EU integration. In 1992, on Black Wednesday (16 September), hesitant participation in the European Exchange Rate Mechanism led to forced devaluations in Finland, Sweden, Italy, Spain and, finally, Britain. This was a huge political shock, though an economic boost.

Black Wednesday subsequently made it politically difficult for Britain to join the eurozone – allowing us a narrow escape, attributable more to circumstance than to policy, as vocal political and economic lobbies urged joining.

Moreover, Britain’s trade with the rest of the EU was declining as a proportion of its global activity: as Gordon Brown observed in 2005, 80 per cent of the UK’s potential trade lay outside the EU. The EU’s single market proved not very effective at increasing trade between its members even before the crash of 2007-2008, and prolonged austerity thereafter made it stagnant. Consequently, in the 2016 referendum campaign, more emphasis was placed on the dangers of leaving the single market than on the precise benefits of being in it.

But the days when Britain seemed the Titanic and Europe the lifeboat were long gone. On the contrary, Britain, with its fluid and largely unregulated labour market, had become the employer of last resort for the depressed countries of the eurozone. The sustained importation of workers since the 1990s had become, for a large part of Britain’s working class, the thing that most obviously outweighed whatever legal or economic advantages the EU might theoretically offer.

***

What galvanised the vote for Brexit, I think, was a core attachment to national democracy: the only sort of democracy that exists in Europe. That is what “getting our country back” essentially means. Granted, the slogan covers a multitude of concerns and wishes, some of them irreconcilable; but that is what pluralist democracy involves. Britain has long been the country most resistant to ceding greater powers to the EU: opinion polls in the lead-up to the referendum showed that only 6 per cent of people in the UK (compared to 34 per cent in France, for instance, and 26 per cent in Germany) favoured increased centralisation – a measure of the feebleness of Euro-federalism in Britain.

In contrast, two-thirds wanted powers returned from the EU to the British government, with a majority even among the relatively Europhile young. This suggests a much greater opposition to EU centralisation than shown by the 52 per cent vote for Brexit. The difference may be accounted for by the huge pressure put on the electorate during the campaign. Indeed, arithmetic suggests that half even of Remain voters oppose greater powers being given to the EU. Yet its supporters regard an increase of EU control over economic and financial decisions – the basics of politics – as indispensable if the EU is to survive, because of the strains inherent in the eurozone system. This stark contradiction between the decentralisation that many of the peoples of Europe – and above all the British – want to see and the greater centralisation that the EU as an institution needs is wilfully ignored by Remain supporters. Those who deplore the British electorate’s excessive attachment to self-government as some sort of impertinence should be clear (not least with themselves) about whether they believe that the age of democracy in Europe is over, and that great decisions should be left to professional politicians, bureaucracies and large corporations.

Some have dismissed the Leave vote as an incoherent and anarchic protest against “the establishment”, or as a xenophobic reaction against immigrants. Some of the media in Britain and abroad have been doing their best to propagate this view. Yet xenophobia has not been a significant feature of British politics since the 1960s, and certainly far less so than in many obedient EU member states, including France, Germany, Greece and the Netherlands. As for the anti-establishment “revolt”, this emerged when parts of the establishment began to put organised pressure on the electorate to vote Remain. Would-be opinion-formers have hardly covered themselves in glory in recent weeks. They have been out of touch and out of sympathy with opinion in the country, unwilling or unable to engage in reasoned debate, and resorting to collective proclamations of institutional authority which proved embarrassingly ineffective.

Worst of all, their main argument – whether they were artists, actors, film-makers, university vice-chancellors or prestigious learned societies – was one of unabashed self-interest: the EU is our milch-cow, and hence you must feed it. This was a lamentable trahison des clercs. The reaction to the referendum result by some Remain partisans has been a monumental fit of pique that includes talking up economic crisis (which, as Keynes showed, is often self-fulfilling) and smearing 17 million Leave voters as xenophobes. This is both irresponsible and futile, and paves the way to political marginalisation.

The Queen’s call for “deeper, cooler consideration” is much needed. I recall Victor Hugo’s crushing invective against French elitists who rejected the verdict of democracy, when in 1850 he scorned “your ignorance of the country today, the antipathy that you feel for it and that it feels for you”.

This antipathy has reduced English politics to a temporary shambles. It is too early to say whether there will be some realignment of the fragments: One-Nation Toryism, Conservative neoliberalism, “new” and “old” Labour, the hibernating Liberal Democrats and Greens, the various nationalists and, of course, the unpredictable Ukip. When in the past there were similar crises – such as Labour’s rift over the national government in 1931, the Liberals’ split over Irish home rule in 1886, or the Tory fragmentation over the repeal of the Corn Laws in 1846 – the political balance was permanently changed.

***

Many Europeans fear that a breakdown of the EU could slide into a return to the horrors of the mid-20th century. Most people in Britain do not. The fundamental feature of the referendum campaign was that the majority was not frightened out of voting for Leave, either by political or by economic warnings. This is testimony to a significant change since the last referendum in 1975: most people no longer see Britain as a declining country dependent on the EU.

A Eurobarometer poll in 2013 showed that Britain was the only EU member state in which most citizens felt that they could face the future better outside the Union. Last month’s referendum reflected this view, which was not reversed by reiterated predictions of doom.

In retrospect, joining the Common Market in 1973 has proved an immense historic error. It is surely evident that we would not have been applying to join the EU in 2016 had we, like Norway or Switzerland, remained outside it. Yet the political and possibly economic costs of leaving it now are considerable. Even though discontent with the EU across much of Europe has recently overtaken sentiment in Britain, Britain is unique, in that, ever since the 1970s, its public has been consistently far less favourable to the idea of European integration than the electorate in any other country. Hence the various “opt-outs” and the critically important decision to remain outside the euro.

Now, by a great historic irony, we are heading towards the sort of associate status with the EU that we had in the late 1960s as the leading member of Efta, and which we could have kept. Instead, this country was led by its political elite, for reasons of prestige and because of exaggerated fears of national decline and marginalisation, into a vain attempt to be “at the heart of Europe”. It has been a dangerous illusion, born of the postwar declinist obsession, that Britain must “punch above its weight” both by following in the footsteps of the United States and by attaching itself to the EU.

For some, money, blood and control over our own policy were sacrifices worth making for a “seat at the top table”. This dual strategy has collapsed. In future we shall have to decide what is the appropriate and desirable role for Britain to play in the world, and we shall have to decide it for ourselves.

Robert Tombs is Professor of French History at Cambridge University. His most recent book is “The English and Their History” (Penguin)

This article first appeared in the 21 July 2016 issue of the New Statesman, The English Revolt