
Fitter, dumber, more productive

How the craze for Apple Watches, Fitbits and other wearable tech devices revives the old and discredited science of behaviourism.

When Tim Cook unveiled the latest operating system for the Apple Watch in June, he described the product in a remarkable way. This is no longer just a wrist-mounted gadget for checking your email and social media notifications; it is now “the ultimate device for a healthy life”.

With the watch’s fitness-tracking and heart rate-sensor features to the fore, Cook explained how its Activity and Workout apps have been retooled to provide greater “motivation”. A new Breathe app encourages the user to take time out during the day for deep breathing sessions. Oh yes, this watch has an app that notifies you when it’s time to breathe. The paradox is that if you have zero motivation and don’t know when to breathe in the first place, you probably won’t survive long enough to buy an Apple Watch.

The watch and its marketing are emblematic of how the tech trend is moving beyond mere fitness tracking into what one might call quality-of-life tracking and algorithmic hacking of the quality of consciousness. A couple of years ago I road-tested a brainwave-sensing headband, called the Muse, which promises to help you quiet your mind and achieve “focus” by concentrating on your breathing as it provides aural feedback over earphones, in the form of the sound of wind at a beach. I found it turned me, for a while, into a kind of placid zombie with no useful “focus” at all.

A newer product even aims to hack sleep – that productivity wasteland, which, according to the art historian and essayist Jonathan Crary’s book 24/7: Late Capitalism and the Ends of Sleep, is an affront to the foundations of capitalism. So buy an “intelligent sleep mask” called the Neuroon to analyse the quality of your sleep at night and help you perform more productively come morning. “Knowledge is power!” it promises. “Sleep analytics gathers your body’s sleep data and uses it to help you sleep smarter!” (But isn’t one of the great things about sleep that, while you’re asleep, you are perfectly stupid?)

The Neuroon will also help you enjoy technologically assisted “power naps” during the day to combat “lack of energy”, “fatigue”, “mental exhaustion” and “insomnia”. When it comes to quality of sleep, of course, numerous studies suggest that late-night smartphone use is very bad, but if you can’t stop yourself using your phone, at least you can now connect it to a sleep-enhancing gadget.

So comes a brand new wave of devices that encourage users to outsource not only their basic bodily functions but – as with the Apple Watch’s emphasis on providing “motivation” – their very willpower. These are thrillingly innovative technologies and yet, in the way they encourage us to think about ourselves, they implicitly revive an old and discarded school of thinking in psychology. Are we all neo-behaviourists now?

***

The school of behaviourism arose in the early 20th century out of a virtuous scientific caution. Experimenters wished to avoid anthropomorphising animals such as rats and pigeons by attributing to them mental capacities for belief, reasoning, and so forth. This kind of description seemed woolly and impossible to verify.

The behaviourists discovered that the actions of laboratory animals could, in effect, be predicted and guided by careful “conditioning”, involving stimulus and reinforcement. They then applied Ockham’s razor: there was no reason, they argued, to believe in elaborate mental equipment in a small mammal or bird; at bottom, all behaviour was just a response to external stimulus. The idea that a rat had a complex mentality was an unnecessary hypothesis and so could be discarded. The psychologist John B Watson declared in 1913 that behaviour, and behaviour alone, should be the whole subject matter of psychology: to project “psychical” attributes on to animals, he and his followers thought, was not permissible.

The problem with Ockham’s razor, though, is that sometimes it is difficult to know when to stop cutting. And so more radical behaviourists sought to apply the same lesson to human beings. What you and I think of as thinking was, for radical behaviourists such as the Yale psychologist Clark L Hull, just another pattern of conditioned reflexes. A human being was merely a more complex knot of stimulus responses than a pigeon. Once perfected, some scientists believed, behaviourist science would supply a reliable method to “predict and control” the behaviour of human beings, and thus all social problems would be overcome.

It was a kind of optimistic, progressive version of Nineteen Eighty-Four. But it fell sharply from favour after the 1960s, and the subsequent “cognitive revolution” in psychology emphasised the causal role of conscious thinking. What became cognitive behavioural therapy, for instance, owed its impressive clinical success to focusing on a person’s cognition – the thoughts and the beliefs that radical behaviourism treated as mythical. As CBT’s name suggests, however, it mixes cognitive strategies (analyse one’s thoughts in order to break destructive patterns) with behavioural techniques (act a certain way so as to affect one’s feelings). And the deliberate conditioning of behaviour is still a valuable technique outside the therapy room.

The effective “behavioural modification programme” first publicised by Weight Watchers in the 1970s is based on reinforcement and support techniques suggested by the behaviourist school. Recent research suggests that clever conditioning – associating the taking of a medicine with a certain smell – can boost the body’s immune response later when a patient detects the smell, even without a dose of medicine.

Radical behaviourism that denies a subject’s consciousness and agency, however, is now completely dead as a science. Yet it is being smuggled back into the mainstream by the latest life-enhancing gadgets from Silicon Valley. The difference is that, now, we are encouraged to outsource the “prediction and control” of our own behaviour not to a benign team of psychological experts, but to algorithms.

It begins with measurement and analysis of bodily data using wearable instruments such as Fitbit wristbands, the first wave of which came under the rubric of the “quantified self”. (The Victorian polymath and founder of eugenics, Francis Galton, asked: “When shall we have anthropometric laboratories, where a man may, when he pleases, get himself and his children weighed, measured, and rightly photographed, and have their bodily faculties tested by the best methods known to modern science?” He has his answer: one may now wear such laboratories about one’s person.) But simply recording and hoarding data is of limited use. To adapt what Marx said about philosophers: the sensors only interpret the body, in various ways; the point is to change it.

And the new technology offers to help with precisely that, offering such externally applied “motivation” as the Apple Watch. So the reasoning, striving mind is vacated (perhaps with the help of a mindfulness app) and usurped by a cybernetic system to optimise the organism’s functioning. Electronic stimulus produces a physiological response, as in the behaviourist laboratory. The human being herself just needs to get out of the way. The customer of such devices is merely an opaquely functioning machine to be tinkered with. The desired outputs can be invoked by the correct inputs from a technological prosthesis. Our physical behaviour and even our moods are manipulated by algorithmic number-crunching in corporate data farms, and, as a result, we may dream of becoming fitter, happier and more productive.

***


The broad current of behaviourism was not homogeneous in its theories, and nor are its modern technological avatars. The physiologist Ivan Pavlov induced dogs to salivate at the sound of a bell, which they had learned to associate with food. Here, stimulus (the bell) produces an involuntary response (salivation). This is called “classical conditioning”, and it is advertised as the scientific mechanism behind a new device called the Pavlok, a wristband that delivers mild electric shocks to the user in order, so it promises, to help break bad habits such as overeating or smoking.

The explicit behaviourist-revival sell here is interesting, though it is arguably predicated on the wrong kind of conditioning. In classical conditioning, the stimulus evokes the response; but the Pavlok’s painful electric shock is a stimulus that comes after a (voluntary) action. This is what the psychologist who became the best-known behaviourist theoretician, B F Skinner, called “operant conditioning”.

By associating certain actions with positive or negative reinforcement, an animal is led to change its behaviour. The user of a Pavlok treats herself, too, just like an animal, helplessly suffering the gadget’s painful negative reinforcement. “Pavlok associates a mild zap with your bad habit,” its marketing material promises, “training your brain to stop liking the habit.” The use of the word “brain” instead of “mind” here is revealing. The Pavlok user is encouraged to bypass her reflective faculties and perform pain-led conditioning directly on her grey matter, in order to get from it the behaviour that she prefers. And so modern behaviourist technologies act as though the cognitive revolution in psychology never happened, encouraging us to believe that thinking just gets in the way.

Technologically assisted attempts to defeat weakness of will or concentration are not new. In 1925 the inventor Hugo Gernsback announced, in the pages of his magazine Science and Invention, an invention called the Isolator. It was a metal, full-face hood, somewhat like a diving helmet, connected by a rubber hose to an oxygen tank. The Isolator, too, was designed to defeat distractions and assist mental focus.

The problem with modern life, Gernsback wrote, was that the ringing of a telephone or a doorbell “is sufficient, in nearly all cases, to stop the flow of thoughts”. Inside the Isolator, however, sounds are muffled, and the small eyeholes prevent you from seeing anything except what is directly in front of you. Gernsback provided a salutary photograph of himself wearing the Isolator while sitting at his desk, looking like one of the Cybermen from Doctor Who. “The author at work in his private study aided by the Isolator,” the caption reads. “Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand.”

Modern anti-distraction tools such as computer software that disables your internet connection, or word processors that imitate an old-fashioned DOS screen, with nothing but green text on a black background, as well as the brain-measuring Muse headband – these are just the latest versions of what seems an age-old desire for technologically imposed calm. But what do we lose if we come to rely on such gadgets, unable to impose calm on ourselves? What do we become when we need machines to motivate us?

***

It was B F Skinner who supplied what became the paradigmatic image of behaviourist science with his “Skinner Box”, formally known as an “operant conditioning chamber”. Skinner Boxes come in different flavours but a classic example is a box with an electrified floor and two levers. A rat is trapped in the box and must press the correct lever when a certain light comes on. If the rat gets it right, food is delivered. If the rat presses the wrong lever, it receives a painful electric shock through the booby-trapped floor. The rat soon learns to press the right lever all the time. But if the levers’ functions are changed unpredictably by the experimenters, the rat becomes confused, withdrawn and depressed.

Skinner Boxes have been used with success not only on rats but on birds and primates, too. So what, after all, are we doing if we sign up to technologically enhanced self-improvement through gadgets and apps? As we manipulate our screens for reassurance and encouragement, or wince at a painful failure to be better today than we were yesterday, we are treating ourselves similarly as objects to be improved through operant conditioning. We are climbing willingly into a virtual Skinner Box.

As Carl Cederström and André Spicer point out in their book The Wellness Syndrome, published last year: “Surrendering to an authoritarian agency, which is not just telling you what to do, but also handing out rewards and punishments to shape your behaviour more effectively, seems like undermining your own agency and autonomy.” What’s worse is that, increasingly, we will have no choice in the matter anyway. Gernsback’s Isolator was explicitly designed to improve the concentration of the “worker”, and so are its digital-age descendants. Corporate employee “wellness” programmes increasingly encourage or even mandate the use of fitness trackers and other behavioural gadgets in order to ensure an ideally efficient and compliant workforce.

There are many political reasons to resist the pitiless transfer of responsibility for well-being on to the individual in this way. And, in such cases, it is important to point out that the new idea is a repackaging of a controversial old idea, because that challenges its proponents to defend it explicitly. The Apple Watch and its cousins promise an utterly novel form of technologically enhanced self-mastery. But it is also merely the latest way in which modernity invites us to perform operant conditioning on ourselves, to cleanse away anxiety and dissatisfaction and become more streamlined citizen-consumers. Perhaps we will decide, after all, that tech-powered behaviourism is good. But we should know what we are arguing about. The rethinking should take place out in the open.

In 1987, three years before he died, B F Skinner published a scholarly paper entitled Whatever Happened to Psychology as the Science of Behaviour?, reiterating his now-unfashionable arguments against psychological talk about states of mind. For him, the “prediction and control” of behaviour was not merely a theoretical preference; it was a necessity for global social justice. “To feed the hungry and clothe the naked are remedial acts,” he wrote. “We can easily see what is wrong and what needs to be done. It is much harder to see and do something about the fact that world agriculture must feed and clothe billions of people, most of them yet unborn. It is not enough to advise people how to behave in ways that will make a future possible; they must be given effective reasons for behaving in those ways, and that means effective contingencies of reinforcement now.” In other words, mere arguments won’t equip the world to support an increasing population; strategies of behavioural control must be designed for the good of all.

Arguably, this authoritarian strand of behaviourist thinking is what morphed into the subtly reinforcing “choice architecture” of nudge politics, which seeks gently to compel citizens to do the right thing (eat healthy foods, sign up for pension plans) by altering the ways in which such alternatives are presented.

By contrast, the Apple Watch, the Pavlok and their ilk revive a behaviourism evacuated of all social concern and designed solely to optimise the individual customer. By using such devices, we voluntarily offer ourselves up to a denial of our voluntary selves, becoming atomised lab rats, to be manipulated electronically through the corporate cloud. It is perhaps no surprise that when the founder of American behaviourism, John B Watson, left academia in 1920, he went into a field that would come to profit very handsomely indeed from his skills of manipulation – advertising. Today’s neo-behaviourist technologies promise to usher in a world that is one giant Skinner Box in its own right: a world where thinking just gets in the way, and we all mechanically press levers for food pellets.

This article first appeared in the 18 August 2016 issue of the New Statesman, Corbyn’s revenge


Twilight of the postwar era

This Brexit-focused election is just one milestone in a long and complex relationship between the UK and the EU.

On 25 March the European Union celebrated its 60th birthday in Rome. Of the 28 members, only the United Kingdom declined to attend, signalling, to quote one senior EU diplomat, that it didn’t think the occasion was “appropriate for us”. The Daily Express called this a blatant “snub” to Brussels.

On 29 March Theresa May sent her “Dear Donald” letter – not, of course, to that dear Donald but to “President Tusk” at the EU in Brussels. It was delivered by a senior British diplomat with an antique and strained politesse reminiscent of his predecessors in Berlin in August 1914 and September 1939.

On 18 April the PM declared that it was in the national interest to hold a snap general election on 8 June, having five times in person or through official sources denounced the idea of going to the country before the set date in 2020.

On 29 April, a month after the PM’s letter, Donald Tusk secured agreement from the remaining 27 member states for the EU’s negotiating guidelines.

The following day the press reported a total face-off between May and Jean-Claude Juncker, the head of the European Commission, and EU negotiators at a Downing Street dinner. She was living “in a different galaxy”, Juncker is said to have exclaimed. May dismissed the story as “Brussels gossip”. But then, on 3 May, in an address outside 10 Downing Street, the Prime Minister hit back, accusing senior EU politicians and officials of meddling in the British election campaign.

Whom you believe depends, as usual, on which side of our national chasm you are standing. Of one thing we can be sure. The spin and the propaganda will go on remorselessly, day after day, for years to come, as this country tries to talk its way out of a European union in which it has never felt at home. To keep our bearings amid the dizzying intergalactic spin, it is worth taking a longer view. Because history matters in this debate and few of our “leaders” seem to have any historical perspective.

***

At 60 the EU is a senior citizen – rather stiff in the joints, grossly overweight and often a bit of a bore. It’s hard now to recall the heady hopes that its birth aroused. After two ruinous wars in three decades, many western European leaders were determined to escape from the vortex of belligerent nationalism.

Six countries signed the original Treaty of Rome in March 1957 to set up the European Economic Community. The EEC was a common market and customs union between Belgium, France, Luxembourg, the Netherlands and two defeated Axis powers from 1945 – Italy and West Germany. Britain could have been present at the creation; in fact, most of the six wanted us to join. But then, as now, the message was: “We don’t think it is appropriate for us.”

In part, the motives behind founding the EEC were economic. Hard borders and high tariffs would hamper recovery after the war. Belgium, Luxembourg and the Netherlands had already formed the Benelux customs union in 1948. They were also natural trading partners with Germany, sharing the Rhine-Meuse-Scheldt Delta, and Germany had vied with France for decades over the mineral resources of the Saar and the Ruhr. Now the six countries decided to pool these vital assets. The European Coal and Steel Community (ECSC) of 1952 was a stepping stone to the Treaty of Rome.

None of these states had abandoned the pursuit of national interests; rather, they were going about it in less confrontational ways. Electorates, still haunted by the Depression of the 1930s, now expected their governments not only to ensure order and security but also to stimulate growth and provide welfare. In these circumstances, some erosion of national sovereignty seemed necessary, even desirable. Prosperity wasn’t a zero-sum game, built on hard-nosed “us first” policies, but would be fostered by calculated yet enlightened interdependence. For the modern state, in short, absolute sovereignty could not be an end in itself.

That said, the essential imperative of European integration was not economic but political. For France and Germany, 1914 and 1939 were just the most recent manifestations of their bloody past, a cycle of wars that stretched back to the days of Bismarck, Napoleon and Louis XIV. Sedan 1870, Leipzig 1813, Jena 1806, Valmy 1792, Turckheim 1675 – the victories were emblazoned on public monuments and celebrated in school textbooks, the defeats quietly forgotten. European integration offered a chance for the French and the Germans to break free from centuries of tit-for-tat conflicts; a belated acceptance of the dictum “If you can’t beat them, join them”.

The Benelux countries were caught in the jaws of that Franco-German antagonism: whenever the two big beasts bit on each other, the three little ones felt the pain. Italy, the other founding member, was – like West Germany – desperate to jettison its pariah status from the Fascist era. So Rome 1957 served as a belated peace treaty, drawing a line under the Second World War for western Europe.

This zeal to transcend hard nationalism is seen most strikingly in the life of Robert Schuman, the man now celebrated as the “Father of Europe”. Born in 1886, Schuman grew up in Luxembourg but was educated at German universities and practised law in the city of Metz, in Lorraine – then part of Germany thanks to its victory in 1870-71. When the next war broke out in 1914, he was conscripted into the kaiser’s army: only medical problems saved him from having to fight against the French.

In 1919 France recovered Alsace and Lorraine, so Schuman became a French citizen and got into French politics. From 1942 to 1945 he fought in the wartime Resistance and then, amid France’s postwar kaleidoscopic politics, served variously as finance minister, prime minister and foreign minister. It was Schuman’s celebrated declaration of 9 May 1950 that paved the way for the ECSC and the Treaty of Rome.

Today the “Schuman roundabout” lies at the heart of the EU quarter in Brussels – an apt memorial, because his experience of the (un)merry-go-round of belligerent nationalism inspired his commitment to European integration. He was not alone. The West German chancellor Konrad Adenauer (born 1876) was a Rhinelander from Cologne who served as that city’s mayor from 1917 to 1933, until he was sacked by the Nazis. Over the years he had in turn chafed at Prussian domination of the Rhineland, feared French annexation, and endured two stretches of British military occupation.

The Italian premier Alcide De Gasperi (born 1881) had started his political life in the Austrian parliament before 1914, when his homeland, Trentino/South Tyrol, still belonged to the Habsburg empire. After the region was transferred to Italy in 1919, De Gasperi resumed his political career not in Vienna but in Rome, opposing first the Fascists and then the Communists.

The early lives of these three men along the shifting borderlands of war-torn Europe brought home to them the suicidal futility of hard nationalism. They also shared a profound sense of Catholic Europe, extending back through the Holy Roman empire to the era of Charlemagne.

It was from this historical platform that Schuman approached European integration. “If we don’t want to fall back into the old errors in dealing with the German problem,” he said, “there is only one solution: that is the European solution.” Coal and steel were an ideal starting point because they were double-edged – vital for industrial growth but also for waging war. Surrendering national control over these critical assets could enhance prosperity and peace.

***

The British approach to “Europe” was very different. In the mid-20th century Britain still saw itself as a global power. The sterling area took half of all British exports: western Europe, struggling to recover from the war, less than a quarter. In 1951 British industrial production equalled that of France and West Germany combined. And although Britain worked closely with France in 1947-49 over the Marshall Plan and the North Atlantic Treaty, its engagement with the Continent had clear limits.

“Our policy should be to assist Europe to recover, as far as we can,” senior British civil servants advised in 1949. “But the concept must be one of limited liability. In no circumstances must we assist them beyond the point at which the assistance leaves us too weak to be a worthwhile ally for USA if Europe collapses . . .”

“Limited liability” was a philosophy rooted in Britain’s experience of the war – also markedly different from that of the Six. In May and June of 1940, Germany conquered France, Belgium, Luxembourg and the Netherlands, with Italy jumping in to grab some of the spoils. That summer is now engraved in British national mythology. It was immortalised in David Low’s Very Well, Alone cartoon for the Evening Standard, depicting a pugnacious Tommy breathing defiance to the world from a rock in storm-tossed seas.

Victory was eventually achieved not with the Continentals, who seemed to be either foes or failures, but in alliance with those whom Churchill called “the English-speaking peoples” – above all, the United States. From this perspective, “sovereignty” clearly worked: we successfully defended our iconic southern border, the white cliffs of Dover, and gained ultimate victory. Only those who had been defeated (in 1940 or 1945) would imagine surrendering any national powers to a higher authority.

In 1950, therefore, when the Labour cabinet decided that the Schuman Plan was not appropriate for us, it was following the majority view in Whitehall and Westminster. Ernest Bevin, the ailing but still doughty foreign secretary who had led Britain’s drive for closer intergovernmental co-operation with France in the 1940s, had no time for the dread word “federalism”. In his inimitable phrase, “If you open that Pandora’s box, you never know what Trojan ’orses will jump out.” Pressed by the Americans to take these ideas more seriously, he questioned how he could go to his London dockland constituents in Woolwich, blitzed by the Luftwaffe in 1940, and explain that the Germans would help them in a war with Russia. As for France, he sniffed, “the man in the street, coming back from a holiday there, was almost invariably struck by the defeatist attitude of the French”. Great Britain, he exclaimed, was “not part of Europe”; she was “not simply a Luxembourg”.

This was a bipartisan attitude, endorsed by the Tories when they regained office in 1951. Churchill conjured up the image of three overlapping “circles” of global power, with Britain involved in each but not confined to any: the Commonwealth and empire; the “English-speaking world”; and, as he put it to the cabinet in November that year, “United Europe, to which we are a separate, closely and specially related ally and friend”. He and his successor Anthony Eden welcomed European integration for “them”, not “us” – as a way of reconciling France and Germany. After the Six embarked in 1956-57 on talks in Brussels about further integration, the British sent not a government minister but a Board of Trade official, and then merely as an “observer”.

The accepted wisdom in London remained that Britain’s trading interests were global and that a protectionist European bloc would be dangerous. Yet that kind of common market was not a foregone conclusion. Britain had a powerful potential ally within the Six in the form of West Germany, and especially its influential economics minister, Ludwig Erhard.

Almost as much as London, Bonn’s trading interests were global: 40 per cent of its exports went beyond Europe and much of West Germany’s European trade was outside the Six, with Austria, Scandinavia, Switzerland and the UK. Like the British, Erhard wanted a reduction of global tariff barriers to promote free trade, rather than the high-tariff, protectionist bloc favoured by Paris to defend France’s flabby economy. Yet a common market was inconceivable without the French, and Chancellor Adenauer – focused on postwar reconciliation – insisted that politics mattered more than economics. Erhard was told to get the best deal he could as long as France was “in”.

So that left the French able largely to dictate their terms. Among these were a steep external tariff, inclusion within the EEC of France’s overseas territories, acceptance across the Six of France’s high welfare payments and the development of a Common Agricultural Policy (Cap), which subsidised inefficient farming. By 1970 the Cap consumed 70 per cent of the EEC budget. But, as a senior Italian official observed ruefully, “Europe cannot organise without France and, to get her in, prices must be paid which may seem exorbitant.”

What would have happened if Britain had been fully engaged in these negotiations from the start? Might it have strengthened Erhard’s hand and helped forge a strong Anglo-German axis in favour of a looser, more open free-trade area? That would have put pressure on Paris to accept London and Bonn’s terms, or be left out in the cold. In which case European integration could have developed along very different lines, with a Franco-German-British triangle operating in creative tension at the heart of the new Europe in an EEC that, in effect, would have been 3 + 4. A tantalising “what if”, but it would have required a very different attitude in Britain towards its future and its past.

***

And so the EEC was born on New Year’s Day 1958 with six founder members, not seven. The British had been completely wrong-footed. In 1950 they expected Schuman’s pipe dream to go up in smoke; they were equally complacent about the Brussels talks in 1956-57; and they repeated the mistake yet again in assuming it would take years for the EEC to become a reality. Instead, not only was the EEC now a fact, but the Six made rapid progress in dismantling tariff barriers and agreeing the basics of the Cap. By 1961 they were seriously debating political union, or at least a common foreign policy.

London struggled to believe that those despised Continentals, who in their various ways had botched the Second World War, could bury the hatchet and work together. British complacency, even arrogance, has aptly been called the “price of victory”. And we’ve been paying the bill ever since.

Once the Six was up and running, there was a grave danger of Britain being marginalised. The European community threatened to become “the only Western bloc approaching in importance the Big Two – the USSR and the United States”, a senior Whitehall committee warned in 1960. Aside from the economic damage that would ensue, “if we try to remain aloof from them” Britain would “run the risk of losing political influence and of ceasing to be able to exercise any claim to be a world Power”. The economic case for membership was still finely balanced: commercial and emotional ties with the Commonwealth, strengthened by the war, remained strong. Yet, for Harold Macmillan, like Adenauer in 1956, politics took precedence over economics. In August 1961 his government applied to join the EEC.

But the price of victory kicked in again. Charles de Gaulle had not forgotten or forgiven Roosevelt and Churchill for treating his Free French as second-class members of the wartime alliance. A fierce nationalist, he accepted the European project but sought to turn it to France’s advantage, or his conception of this. Crucial to his strategy was keeping Britain out of the EEC.

“My dear chap, it’s very simple,” the French agriculture minister told his British counterpart. “At the moment, with the Six, there are five hens and one cock. If you join, with other countries, there will be perhaps seven or eight hens. But there will be two cocks. Well, that is not so pleasant.”

Determined to rule the roost, de Gaulle blocked first Macmillan’s application to join and then Harold Wilson’s. By the time he retired and Edward Heath had negotiated terms of entry, 15 years had elapsed since 1 January 1958. The original deal-making among the Six had set hard, to their advantage. Any new member had to accept the club rules as given: the “acquis communautaire”, in Eurospeak. Worse still, in 1973, just months after Denmark, Ireland and the UK had joined the community, the bottom fell out of the world economy with the oil crisis, recession and stagflation, making it nigh impossible amid all the crisis management to force the EEC into reform as Heath had hoped. The good ship Europe had been launched on the high tide of postwar prosperity. But as the Six became the Nine, that tide began to ebb. We have never had it so good – ever again.

Since the 1970s and Britain’s “entry” into Europe, successive prime ministers have tried to undo the damage caused by their aloof predecessors. Most have done so “alone” – in 1940 mode – rather than working to form alliances with reform-minded colleagues on the Continent. In particular, as in the mid-1950s, they failed to build creative partnerships with the Germans.

Margaret Thatcher was a notable example. Her cantankerous “handbagging” secured rebates on British budget contributions in excess of what probably could have been obtained by “normal” diplomacy, but it alienated many of her European colleagues. And her visceral suspicion of the Germans, dating back to the Second World War, poisoned relations with Bonn. “She doesn’t really believe that there’s any such thing as useful negotiation,” observed Sir Nicholas Henderson, a high-ranking British diplomat. “She doesn’t see foreign policy as it is, which is a lot of give and take.”

Yet Thatcher was only the extreme case. Even prime ministers who were more “pro-Europe”, such as John Major and Tony Blair, were hamstrung by domestic politics – meaning both the rooted Euroscepticism of Tory backbenchers and also the tabloids’ determination to treat every encounter with “Europe” as a replay of old battles. Woe betide any British PM who returned from Brussels without being able to proclaim victory in another Waterloo (though the 1815 battle was won in tandem with the Germans, plus Dutch and Belgian support).

The Brexit frenzy is only the latest round in that story. Even on the Remain side, the Cameron-Osborne campaign – a breathtaking blend of arrogance and incompetence – chose to make its case almost entirely by economic scaremongering about the dangers of Leave (through “Project Fact”, aka “Project Fear”), rather than highlighting positives of the European project, especially its enduring contribution to postwar peace.

Of course, the EU has often been its own worst enemy. Reform has been slow: the CAP, for instance, accounted for 73 per cent of total EU spending as late as 1985 and did not fall below 40 per cent until 2013, still a remarkable figure for one of the most industrialised regions of the world. Institutionally, the bureaucracy is flabby; financial control is weak; decision-making is ponderous; the European Commission frequently locks horns with the European Council (the heads of government); and the persistent “democratic deficit” has exacerbated a popular sense of alienation.

Repeatedly, too, politics has trumped economics, particularly over the question of enlargement. In the 1980s the Nine became 12 in order to embrace three underdeveloped countries that had recently thrown off authoritarian regimes: Greece, Spain and Portugal. In the 1990s the euro was driven not just by the ambition of Jacques Delors but by the determination of François Mitterrand and Helmut Kohl to anchor the financial and industrial power of a unified Germany firmly in European structures – updating, if you like, Schuman’s vision. And since 2000, the EU has welcomed in from the Cold (War) those countries of eastern Europe that were anxious to escape the Russian bear hug. All these politically inspired moves have come at an economic price. To be sure, the EU28 is far more truly “European” geographically, but the original Six (apart from southern Italy) had a coherence as developed economies and functioning democracies that today’s mixed bag of members conspicuously lacks. Yet the EU project has continued to be animated by aspirations for close economic and political union that date from the 1950s.

***

Sixty is a ripe age. Many institutions do not survive that long and the EU (like Nato, founded in 1949) is painfully aware of the need to think imaginatively about its form and direction. The “Future of Europe” was firmly on the menu even at the Rome birthday party. On 29 March 2017 the UK, by contrast, began Year Zero – reborn into a brave new, Britain-shaped world, if you believe the Prime Minister; tumbling into the abyss, if you heed remaindered Remainers. Now Old EU@60 is about to meet New UK@0 for a long and bruising battle.

The stakes are high on both sides. Brussels is in no mood to let Britain off lightly: an easy exit would encourage other waverers and jeopardise the whole European project. Across the Channel, if May puts politics before economics (“control” of borders over access to the single market) her hard nationalism could alienate Scotland, undermine the Irish settlement, rupture the United Kingdom and end in no deal. A “full English” Brexit might prove very expensive.

The tabloids will doubtless report it as a replay of 1940 and “Our Finest Hour”: an earlier Brexit moment. Attentive as ever to them, May has embraced the description of herself as a “bloody difficult woman” who is eager to “fight for Britain”, in Churchill-Thatcher mode. Is her snap election intended to pave the way for a hard, nativist Brexit? Or does she just hope that a bigger majority will give her more room for manoeuvre in battling Brussels? No one knows, probably not even May herself. Current negotiating strategies, like battle plans, will not survive the first encounter with “the enemy”.

That is why it is important, amid the daily barrage of spin, sneers and aggro, to keep the bigger historical picture in mind. Because we may be entering the twilight of what can be called the postwar era, which began in the decade after 1945, when the horrors of belligerent nationalism prompted a fervent effort to make peace and build truly international institutions. The UN, Nato and the EEC were all products of that creative moment; likewise the General Agreement on Tariffs and Trade and the Universal Declaration of Human Rights.

This fabric of postwar internationalism is now ageing and strained – often in need of radical modification – yet in a world where nationalism, protectionism and racism are on the rise, it provides some flimsy protection against the law of the jungle. If Brexit is handled belligerently, it could help to pull the threads from that thin tissue of coexistence and co-operation.

Our leaders show little awareness of what is at stake historically. According to US Vogue’s recent interview with Theresa May, “She says she doesn’t read much history and tries not to picture how things will be in advance.” Jeremy Corbyn seems to live in an ideological time warp of his own. Boris Johnson does have historical sensitivity, but of a typically self-serving sort: see his entertaining little (auto)biography of Churchill.

This Brexit election is just an early milestone on a long and painful road. It took the UK more than 11 years to get from first applying to the EEC to actually joining it. It may take as long to complete a full, legally watertight exit from the EU. Certainly, for the next few years, at a time when so many global problems are crying out for creative policymaking, the EU and the UK will confront each other obsessively to the exclusion of almost everything else. A dysfunctional union and a disunited kingdom – each captivated by its contrasting past – will struggle and muddle towards divergent futures.

David Reynolds’s books include “Britannia Overruled” (Routledge) and “The Long Shadow: the Great War and the 20th Century” (Simon & Schuster)

This article first appeared in the 25 May 2017 issue of the New Statesman, Why Islamic State targets Britain