
A thinker for our times

Global leaders are once again reminding themselves of the insights of the Cambridge academic who helped save capitalism

John Maynard Keynes has been restored to life. Rusty Keynesian tools – larger budget deficits, tax cuts, accelerated spending programmes and other “economic stimuli” – have been brought back into use the world over to cut off the slide into depression. And they will do the job, if not next year, the year after. But the first Keynesian revolution was not about a rescue operation. Its purpose was to explain how shipwreck might occur; in short, to provide a theoretical basis for better navigation and for steering in seas that were bound to be choppy. Yet, even while the rescue operation is going on, we need to look critically at the economic theory that takes his name.

In his great work The General Theory of Employment, Interest and Money, written during the Great Depression of the 1930s, Keynes said of his ideas that they were "extremely simple, and should be obvious". Market economies were inherently volatile, because uncertainty about future events was inescapable. Booms were liable to lead to catastrophic collapses followed by long periods of stagnation. Governments had a vital role to play in stabilising market economies. If they did not, the undoubted benefit of markets would be lost and political space would open up for extremists who would offer to solve economic problems by abolishing both markets and liberty. This, in a nutshell, was the Keynesian "political economy".

These ideas were a challenge to the dominant economic models of the day, which held that, in the absence of noxious government interference, market economies were naturally stable at full employment. Trading in all markets would always take place at the "right" prices – prices that would "clear the market". This being so, booms and slumps, and prolonged unemployment, could not be generated by the market system itself. If they did happen, it was due to "external shocks". There were many attempts to explain the Great Depression of the 1930s along these lines – as a result of the dislocations of the First World War, of the growth of trade union power to prevent wages falling, and so on. But Keynes rightly regarded such explanations as self-serving. The Great Depression started in the United States, not in war-torn Europe, and in the most lightly regulated, most self-contained, and least unionised, market economy of the world. What were the "external shocks" that caused the Dow Jones Index to fall from 381 to 41 between 1929 and 1932, American output to drop by 20 per cent and unemployment to rise to 25 per cent?

He set out to save capitalism, a system he did not much admire, because he thought it the best hope for the future of civilisation

We can ask exactly the same question today as the world economy slides downwards. The present economic crisis has been generated by a banking system that had been extensively deregulated and in a flexible, largely non-unionised, economy. Indeed, the American capitalism of the past 15 years strongly resembles the capitalism of the 1920s in general character. To Keynes, it seemed obvious that large instabilities were inherent in market processes themselves.

***

John Maynard Keynes was a product of Cambridge civilisation at its most fertile. He was born in 1883 into an academic family, and his circle included not just the most famous philosophers of the day – G E Moore, Bertrand Russell and Ludwig Wittgenstein – but also that exotic offshoot of Cambridge, the Bloomsbury Group, a commune of writers and painters with whom he formed his closest friendships. Keynes was caught up in the intellectual ferment and sexual awakening that marked the passage from Victorian to Edwardian England. At the same time, he had a highly practical bent: he was a supreme example of what Alasdair MacIntyre calls “the aesthete manager”, who partitions his life between the pleasures of the mind and the senses and the management of public affairs. After the First World War, Keynes set out to save a capitalist system he did not particularly admire. He did so because he thought it was the best guarantor of the possibility of civilisation. But he was always quite clear that the pursuit of wealth was a means, not an end. He did not much admire economics, either, hoping that some day economists would become as useful as dentists.

All of this made him, as his wife put it, "more than an economist". In fact, he was the most brilliant non-economist who ever applied himself to the study of economics. In this lay both his greatness and his vulnerability. He imposed himself on his profession by a series of profound insights into human behaviour which fitted the turbulence of his times. But these were never – could never be – properly integrated into the core of his discipline, which spewed them out as soon as it conveniently could. He died of heart failure in 1946, having worked himself to death in the service of his country.

The economic theory of Keynes's day, which precluded boom-bust sequences, seemed patently contrary to experience, yet its foundations were so deep-dug, its defences so secure, its reasoning so compelling, that it took Keynes three big books – including a two-volume Treatise on Money – to see how it might be cracked. His attempt to do so was the most heroic intellectual enterprise of the 20th century. It was nothing less than the attempt to overturn the dominant economic paradigm dating from Adam Smith and David Ricardo.

He finally said what he wanted to say in the preface to The General Theory: "A monetary economy, we shall find, is one in which changing views about the future are capable of influencing the quantity of employment and not merely its direction." In that pregnant sentence is the whole of the Keynesian revolution.

Keynes's understanding of how economies work was rooted in his theory of knowledge. Because the future was unknowable, disaster was always possible. This is not to say that he thought the future wholly unknowable. Not only can we calculate the probability of winning the Lottery, but we can forecast with tolerable accuracy the price movements of consumer goods over a short period. Yet we "simply do not know" what the price of oil will be in ten, or even five, years' time. Investments which promised returns "at a comparatively distant, and sometimes an indefinitely distant, date" were acts of faith, gambles on the unknown. And in that fact lay the possibility of huge mistakes.

Classical economists could not deny the possibility of unpredictable events. Inventions are by their nature unpredictable, especially as to timing, and many business cycle theorists saw them as generating boom-bust cycles. But mainstream economics, nevertheless, "abstracted" from such disturbances. The technique by which it did so is fascinatingly brought out in an argument about economic method between two 19th-century economists, which Keynes cited as a fork in the road. In 1817, Ricardo wrote to his friend Thomas Malthus: "It appears to me that one great cause of our differences . . . is that you have always in your mind the immediate and temporary effects of particular changes, whereas I put these immediate and temporary effects quite aside, and fix my whole attention on the permanent state of things which will result from them."

To this, Malthus replied: "I certainly am disposed to refer frequently to things as they are, as the only way of making one's writing practically useful to society . . . Besides I really do think that the progress of society consists of irregular movements, and that to omit the consideration of causes which for eight or ten years will give a great stimulus to production and population or a great check to them is to omit the causes of the wealth and poverty of nations . . ."

Keynes sided with Malthus. He regarded the timeless equilibrium method pioneered by Ricardo as the great wrong turning in economics. It was surely the Ricardo-Malthus exchange he had in mind when writing his best-known aphorism: "But this long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is long past the ocean is flat again."

Ricardo may have thought of the "long run" as the length of time it took storms to disperse. But under the influence of mathematics, economists abandoned the notion of time itself, and therefore of the distinction between the long run and the short run. By Keynes's time, "risks", as he put it, "were supposed to be capable of an exact actuarial computation". If all risks could be measured they could be known in advance. So the future could be reduced to the same epistemological status as the present. Prices would always reflect objective probabilities. This amounted to saying that unregulated market economies would generally be extremely stable. Only very clever people, equipped with adequate mathematics, could believe in anything quite so absurd. Under the influence of this theory, governments withdrew from active management and regulation of economic life: it was the age of laissez-faire.

Keynes commented: "The extraordinary achievement of the classical theory was to overcome the beliefs of the 'natural man' and, at the same time, to be wrong." It was wrong because it "attempts to apply highly precise and mathematical methods to material which is itself much too vague to support such treatment".

Keynes did not believe that "natural man" was irrational. The question he asked was: how do we, as rational investors, behave when we – unlike economists – know that the future is uncertain, or, in economist-speak, know that we are "informationally deprived"? His answer was that we adopt certain "conventions": we assume that the future will be more like the past than experience would justify, that existing opinion as expressed in current prices correctly sums up future prospects, and we copy what everyone else is doing. (As he once put it: "Bankers prefer to be ruined in a conventional way.") But any view of the future based on "so flimsy a foundation" is liable to "sudden and violent changes" when the news changes. "The practice of calmness and immobility, of certainty and security suddenly breaks down. New fears and hopes will, without warning, take charge of human conduct . . . the market will be subject to waves of optimistic and pessimistic sentiment, which are unreasoning yet in a sense legitimate where no solid basis exists for a reasonable calculation."

***

But what is rational for individuals is catastrophic for the economy. Subnormal activity is possible because, in times of crisis, money carries a liquidity premium. This increased "propensity to hoard" is decisive in preventing a quick enough fall in interest rates. The mainstream economics of Keynes's day viewed the interest rate (more accurately, the structure of interest rates) as the price that balances the overall supply of saving with the demand for investment. If the desire to save more went up, interest rates would automatically fall; if the desire to save fell, they would rise. This continual balancing act was what made the market economy self-adjusting. Keynes, on the other hand, saw the interest rate as the "premium" for parting with money. Pessimistic views of the future would raise the price for parting with money, even though the supply of saving was increasing and the demand for investment was falling. Keynes's "liquidity preference theory of the rate of interest" was the main reason he gave for his claim that market economies were not automatically self-correcting. Uncertainty was what ruined the classical scheme.

The same uncertainty made monetary policy a dubious agent of recovery. Even a "cheap money" policy by the central bank might not be enough to halt the slide into depression if the public's desire to hoard money was going up at the same time. Even if you provide the water, you can't force a horse to drink. This was Keynes's main argument for the use of fiscal policy to fight a depression. There is only one sure way to get an increase in spending in the face of falling confidence and that is for the government to spend the money itself.

This, in essence, was the Keynesian revolution. Keynesian economics dominated policymaking in the 25 years or so after the Second World War. The free-market ideologists gave this period such a bad press that we forget how successful it was. Even slow-growing Britain chugged along at between 2 and 3 per cent per capita income growth from 1950 to 1973 without serious interruption, and the rest of the world, developed and developing, grew quite a bit faster. But an intellectual and ideological rebellion against Keynesian economics was gathering force. It finally got its chance to restore economics to its old tramlines with the rise of inflation from the late 1960s onwards – something which had less to do with Keynesian policy than with the Vietnam War. The truth was that "scientific" economics could not live with the idea of an unpredictable world. So, rather than admit that it could not be a "hard" science like physics, it set out to abolish uncertainty.
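A back-of-the-envelope calculation shows what growth in that range compounds to over the 23 years in question; the figures below merely illustrate the arithmetic, not any data from the article:

```python
# What steady per capita growth of 2 or 3 per cent a year amounts to
# when compounded over 1950-73 (23 years). Illustrative arithmetic only.
years = 23
for rate in (0.02, 0.03):
    multiple = (1 + rate) ** years
    print(f"{rate:.0%} a year for {years} years -> incomes multiply by {multiple:.2f}")
```

Even the "slow-growing" end of that range implies incomes roughly 1.6 times higher by 1973; the upper end implies they nearly double.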

The "new" classical economists hit on a weak spot in Keynesian theory. The view that a large part of the future was unknowable seemed to leave out learning from experience or making efficient use of available information. Rational agents went on making the same mistakes. It seemed more reasonable to assume that recurrent events would initiate a learning process, causing agents to be less often surprised by events. This would make economies more stable.

The attack on Keynes's "uncertain" expectations developed from the 1960s onwards, from the "adaptive" expectations of Milton Friedman to the "rational" expectations of Robert Lucas and others. The development of Bayesian statistics and Bayesian decision-theory suggested that agents can always be modelled as having prior probability distributions over events – distributions that are updated by evidence.
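The Bayesian picture sketched here – an agent with a prior over states of the world, revised as evidence arrives – can be shown in a few lines of Python. The states and numbers are purely illustrative assumptions, not anything from the article:

```python
# Minimal sketch of Bayesian updating: a prior over states of the world
# is revised in the light of evidence via Bayes's rule.

def bayes_update(prior, likelihoods):
    """Return the posterior distribution over states, given a prior
    and the likelihood of the observed evidence under each state."""
    unnormalised = {s: prior[s] * likelihoods[s] for s in prior}
    total = sum(unnormalised.values())
    return {s: p / total for s, p in unnormalised.items()}

# Assumed prior: the agent thinks "boom" and "slump" equally likely.
prior = {"boom": 0.5, "slump": 0.5}
# Assumed evidence: weak output figures, three times as likely
# under a slump as under a boom.
likelihoods = {"boom": 0.2, "slump": 0.6}

posterior = bayes_update(prior, likelihoods)
print(posterior)  # probability of "slump" rises from 0.5 to 0.75
```

On this view, recurrent surprises simply feed the updating process, so agents are "less often surprised"; Keynes's point was that for genuinely uncertain events no such prior distribution exists to be updated.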

***

Today, the idea of radical uncertainty, though ardently championed by “post-Keynesians” such as Paul Davidson and supported by financiers of an intellectual bent such as George Soros, has little currency in mainstream economics. Uncertainty has once more become “risk”, and risk can always be measured, managed, hedged and spread. This underlies the “efficient market hypothesis” – the idea that financial assets and their derivatives are always correctly priced. Its acceptance explains the explosion of leveraged finance since the 1980s. The efficient market hypothesis has a further implication. If the market always prices financial assets correctly, the “real” economy – the one involved in the production of goods and non-financial services – will be as stable as the financial sector. Keynes’s idea that “changing views about the future are capable of influencing the quantity of employment” became a discarded heresy.

And yet the questions remain. Is the present crisis a once-in-a-lifetime event, against which it would be as absurd to guard as an earthquake, or is it an ever-present possibility? Do large "surprises" get instantly diffused through the price system or do their effects linger on like toxic waste, preventing full recovery? There are also questions about the present system that Keynes hardly considered. For instance: are some structures of the economy more conducive to macroeconomic stability than others?

This is the terrain of Karl Marx and the underconsumptionist theorists. There is a long tradition, recently revived, which argues that the more unequal the distribution of income, the more unstable an economy will be. Certainly globalisation has shifted GDP shares from wages to profits. In the underconsumptionist tradition, this leads to overinvestment. The explosion of debt finance can be interpreted as a way of postponing the "crisis of realisation".

Keynes did not have a complete answer to the problems we are facing once again. But, like all great thinkers, he leaves us with ideas which compel us to rethink our situation. In the long run, he deserves to ride again.

Lord Skidelsky is the author of "John Maynard Keynes" (three volumes), published in hardback by Macmillan

This article first appeared in the 22 December 2008 issue of the New Statesman, Christmas and New Year special


David Cameron's fatal insouciance

Will future historians remember the former prime minister for anything more than his great Brexit bungle?

On 13 July 2016, after a premiership lasting six years and 63 days, David Cameron left Downing Street for the last time. On the tarmac outside the black door, with his wife and children at his side, he gave a characteristically cool and polished parting statement. Then he got in his car for the last journey to Buckingham Palace – the picture, as ever, of insouciant ease. As I was watching the television pictures of Cameron’s car gliding away, I remembered what he is supposed to have said some years earlier, when asked why he wanted to be prime minister. True or not, his answer perfectly captured the public image of the man: “Because I think I’d be rather good at it.”

A few moments later, a friend sent me a text message. It was just six words long: “He’s down there with Chamberlain now.”

At first I thought that was a bit harsh. People will probably always disagree about Cameron’s economic record, just as they do about Margaret Thatcher’s. But at the very least it was nowhere near as bad as some of his critics had predicted, and by some standards – jobs created, for instance – it was much better than many observers had expected. His government’s welfare and education policies have their critics, but it seems highly unlikely that people will still be talking about them in a few decades’ time. Similarly, although Britain’s intervention in Libya is unlikely to win high marks from historians, it never approached the disaster of Iraq in the public imagination.

Cameron will probably score highly for his introduction of gay marriage, and although there are many people who dislike him, polls suggested that most voters regarded him as a competent, cheerful and plausible occupant of the highest office in the land. To put it another way, from the day he entered 10 Downing Street until the moment he left, he always looked prime ministerial. It is true that he left office as a loser, humiliated by the EU referendum, and yet, on the day he departed, the polls had him comfortably ahead of his Labour opposite number. He was, in short, popular.

On the other hand, a lot of people liked Neville Chamberlain, too. Like Chamberlain, Cameron seems destined to be remembered for only one thing. When students answer exam questions about Chamberlain, it’s a safe bet that they aren’t writing about the Holidays with Pay Act 1938. And when students write about Cameron in the year 2066, they won’t be answering questions about intervention in Libya, or gay marriage. They will be writing about Brexit and the lost referendum.

It is, of course, conceivable, though surely very unlikely, that Brexit will be plain sailing. But it is very possible that it will be bitter, protracted and enormously expensive. Indeed, it is perfectly conceivable that by the tenth anniversary of the referendum, the United Kingdom could be reduced to an English and Welsh rump, struggling to come to terms with a punitive European trade deal and casting resentful glances at a newly independent Scotland. Of course the Brexiteers – Nigel Farage, Boris Johnson, Michael Gove, Daniel Hannan et al – would get most of the blame in the short run. But in the long run, would any of them really be remembered? Much more likely is that historians’ fingers would point at one man: Cameron, the leader of the Conservative and Unionist Party, the prime minister who gambled with his future and lost the Union. The book by “Cato” that destroyed Chamberlain’s reputation in July 1940 was entitled Guilty Men. How long would it be, I wonder, before somebody brought out a book about Cameron, entitled Guilty Man?

Naturally, all this may prove far too pessimistic. My own suspicion is that Brexit will turn out to be a typically European – or, if you prefer, a typically British – fudge. And if the past few weeks’ polls are anything to go by, Scottish independence remains far from certain. So, in a less apocalyptic scenario, how would posterity remember David Cameron? As a historic failure and “appalling bungler”, as one Guardian writer called him? Or as a “great prime minister”, as Theresa May claimed on the steps of No 10?

Neither. The answer, I think, is that it would not remember him at all.

***

The late Roy Jenkins, who – as Herbert Asquith’s biographer, Harold Wilson’s chancellor and Jim Callaghan’s rival – was passionately interested in such things, used to write of a “market” in prime ministerial futures. “Buy Attlee!” he might say. “Sell Macmillan!” But much of this strikes me as nonsense. For one thing, political reputations fluctuate much less than we think. Many people’s views of, say, Wilson, Thatcher and Blair have remained unchanged since the day they left office. Over time, reputations do not change so much as fade. Academics remember prime ministers; so do political anoraks and some politicians; but most people soon forget they ever existed. There are 53 past prime ministers of the United Kingdom, but who now remembers most of them? Outside the university common room, who cares about the Marquess of Rockingham, the Earl of Derby, Lord John Russell, or Arthur Balfour? For that matter, who cares about Asquith or Wilson? If you stopped people in the streets of Sunderland, how many of them would have heard of Stanley Baldwin or Harold Macmillan? And even if they had, how much would they really know about them?

In any case, what does it mean to be a success or a failure as prime minister? How on Earth can you measure Cameron’s achievements, or lack of them? We all have our favourites and our prejudices, but how do you turn that into something more dispassionate? To give a striking example, Margaret Thatcher never won more than 43.9 per cent of the vote, was roundly hated by much of the rest of the country and was burned in effigy when she died, long after her time in office had passed into history. Having come to power promising to revive the economy and get Britain working again, she contrived to send unemployment well over three million, presided over the collapse of much of British manufacturing and left office with the economy poised to plunge into yet another recession. So, in that sense, she looks a failure.

Yet at the same time she won three consecutive general elections, regained the Falklands from Argentina, pushed through bold reforms to Britain’s institutions and fundamentally recast the terms of political debate for a generation to come. In that sense, clearly she was a success. How do you reconcile those two positions? How can you possibly avoid yielding to personal prejudice? How, in fact, can you reach any vaguely objective verdict at all?

It is striking that, although we readily discuss politicians in terms of success and failure, we rarely think about what that means. In some walks of life, the standard for success seems obvious. Take the other “impossible job” that the tabloids love to compare with serving as prime minister: managing the England football team. You can measure a football manager’s success by trophies won, qualifications gained, even points accrued per game, just as you can judge a chief executive’s performance in terms of sales, profits and share values.

There is no equivalent for prime ministerial leadership. Election victories? That would make Clement Attlee a failure: he fought five elections and won only two. It would make Winston Churchill a failure, too: he fought three elections and won only one. Economic growth? Often that has very little to do with the man or woman at the top. Opinion polls? There’s more to success than popularity, surely. Wars? Really?

The ambiguity of the question has never stopped people trying. There is even a Wikipedia page devoted to “Historical rankings of Prime Ministers of the United Kingdom”, which incorporates two surveys of academics carried out by the University of Leeds, a BBC Radio 4 poll of Westminster commentators, a feature by BBC History Magazine and an online poll organised by Newsnight. By and large, there is a clear pattern. Among 20th-century leaders, there are four clear “successes” – Lloyd George, Churchill, Attlee and Thatcher – with the likes of Macmillan, Wilson and Heath scrapping for mid-table places. At the bottom, too, the same names come up again and again: Balfour, Chamberlain, Eden, Douglas-Home and Major. But some of these polls are quite old, dating back to the Blair years. My guess is that if they were conducted today, Major might rise a little, especially after the success of Team GB at the Olympics, and Gordon Brown might find himself becalmed somewhere towards the bottom.

***

So what makes the failures, well, failures? In two cases, the answer is simply electoral defeat. Both Arthur Balfour and John Major were doomed to failure from the moment they took office, precisely because they had been picked from within the governing party to replace strong, assertive and electorally successful leaders in Lord Salisbury and Margaret Thatcher, respectively. It’s true that Major unexpectedly won the 1992 election, but in both cases there was an atmosphere of fin de régime from the very beginning. Douglas-Home probably fits into this category, too, coming as he did at the fag end of 13 years of Conservative rule. Contrary to political mythology, he was in fact a perfectly competent prime minister, and came much closer to winning the 1964 election than many people had expected. But he wasn’t around for long and never really captured the public mood. It seems harsh merely to dismiss him as a failure, but politics is a harsh business.

That leaves two: Chamberlain and Eden. Undisputed failures, who presided over the greatest foreign policy calamities in our modern history. Nothing to say, then? Not so. Take Chamberlain first. More than any other individual in our modern history, he has become a byword for weakness, naivety and self-deluding folly.

Yet much of this picture is wrong. Chamberlain was not a weak or indecisive man. If anything, he was too strong: too stubborn, too self-confident. Today we remember him as a faintly ridiculous, backward-looking man, with his umbrella and wing collar. But many of his contemporaries saw him as a supremely modern administrator, a reforming minister of health and an authoritative chancellor who towered above his Conservative contemporaries. It was this impression of cool capability that secured Chamberlain the crown when Baldwin stepped down in 1937. Unfortunately, it was precisely his titanic self-belief, his unbreakable faith in his own competence, that also led him to overestimate his influence over Adolf Hitler. In other words, the very quality that people most admired – his stubborn confidence in his own ability – was precisely what doomed him.

In Chamberlain’s case, there is no doubt that he had lost much of his popular prestige by May 1940, when he stepped down as prime minister. Even though most of his own Conservative MPs still backed him – as most of Cameron’s MPs still backed him after the vote in favour of Brexit – the evidence of Mass Observation and other surveys suggests that he had lost support in the country at large, and his reputation soon dwindled to its present calamitous level.

The case of the other notable failure, Anthony Eden, is different. When he left office after the Suez crisis in January 1957, it was not because the public had deserted him, but because his health had collapsed. Surprising as it may seem, Eden was more popular after Suez than he had been before it. In other words, if the British people had had their way, Eden would probably have continued as prime minister. They did not see him as a failure at all.

Like Chamberlain, Eden is now generally regarded as a dud. Again, this may be a bit unfair. As his biographers have pointed out, he was a sick and exhausted man when he took office – the result of two disastrously botched operations on his gall bladder – and relied on a cocktail of painkillers and stimulants. Yet, to the voters who handed him a handsome general election victory in 1955, Eden seemed to have all the qualities to become an enormously successful prime minister: good looks, brains, charm and experience, like a slicker, cleverer and more seasoned version of Cameron. In particular, he was thought to have proved his courage in the late 1930s, when he had resigned as foreign secretary in protest at the appeasement of Benito Mussolini before becoming one of Churchill’s chief lieutenants.

Yet it was precisely Eden’s great asset – his reputation as a man who had opposed appeasement and stood up to the dictators – that became his weakness. In effect, he became trapped by his own legend. When the Egyptian dictator Gamal Abdel Nasser nationalised the Suez Canal in July 1956, Eden seemed unable to view it as anything other than a replay of the fascist land-grabs of the 1930s. Nasser was Mussolini; the canal was Abyssinia; failure to resist would be appeasement all over again. This was nonsense, really: Nasser was nothing like Mussolini. But Eden could not escape the shadow of his own political youth.

This phenomenon – a prime minister’s greatest strength gradually turning into his or her greatest weakness – is remarkably common. Harold Wilson’s nimble cleverness, Jim Callaghan’s cheerful unflappability, Margaret Thatcher’s restless urgency, John Major’s Pooterish normality, Tony Blair’s smooth charm, Gordon Brown’s rugged seriousness: all these things began as refreshing virtues but became big handicaps. So, in that sense, what happened to Chamberlain and Eden was merely an exaggerated version of what happens to every prime minister. Indeed, perhaps it is only pushing it a bit to suggest, echoing Enoch Powell, that all prime ministers, their human flaws inevitably amplified by the stresses of office, eventually end up as failures. In fact, it may not be too strong to suggest that in an age of 24-hour media scrutiny, surging populism and a general obsession with accountability, the very nature of the job invites failure.

***

In Cameron’s case, it would be easy to construct a narrative based on similar lines. Remember, after all, how he won the Tory leadership in the first place. He went into the 2005 party conference behind David Davis, the front-runner, but overhauled him after a smooth, fluent and funny speech, delivered without notes. That image of blithe nonchalance served him well at first, making for a stark contrast with the saturnine intensity and stumbling stiffness of his immediate predecessors, Michael Howard and Iain Duncan Smith. Yet in the end it was Cameron’s self-confidence that really did for him.

Future historians will probably be arguing for years to come whether he really needed to promise an In/Out referendum on the UK’s membership of the EU, as his defenders claim, to protect his flank against Ukip. What is not in doubt is that Cameron believed he could win it. It became a cliché to call him an “essay crisis” prime minister – a gibe that must have seemed meaningless to millions of people who never experienced the weekly rhythms of the Oxford tutorial system. And yet he never really managed to banish the impression of insouciance. The image of chillaxing Dave, the PM so cockily laidback that he left everything until the last minute, may be a caricature, but my guess is that it will stick.

As it happens, I think Cameron deserves more credit than his critics are prepared to give him. I think it would be easy to present him as a latter-day Baldwin – which I mean largely as a compliment. Like Baldwin, he was a rich provincial Tory who posed as an ordinary family man. Like Baldwin, he offered economic austerity during a period of extraordinary international financial turmoil. Like Baldwin, he governed in coalition while relentlessly squeezing the Liberal vote. Like Baldwin, he presented himself as the incarnation of solid, patriotic common sense; like Baldwin, he was cleverer than his critics thought; like Baldwin, he was often guilty of mind-boggling complacency. The difference is that when Baldwin gambled and lost – as when he called a rash general election in 1923 – he managed to save his career from the ruins. When Cameron gambled and lost, it was all over.

Although I voted Remain, I do not share many commentators’ view of Brexit as an apocalyptic disaster. In any case, given that a narrow majority of the electorate got the result it wanted, at least 17 million people presumably view Cameron’s gamble as a great success – for Britain, if not for him. Unfortunately for Cameron, however, most British academics are left-leaning Remainers, and it is they who will write the history books. What ought also to worry Cameron’s defenders – or his shareholders, to use Roy Jenkins’s metaphor – is that both Chamberlain and Eden ended up being defined by their handling of Britain’s foreign policy. There is a curious paradox here, because foreign affairs almost never matters at the ballot box. In 1959, barely three years after Suez, the Conservatives cruised to an easy re-election victory; in 2005, just two years after invading Iraq, when the extent of the disaster was already apparent, Blair won a similarly comfortable third term in office. Perhaps foreign affairs matters more to historians than it does to most voters. In any case, the lesson seems to be that, if you want to secure your historical reputation, you can get away with mishandling the economy and lengthening the dole queues, but you simply cannot afford to damage Britain’s international standing.

So, if Brexit does turn into a total disaster, Cameron can expect little quarter. Indeed, while historians have some sympathy for Chamberlain, who was, after all, motivated by a laudable desire to avoid war, and even for Eden, who was a sick and troubled man, they are unlikely to feel similar sympathy for an overconfident prime minister at the height of his powers, who seems to have brought his fate upon himself.

How much of this, I wonder, went through David Cameron’s mind in the small hours of that fateful morning of 24 June, as the results came through and his place in history began to take shape before his horrified eyes? He reportedly likes to read popular history for pleasure; he must occasionally have wondered how he would be remembered. But perhaps it meant less to him than we think. Most people give little thought to how they will be remembered after their death, except by their closest friends and family members. There is something insecure, something desperately needy, about people who dwell on their place in history.

Whatever you think about Cameron, he never struck me as somebody suffering from excessive insecurity. Indeed, his normality was one of the most likeable things about him.

He must have been deeply hurt by his failure. But my guess is that, even as his car rolled away from 10 Downing Street for the last time, his mind was already moving on to other things. Most prime ministers leave office bitter, obsessive and brooding. But, like Stanley Baldwin, Cameron strolled away from the job as calmly as he had strolled into it. It was that fatal insouciance that brought him down. 

Dominic Sandbrook is a historian, broadcaster and columnist for the Daily Mail. His book The Great British Dream Factory will be published in paperback by Penguin on 1 September


This article first appeared in the 25 August 2016 issue of the New Statesman, Cameron: the legacy of a loser