
How will history treat David Cameron?

Will future historians remember the former prime minister for anything more than his great Brexit bungle?

On 13 July 2016, after a premiership lasting six years and 63 days, David Cameron left Downing Street for the last time. On the tarmac outside the black door, with his wife and children at his side, he gave a characteristically cool and polished parting statement. Then he got in his car for the last journey to Buckingham Palace – the picture, as ever, of insouciant ease. As I was watching the television pictures of Cameron’s car gliding away, I remembered what he is supposed to have said some years earlier, when asked why he wanted to be prime minister. True or not, his answer perfectly captured the public image of the man: “Because I think I’d be rather good at it.”

A few moments later, a friend sent me a text message. It was just six words long: “He’s down there with Chamberlain now.”

At first I thought that was a bit harsh. People will probably always disagree about Cameron’s economic record, just as they do about Margaret Thatcher’s. But at the very least it was nowhere near as bad as some of his critics had predicted, and by some standards – jobs created, for instance – it was much better than many observers had expected. His government’s welfare and education policies have their critics, but it seems highly unlikely that people will still be talking about them in a few decades’ time. Similarly, although Britain’s intervention in Libya is unlikely to win high marks from historians, it never approached the disaster of Iraq in the public imagination.

Cameron will probably score highly for his introduction of gay marriage, and although there are many people who dislike him, polls suggested that most voters regarded him as a competent, cheerful and plausible occupant of the highest office in the land. To put it another way, from the day he entered 10 Downing Street until the moment he left, he always looked prime ministerial. It is true that he left office as a loser, humiliated by the EU referendum, and yet, on the day he departed, the polls had him comfortably ahead of his Labour opposite number. He was, in short, popular.

On the other hand, a lot of people liked Neville Chamberlain, too. Like Chamberlain, Cameron seems destined to be remembered for only one thing. When students answer exam questions about Chamberlain, it’s a safe bet that they aren’t writing about the Holidays with Pay Act 1938. And when students write about Cameron in the year 2066, they won’t be answering questions about intervention in Libya, or gay marriage. They will be writing about Brexit and the lost referendum.

It is, of course, conceivable, though surely very unlikely, that Brexit will be plain sailing. But it is very possible that it will be bitter, protracted and enormously expensive. Indeed, it is perfectly conceivable that by the tenth anniversary of the referendum, the United Kingdom could be reduced to an English and Welsh rump, struggling to come to terms with a punitive European trade deal and casting resentful glances at a newly independent Scotland. Of course the Brexiteers – Nigel Farage, Boris Johnson, Michael Gove, Daniel Hannan et al – would get most of the blame in the short run. But in the long run, would any of them really be remembered? Much more likely is that historians’ fingers would point at one man: Cameron, the leader of the Conservative and Unionist Party, the prime minister who gambled with his future and lost the Union. The book by “Cato” that destroyed Chamberlain’s reputation in July 1940 was entitled Guilty Men. How long would it be, I wonder, before somebody brought out a book about Cameron, entitled Guilty Man?

Naturally, all this may prove far too pessimistic. My own suspicion is that Brexit will turn out to be a typically European – or, if you prefer, a typically British – fudge. And if the past few weeks’ polls are anything to go by, Scottish independence remains far from certain. So, in a less apocalyptic scenario, how would posterity remember David Cameron? As a historic failure and “appalling bungler”, as one Guardian writer called him? Or as a “great prime minister”, as Theresa May claimed on the steps of No 10?

Neither. The answer, I think, is that it would not remember him at all.

***

The late Roy Jenkins, who – as Herbert Asquith’s biographer, Harold Wilson’s chancellor and Jim Callaghan’s rival – was passionately interested in such things, used to write of a “market” in prime ministerial futures. “Buy Attlee!” he might say. “Sell Macmillan!” But much of this strikes me as nonsense. For one thing, political reputations fluctuate much less than we think. Many people’s views of, say, Wilson, Thatcher and Blair have remained unchanged since the day they left office. Over time, reputations do not change so much as fade. Academics remember prime ministers; so do political anoraks and some politicians; but most people soon forget they ever existed. There are 53 past prime ministers of the United Kingdom, but who now remembers most of them? Outside the university common room, who cares about the Marquess of Rockingham, the Earl of Derby, Lord John Russell, or Arthur Balfour? For that matter, who cares about Asquith or Wilson? If you stopped people in the streets of Sunderland, how many of them would have heard of Stanley Baldwin or Harold Macmillan? And even if they had, how much would they really know about them?

In any case, what does it mean to be a success or a failure as prime minister? How on Earth can you measure Cameron’s achievements, or lack of them? We all have our favourites and our prejudices, but how do you turn that into something more dispassionate? To give a striking example, Margaret Thatcher never won more than 43.9 per cent of the vote, was roundly hated by much of the rest of the country and was burned in effigy when she died, long after her time in office had passed into history. Having come to power promising to revive the economy and get Britain working again, she contrived to send unemployment well over three million, presided over the collapse of much of British manufacturing and left office with the economy poised to plunge into yet another recession. So, in that sense, she looks a failure.

Yet at the same time she won three consecutive general elections, regained the Falklands from Argentina, pushed through bold reforms to Britain’s institutions and fundamentally recast the terms of political debate for a generation to come. In that sense, clearly she was a success. How do you reconcile those two positions? How can you possibly avoid yielding to personal prejudice? How, in fact, can you reach any vaguely objective verdict at all?

It is striking that, although we readily discuss politicians in terms of success and failure, we rarely think about what that means. In some walks of life, the standard for success seems obvious. Take the other “impossible job” that the tabloids love to compare with serving as prime minister: managing the England football team. You can measure a football manager’s success by trophies won, qualifications gained, even points accrued per game, just as you can judge a chief executive’s performance in terms of sales, profits and share values.

There is no equivalent for prime ministerial leadership. Election victories? That would make Clement Attlee a failure: he fought five elections and won only two. It would make Winston Churchill a failure, too: he fought three elections and won only one. Economic growth? Often that has very little to do with the man or woman at the top. Opinion polls? There’s more to success than popularity, surely. Wars? Really?

The ambiguity of the question has never stopped people trying. There is even a Wikipedia page devoted to “Historical rankings of Prime Ministers of the United Kingdom”, which incorporates two surveys of academics carried out by the University of Leeds, a BBC Radio 4 poll of Westminster commentators, a feature by BBC History Magazine and an online poll organised by Newsnight. By and large, there is a clear pattern. Among 20th-century leaders, there are four clear “successes” – Lloyd George, Churchill, Attlee and Thatcher – with the likes of Macmillan, Wilson and Heath scrapping for mid-table places. At the bottom, too, the same names come up again and again: Balfour, Chamberlain, Eden, Douglas-Home and Major. But some of these polls are quite old, dating back to the Blair years. My guess is that if they were conducted today, Major might rise a little, especially after the success of Team GB at the Olympics, and Gordon Brown might find himself becalmed somewhere towards the bottom.

***

So what makes the failures, well, failures? In two cases, the answer is simply electoral defeat. Both Arthur Balfour and John Major were doomed to failure from the moment they took office, precisely because they had been picked from within the governing party to replace strong, assertive and electorally successful leaders in Lord Salisbury and Margaret Thatcher, respectively. It’s true that Major unexpectedly won the 1992 election, but in both cases there was an atmosphere of fin de régime from the very beginning. Douglas-Home probably fits into this category, too, coming as he did at the fag end of 13 years of Conservative rule. Contrary to political mythology, he was in fact a perfectly competent prime minister, and came much closer to winning the 1964 election than many people had expected. But he wasn’t around for long and never really captured the public mood. It seems harsh merely to dismiss him as a failure, but politics is a harsh business.

That leaves two: Chamberlain and Eden. Undisputed failures, who presided over the greatest foreign policy calamities in our modern history. Nothing to say, then? Not so. Take Chamberlain first. More than any other individual in our modern history, he has become a byword for weakness, naivety and self-deluding folly.

Yet much of this picture is wrong. Chamberlain was not a weak or indecisive man. If anything, he was too strong: too stubborn, too self-confident. Today we remember him as a faintly ridiculous, backward-looking man, with his umbrella and wing collar. But many of his contemporaries saw him as a supremely modern administrator, a reforming minister of health and an authoritative chancellor who towered above his Conservative contemporaries. It was this impression of cool capability that secured Chamberlain the crown when Baldwin stepped down in 1937. Unfortunately, it was precisely his titanic self-belief, his unbreakable faith in his own competence, that also led him to overestimate his influence over Adolf Hitler. In other words, the very quality that people most admired – his stubborn confidence in his own ability – was precisely what doomed him.

In Chamberlain’s case, there is no doubt that he had lost much of his popular prestige by May 1940, when he stepped down as prime minister. Even though most of his own Conservative MPs still backed him – as most of Cameron’s MPs still backed him after the vote in favour of Brexit – the evidence of Mass Observation and other surveys suggests that he had lost support in the country at large, and his reputation soon dwindled to its present calamitous level.

The case of the other notable failure, Anthony Eden, is different. When he left office after the Suez crisis in January 1957, it was not because the public had deserted him, but because his health had collapsed. Surprising as it may seem, Eden was more popular after Suez than he had been before it. In other words, if the British people had had their way, Eden would probably have continued as prime minister. They did not see him as a failure at all.

Like Chamberlain, Eden is now generally regarded as a dud. Again, this may be a bit unfair. As his biographers have pointed out, he was a sick and exhausted man when he took office – the result of two disastrously botched operations on his gall bladder – and relied on a cocktail of painkillers and stimulants. Yet, to the voters who handed him a handsome general election victory in 1955, Eden seemed to have all the qualities to become an enormously successful prime minister: good looks, brains, charm and experience, like a slicker, cleverer and more seasoned version of Cameron. In particular, he was thought to have proved his courage in the late 1930s, when he had resigned as foreign secretary in protest at the appeasement of Benito Mussolini before becoming one of Churchill’s chief lieutenants.

Yet it was precisely Eden’s great asset – his reputation as a man who had opposed appeasement and stood up to the dictators – that became his weakness. In effect, he became trapped by his own legend. When the Egyptian dictator Gamal Abdel Nasser nationalised the Suez Canal in July 1956, Eden seemed unable to view it as anything other than a replay of the fascist land-grabs of the 1930s. Nasser was Mussolini; the canal was Abyssinia; failure to resist would be appeasement all over again. This was nonsense, really: Nasser was nothing like Mussolini. But Eden could not escape the shadow of his own political youth.

This phenomenon – a prime minister’s greatest strength gradually turning into his or her greatest weakness – is remarkably common. Harold Wilson’s nimble cleverness, Jim Callaghan’s cheerful unflappability, Margaret Thatcher’s restless urgency, John Major’s Pooterish normality, Tony Blair’s smooth charm, Gordon Brown’s rugged seriousness: all these things began as refreshing virtues but became big handicaps. So, in that sense, what happened to Chamberlain and Eden was merely an exaggerated version of what happens to every prime minister. Indeed, perhaps it is only pushing it a bit to suggest, echoing Enoch Powell, that all prime ministers, their human flaws inevitably amplified by the stresses of office, eventually end up as failures. In fact, it may not be too strong to suggest that in an age of 24-hour media scrutiny, surging populism and a general obsession with accountability, the very nature of the job invites failure.

***

In Cameron’s case, it would be easy to construct a narrative based on similar lines. Remember, after all, how he won the Tory leadership in the first place. He went into the 2005 party conference behind David Davis, the front-runner, but overhauled him after a smooth, fluent and funny speech, delivered without notes. That image of blithe nonchalance served him well at first, making for a stark contrast with the saturnine intensity and stumbling stiffness of his immediate predecessors, Michael Howard and Iain Duncan Smith. Yet in the end it was Cameron’s self-confidence that really did for him.

Future historians will probably be arguing for years to come whether he really needed to promise an In/Out referendum on the UK’s membership of the EU, as his defenders claim, to protect his flank against Ukip. What is not in doubt is that Cameron believed he could win it. It became a cliché to call him an “essay crisis” prime minister – a gibe that must have seemed meaningless to millions of people who never experienced the weekly rhythms of the Oxford tutorial system. And yet he never really managed to banish the impression of insouciance. The image of chillaxing Dave, the PM so cockily laidback that he left everything until the last minute, may be a caricature, but my guess is that it will stick.

As it happens, I think Cameron deserves more credit than his critics are prepared to give him. I think it would be easy to present him as a latter-day Baldwin – which I mean largely as a compliment. Like Baldwin, he was a rich provincial Tory who posed as an ordinary family man. Like Baldwin, he offered economic austerity during a period of extraordinary international financial turmoil. Like Baldwin, he governed in coalition while relentlessly squeezing the Liberal vote. Like Baldwin, he presented himself as the incarnation of solid, patriotic common sense; like Baldwin, he was cleverer than his critics thought; like Baldwin, he was often guilty of mind-boggling complacency. The difference is that when Baldwin gambled and lost – as when he called a rash general election in 1923 – he managed to save his career from the ruins. When Cameron gambled and lost, it was all over.

Although I voted Remain, I do not share many commentators’ view of Brexit as an apocalyptic disaster. In any case, given that a narrow majority of the electorate got the result it wanted, at least 17 million people presumably view Cameron’s gamble as a great success – for Britain, if not for him. Unfortunately for Cameron, however, most British academics are left-leaning Remainers, and it is they who will write the history books. What ought also to worry Cameron’s defenders – or his shareholders, to use Roy Jenkins’s metaphor – is that both Chamberlain and Eden ended up being defined by their handling of Britain’s foreign policy. There is a curious paradox here, because foreign affairs almost never matters at the ballot box. In 1959, barely three years after Suez, the Conservatives cruised to an easy re-election victory; in 2005, just two years after invading Iraq, when the extent of the disaster was already apparent, Blair won a similarly comfortable third term in office. Perhaps foreign affairs matters more to historians than it does to most voters. In any case, the lesson seems to be that, if you want to secure your historical reputation, you can get away with mishandling the economy and lengthening the dole queues, but you simply cannot afford to damage Britain’s international standing.

So, if Brexit does turn into a total disaster, Cameron can expect little quarter. Indeed, while historians have some sympathy for Chamberlain, who was, after all, motivated by a laudable desire to avoid war, and even for Eden, who was a sick and troubled man, they are unlikely to feel similar sympathy for an overconfident prime minister at the height of his powers, who seems to have brought his fate upon himself.

How much of this, I wonder, went through David Cameron’s mind in the small hours of that fateful morning of 24 June, as the results came through and his place in history began to take shape before his horrified eyes? He reportedly likes to read popular history for pleasure; he must occasionally have wondered how he would be remembered. But perhaps it meant less to him than we think. Most people give little thought to how they will be remembered after their death, except by their closest friends and family members. There is something insecure, something desperately needy, about people who dwell on their place in history.

Whatever you think about Cameron, he never struck me as somebody suffering from excessive insecurity. Indeed, his normality was one of the most likeable things about him.

He must have been deeply hurt by his failure. But my guess is that, even as his car rolled away from 10 Downing Street for the last time, his mind was already moving on to other things. Most prime ministers leave office bitter, obsessive and brooding. But, like Stanley Baldwin, Cameron strolled away from the job as calmly as he had strolled into it. It was that fatal insouciance that brought him down. 

Dominic Sandbrook is a historian, broadcaster and columnist for the Daily Mail. His book The Great British Dream Factory will be published in paperback by Penguin on 1 September

This article first appeared in the 25 August 2016 issue of the New Statesman, Cameron: the legacy of a loser


A century ago, the Spanish flu killed 100 million people. Is a new pandemic on the way?

Our leaders need to act like the outbreak has already started – because for all we know it may have.

It is hard not to have a sneaking envy of the virus. As complex creatures, we are distracted by myriad demands on our attention; we will never know the dead-eyed focus of the viral world. It is akin to the psychopath: a cold, purposeful drive to achieve its own agenda, coupled with the skills and resourcefulness to succeed. In a world threatened by nuclear war and devastating climate change, it may actually be the virus that we should fear most.

This is the centenary year of the Spanish flu outbreak, when a virus killed between 50 and 100 million people in a matter of months. The devastation was worldwide; it is only known as Spanish flu because Spain, neutral in the ongoing hostilities of World War One, was the only country without press restrictions. Across Europe, people assumed their own outbreaks originated in the only place reporting on the disaster.

A number of authors have lined up with a kind of grim celebration of influenza’s annus mirabilis. As well as chronicling the fatal reach of this organism, they all offer a warning about a follow-up pandemic that is overdue – and for which, it seems, we are largely unprepared. “Somewhere out there a dangerous virus is boiling up in the bloodstream of a bird, bat, monkey, or pig, preparing to jump to a human being,” says Jonathan Quick in The End of Epidemics. “It has the potential to wipe out millions of us, including my family and yours, over a matter of weeks or months.”

If that seems a little schlocky, you should know that Quick is no quack. He is a former director at the WHO, the current chair of the Global Health Council and a faculty member at Harvard Medical School. The book’s blurb includes endorsements from the director of the London School of Hygiene and Tropical Medicine, the president of Médecins Sans Frontières, and the president of the Rockefeller Foundation.

The numbers Quick serves up are stupefying. Bill Gates, for instance, has said it is more likely than not that he will live to see a viral outbreak kill over 10 million people in a year. In Gates’s nightmare scenario, outlined by computer simulations created with disease-modelling experts, 33 million people die within 200 days of the first human infection. The potential for exponential spread means a death toll of 300 million is possible in the first year. “We would be in a world where scrappy, ravaged survivors struggle for life in a zombie-movie wasteland,” Quick tells us in his informed, cogent and – honestly – frightening book.

If you can’t imagine what that is like, you could try asking the Yupik people of Alaska, who were devastated by the 1918 Spanish flu. You might not get an answer, however, because they remain traumatised, and have made a pact not to speak about the pandemic that shattered their ancient culture. (A pandemic is a disease that spreads across continents; an epidemic is usually contained within a country or continent.) They aren’t the only long-term sufferers. The Vanuatu archipelago suffered 90 per cent mortality and 20 of its local languages went extinct. Those in the womb in 1918 were also affected. A baby born in 1919 “was less likely to graduate and earn a reasonable wage, and more likely to go to prison, claim disability benefit, and suffer from heart disease,” reports Laura Spinney in Pale Rider.

Such arresting snippets of the flu’s legacy abound in Spinney’s thoughtful, coherent take on the 1918 outbreak. The book’s subtitle suggests that the Spanish flu changed the world, and Spinney certainly backs this up. Societies broke down and had to be rebuilt; recovering populations were reinvigorated by the simple calculus of Darwin’s “survival of the fittest”; public health provisions were first imagined and then brought into reality; artists and writers responded to a new global mood by establishing new movements.

Not every outcome could be spun as a positive. Scientists, for instance, were humiliated by their inability to halt the flu’s progress, creating an opportunity for quack medicines to arise and establish themselves. Some of our greatest writers lived through the trauma, but could never bring themselves to discuss it in their stories. Virginia Woolf noted that it was “strange indeed that illness has not taken its place with love and battle and jealousy among the prime themes of literature”.

Spinney’s background as a science writer shines through: her handling of the workings of the flu is detailed and deft. She brings both the influenza A virus (the only type responsible for pandemics) and the human immune system to life, laying out the biochemical processes that kill and cure with clarity and care. She exposes the chilling roots of often-used but seldom-explained viral names such as “H1N1” (Spanish flu) or “H5N1” (bird flu). H is for haemagglutinin, the lollipop-shaped appendage that allows a virus to break into a cell and take over the means of production. N is for neuraminidase, the “glass-cutter” structure that allows replicated viruses to break out again and unleash hell upon the host. So far, we know of 18 H’s and 11 N’s and they all have ever-evolving sub-types that make a long-lasting general vaccine against the flu an elusive dream: “Every flu pandemic of the 20th century was triggered by the emergence of a new H in influenza A,” says Spinney.

For all her technical expertise, Spinney has a light touch and a keen eye for the comic. She relates how a ferret sneezing in the face of a British researcher in 1933 exposed influenza’s ability to travel between biological species, for instance. She also excels with the bigger picture, detailing the century of scientific detective work that has allowed us to piece together the genetic elements of the 1918 virus and gain insights into its creation. It seems to have jumped to humans on a farm in Kansas, via domestic and wild birds indigenous to North America. There may have been some ingredients from pigs, too, but that’s not settled.

Spinney’s afterword questions whether our collective memory for such events ever reflects the truth of the moment. “When the story of the Spanish flu was told, it was told by those who got off most lightly: the white and well off,” she tells us. “With very few exceptions, the ones who bore the brunt of it, those living in ghettoes or at the rim, have yet to tell their tale. Some, such as the minorities whose languages died with them, never will.”

That said, Catharine Arnold has done a remarkable job of relating the tales of a diverse set of sufferers, crafting an arresting and intimate narrative of the 1918 pandemic. She pulls the accounts of hundreds of victims into a gripping tale that swoops down into the grisly detail, then soars up to give a broad view over the landscape of this calamitous moment in human history.

Arnold’s remembrances come from the unknown and from celebrities. A Margery Porter from south London emphasised that “we just couldn’t stand up. Your legs actually gave way, I can’t exaggerate that too much.” John Steinbeck described the experience of infection as almost spiritual. “I went down and down,” he said, “until the wingtips of angels brushed my eyes.”

The reality was, inevitably, less poetic. A local surgeon removed one of Steinbeck’s ribs so that he could gain access to the author’s infected lung. Most victims’ bodies turned blue-black as they died. Healthcare workers reported appalling scenes, with delirious patients suffering horrific nosebleeds. “Sometimes the blood would just shoot across the room,” a navy nurse recalled. If their lungs punctured, the patients’ bodies would fill with air. “You would feel somebody and he would be bubbles… When their lungs collapsed, air was trapped beneath their skin. As we rolled the dead in winding sheets, their bodies crackled – an awful crackling noise which sounded like Rice Krispies when you pour milk over them.”

The killer in 1918 was often not the flu virus itself but the “cytokine storm” of an immune system overreacting to the infection. Strong, fit young people, with their efficient immune systems, were thus particularly at risk, their bodies effectively shutting themselves down. Then there were the ravages of opportunistic bacteria that would lodge in the devastated tissue, causing pneumonia and other fatal complications. Arnold paints a grim but vivid picture of exhausted gravediggers and opportunistic funeral directors cannily upping their prices. The morgues were overflowing, and morticians worked day and night. In the end, mass graves were the only answer for the poverty-stricken workers attempting to bury their loved ones before they, too, succumbed.

No one was spared from grief or suffering at the hands of the “Spanish Lady”, as the flu came to be known. Louis Brownlow, the city commissioner for Washington DC, reported nursing his stricken wife while answering telephone calls from desperate citizens. One woman called to say that of the three girls she shared a room with, two had died, and the third was on her way out. Brownlow sent a police officer to the house. A few hours later, the sergeant reported back from the scene: “Four girls dead.”

Some of the other stories Arnold has unearthed are equally heartbreaking. A Brooklyn boy called Michael Wind wrote of the moment his mother died after less than a day of being ill. He and his five siblings were at her bedside, as was their father, “head in hands, sobbing bitterly”. The following morning, knowing that he was soon to die too, their father took the three youngest children to the orphanage.

Arnold writes beautifully, and starkly, of the tragedy that unfolded in the autumn months of 1918: “the Spanish Lady played out her death march, killing without compunction. She did not discriminate between statesmen, painters, soldiers, poets, writers or brides.” She chronicles the Lady’s path from the United States and Canada through Europe, Africa and Asia, culminating in New Zealand’s “Black November”. The book is utterly absorbing. But how do we respond to its horrors and tragedies? What are we to do with our collective memories of such visceral, world-shattering events? Learn from them – and fast, argues Jonathan Quick.

Unlike Arnold and Spinney, Quick is not content to be a chronicler or a bystander. He is, he says, both terrified at the looming disaster and furious at the lack of high-level reaction to its threat. He is determined to create a movement that will instigate change, mimicking the way activists forced change from governments paralysed by, and pharmaceutical companies profiteering from, the Aids pandemic. Quick has channelled his fury: The End of Epidemics is, at heart, a call to arms against influenza, Ebola, Zika and the many other threats before us.


So what are we to do? First, our leaders need to act like the outbreak has already started – because for all we know it may have. We must strengthen our public health systems, and create robust agencies and NGOs ready to monitor and deal with the threat. We must educate citizens and implement surveillance, prevention and response mechanisms, while fighting misinformation and scaremongering. Governments must step up (and fund) research.

We can’t develop a vaccine until the threat is manifest, but we can prepare technology for fast large-scale production. We can also invest in methods of early diagnosis and virus identification. Invest $1 per person per year for 20 years and the threat will be largely neutralised, Quick suggests. Finally – and most importantly – there is an urgent need to create grass-roots support for these measures: citizen groups and other organisations that will hold their leaders to account and prevent death on a scale that no one alive has ever experienced. Is this achievable? Traumatised readers of Quick’s book will be left hoping that it is.

For all the advances of the last century, there are many unknowns. Scientists don’t know, for instance, which microbe will bring the next pandemic, where it will come from, or whether it will be transmitted through the air, by touch, through body fluids or through a combination of routes.

While there is considerable attention focused on communities in West Africa, East Asia or South America as the most likely source of the next outbreak, it’s worth remembering that most scientists now believe the 1918 influenza outbreak began on a farm in Kansas. Quick suggests the next pandemic might have a similar geographical origin, thanks to the industrialised livestock facilities beloved by American food giants.

Viruses naturally mutate and evolve rapidly, taking up stray bits of genetic material wherever they can be found. But it’s the various flu strains that live inside animals that bring sleepless nights to those in the know. They can exist inside a pig, bat or chicken without provoking symptoms, but prove devastating if (when) they make the jump to humans. As more and more humans live in close proximity to domesticated animals, encroach on the territories inhabited by wild animals, and grow their food on unprecedented scales, the chances of an uncontrollable epidemic increase.

The meat factories known as “Concentrated Animal Feeding Operations” (CAFOs) are particularly problematic. They provide cheap meat, poultry, dairy and eggs from animals kept in what Quick terms “concentration camp conditions”, simultaneously creating the perfect breeding ground for new and dangerous pathogens. Pigs, he points out, eat almost everything, so their guts are the perfect mixing bowls for a new and deadly influenza strain. “CAFOs were the birthplace of swine flu, and they could very likely be the birthplace of the next killer pandemic,” Quick warns.

There are other possibilities, though – bioterror, for instance. Bill Gates is among those who have warned that terrorist groups are looking into the possibility of releasing the smallpox virus in a crowded market, or on a plane. Then there is the possibility of a scientist’s mistake. In 1978 a woman died after smallpox was released from a laboratory at the University of Birmingham, UK. In 2004 two Chinese researchers accidentally infected themselves with the SARS virus and spread it to seven other people, one of whom died. In 2014, a cardboard box full of forgotten vials of smallpox was found in a National Institutes of Health facility in Bethesda, Maryland. A year later, the US military accidentally shipped live anthrax spores to labs in the US and a military base in South Korea. It’s not impossible that human error could strike again – with catastrophic results.

Such possibilities lie behind our discomfort with what scientists have to do to further our understanding. Researchers in Rotterdam, for instance, wanted to know whether the deadly H5N1 bird flu could develop a capacity for airborne transmission like the common cold virus. Having failed to modify its genetics to achieve this, they began to pass an infection between ferrets, the animals whose response to the virus most mimics that of humans. Ten ferrets later, healthy animals were catching the virus from the cage next door. Knowing how easily H5N1 can become airborne is exactly the kind of discovery that will bolster our vigilance. It is, after all, many times more fatal than the H1N1 strain that caused the Spanish flu. At the same time, there was a huge – but understandable – furore over whether the research should be published, and thus be available to potential bioterrorists.

We might have to live with such dilemmas, because it is important to be ready to challenge the killer virus when it arrives. As we have seen with Aids and the common cold, developing vaccines takes time, and there is no guarantee of success, even with a concerted research effort.

***

Will we be ready? Quick suggests that our best chance lies in the world’s business leaders realising what’s at stake: economies would be devastated by the next pandemic. In 1918, Arnold points out, the British government was telling citizens it was their patriotic duty to “carry on” and make sure the wheels of industry kept turning. The result was a perfect environment for mass infection. Political leaders made similar mistakes across the Atlantic: on 12 October President Wilson led a gathering of 25,000 New Yorkers down the “Avenue of the Allies”. “That same week,” Arnold reports, “2,100 New Yorkers died of influenza.”

It’s worth noting that Spanish flu did not abate because we outsmarted it. The pandemic ended because the virus ran out of people it could infect. Of those who didn’t die, some survived through a chance natural immunity, and some were lucky enough to have maintained a physical separation from those carrying the invisible threat. The virus simply failed to kill the rest, enabling their bodies to develop the antibodies required to repel a further attack. A generation or two later, when the antibody-equipped immune systems were in the grave, and humans were immunologically vulnerable (and complacent) once again, the H1N1 virus re-emerged, causing the 2009 swine flu outbreak.

As these books make clear, this is a history that could repeat all too easily in our time. Of the three, Pale Rider is perhaps the most satisfying. It has greater complexity and nuance than Arnold’s collection of harrowing tales, fascinating though they are. Spinney’s analysis is more circumspect and thus less paralysing than Quick’s masterful exposition of our precarious situation. But the truth is we need all these perspectives, and probably more, if we are to avoid sleepwalking into the next pandemic. Unlike our nemesis, humans lack focus – and it could be our undoing. 

Michael Brooks’s most recent book is “The Quantum Astrologer’s Handbook” (Scribe)

Pale Rider: The Spanish Flu of 1918 and How it Changed the World
Laura Spinney
Vintage, 352pp, £25

Pandemic 1918: The Story of the Deadliest Influenza in History
Catharine Arnold
Michael O’Mara, 368pp, £20

The End of Epidemics
Jonathan D Quick with Bronwyn Fryer
Scribe, 288pp, £14.99
