
The suburb that changed the world

In the 1980s, Silicon Valley was populated by lefties and hippies who dreamed of a computer revolution

In Sofia Coppola's 2006 film of the life of Marie Antoinette, there is a scene where an entourage of palace jeunes filles sweeps through a ball at which the set and costumes are period, but the music and manners are straight out of a modern dance club. The proposition seems to be that an elite few were able to put a toe into the future to experience what is ordinary today.

Something like that went on in the Silicon Valley I knew in the 1980s. The debates and dilemmas that occupy a generation today appeared in miniature before there was an internet. We took our anticipation of the internet deadly seriously, to the point where it seemed already real. Thus I have experienced the internet age twice.

Experiencing the internet in reality is different - and even bizarre, because although it seemed reasonable to expect the thing to come about, it is still uncanny that the reasoning was right. It feels as though we got away with something we shouldn't have done.

The internet arrived from two directions, one top-down and the other bottom-up. Initially computers and computer networking were both developed in military and government labs. The way you experienced computation from the 1960s often reflected this point of origin, with early computer companies such as IBM exuding a grey, regimented stoniness in order to appear seductive to their patrons.

In the 1970s, a small market emerged for hobbyist computers. You could build your own little box with blinking lights that you could program by flipping lines of switches on the front panel. That's all you could do at first, but oh, the ecstasy to be able to touch your own computer, if you had an inkling of where it all could lead.

A culture grew up around these hobbyist machines centred in Silicon Valley, and spawned the personal computer market - with Microsoft launching in 1975 and Apple in 1976. The centre of gravity split: the stony grey opposite delirious hippies and faux revolutionaries.

The turbulent confluence between top-down and bottom-up continues to this day. Internet start-ups sprout like garage bands. Most die, but a few explode into national-scale empires, as in the case of Facebook. Dreary top-down institutions such as wireless carriers maintain their lofty entitlements, though occasionally they drain away, like the old music business. I used to be partisan, favouring the bottom-up approach, but now I appreciate the balance of tides, because all kinds of power should be checked.

My first encounter with Silicon Valley was at the end of my teens, which was also the end of the 1970s. The world seemed carved into zones according to the degree of magic available. The highest magic was found in nexuses of hippie exuberance such as the beach town of Santa Cruz, California, where pearlescent rainbows covered everything and even the most mediocre musicians could effortlessly invent melodies superior to almost anything heard since. Young, creative people with any sense of ambition tended to be drawn to these places like weight to gravity, but by the time I arrived the magic was receding.

The overwhelming explanation we held of our time and place was that we had been born too late to experience the one true orgasm of meaning, the 1960s. Young people who felt jilted by life because of a slight error in timing found solace in a twisted calculus of punk humour. An alternative to the Santa Cruz-type El Dorados of bohemia were the zones of brazen, barren reality: remote and violent desert towns, impoverished villages in Mexico, or tenements in New York City.

The most deficient places - condemned by hippies and punks alike - were the suburbs, the places of the conventional parent: an artificial world ruled by Disney and McDonald's.

I did not arrive at this suspect ontology naturally, having grown up in a way that was both gritty and bohemian. My father and I couldn't afford a home at one point, when I was 11, so we lived in tents on cheap land while building a crazed, geometric, spaceship-like house in a rough corner of southern New Mexico. I adapted to the flight from the suburbs because this seemed the ticket into the social world of my peers in that era. I well remember how my heart sank when I later realised that economic circumstances left me no choice but to force my old jalopy over the mountain pass that insulated dewy, arousing Santa Cruz from soul-killing, blandifying Silicon Valley, which was situated in, of all places, a suburb.

The mountain ridge that separates Silicon Valley and the town of Palo Alto from the ocean keeps out the famed fog of northern California in the summer. This has always made it an elite getaway from San Francisco, but to me Silicon Valley's light looked incomplete and made me feel remote and depressed - so close to the ocean, but without its full light.

I despaired at the time that I had failed to earn enough to be able to remain at the fulcrum of hippie truth, but I was to learn, slowly, that I was moving from one narcissistic category war to another. Instead of hippies v suburbs, I enlisted in the turf war between nerds and - well, the opposite doesn't have a name. A sort of muggle: the fool who doesn't realise that he lives in a cocoon and serves only as a battery to power the action; a person who fails to understand that the world is an information system, and that life is programming.

Having moved from one kind of nonsense to another eventually helped me learn to be sceptical of both.

Palo Alto was nicknamed "Shallow Alto" by the hippie hackers, who felt that living there was a sell-out, a sign of failure. And yet, one by one, we gave in and entered an alternate, infinitely better-funded elite club. The place was much more than a suburb, naturally. A little more than a century earlier, there had been a Native American culture there, but it was murdered and erased, so little more can be said. Layers of mutually indifferent histories were then overlaid on to this, awaiting the final washout by Silicon Valley culture.

A trace of the Spanish colonial period remained in the odd old adobe mansion; evidence of black immigration from earlier in the 20th century lay in the shocking, violent twin to Palo Alto, East Palo Alto; fruit orchards swept to the horizon in some directions and utilitarian grids of simple wooden buildings testified to the well-ordered conception of railroad towns and military bases.

But the hackers would take over. What a strange society nerds make. In 1996 Oliver Sacks published a book called The Island of the Colour-blind, about a place where so many people cannot see colour that it becomes the norm. In the same way, the society of computer nerds is nerdy not in comparison to a centre, but as a centre. Our nerdy world, which from an outsider's perspective might seem slightly askew, even tilted a touch into Asperger's syndrome, was and is our centre. The rest of the world seemed hysterical, irrational and confused by the surface aesthetics of things, somehow failing to grasp the numerical, causal, core truth underpinning events and the problem-solving purpose of reality.

I kept my concerns about the light of Palo Alto to myself and "passed", which was, happily, not hard for me. Certain kinds of math and programming come on strongest when you're young, and I could program the hell out of a computer in those days. Then and now, technical credibility is the ultimate membership card in Silicon Valley, and it is one of the reasons I still love the place. The billionaire company starters - and I won't name names because it's all of them - still get a little insecure and feel a need to preen when they're around top hackers.

The overlap between the late stages of hippie bohemia and the early incarnations of Silicon Valley was often endearing. There was a sense of justice in the way that males who had been at the bottom of the social ladder in high school were on track to run the world. Greasy cottages with futons on the floor, with dustings of pot and cookie crumbles rubbed into cheap oriental rugs, a carnage of forgotten dirty clothes in the corner, empty refrigerators and tangles of thick grey cables leading to the huge computer monitors and the hot metal cabinets where the silicon chips crunched. Asymmetrical, patchy beards, shirts part tucked, prescriptions for glasses powerful enough to find life on a distant planet. This was the new model of hippie nerd, supplanting the ascetic fellow with the pocket protector.

There were precious few girl nerds at the time. There was one who programmed a hit arcade game called Centipede for the first video game company, Atari, and a few others. There were, however, extraordinary female figures who served as the impresarios of social networking before there was an internet. It still seems wrong to name them, because it isn't clear if I would be talking about their private lives or their public contributions: I don't know how to draw a line.

These irresistible creatures would sometimes date alpha nerds, but mostly brought the act of socialising into a society where it probably would not have occurred otherwise. A handful of them had an extraordinary, often unpaid degree of influence over what research was done, which companies came to be, who worked at them and what products were developed.

That they are usually undescribed in histories of Silicon Valley is just another instance of what a fiction history can be. The advent of social networking software and oceans of digital memories of bits exchanged between people has only shifted the type of fiction we accept, not the degree of infidelity.

In retrospect, I cringe to think how naive and messianic the tech scene became amid all the post-1960s idealism. The two poles of San Francisco Bay Area 1960s culture - psychedelic hippies and leftist revolutionaries - became the poles of early computer culture.

In 1974, the philosopher Ted Nelson, the first person to propose and describe the programming of something like the web, published a large-format book composed of montages of nearly indecipherable small-print snippets flung in all directions, called Computer Lib/Dream Machines. If you turned the book one way, it was what Che Guevara would have been reading in the jungle if he had been a computer nerd. Flip it upside down, and you had a hippie-wow book with visions of crazy, far-out computation.

In fact, the very first description of the internet in any detail was probably E M Forster's The Machine Stops from 1909, decades before computers existed: "People never touched one another. The custom had become obsolete, owing to the Machine." It might still be the most accurate description. How Forster did it remains a mystery. Later, in the 1940s, the engineer Vannevar Bush wrote "As We May Think", an essay imagining a utilitarian experience with a computer and internet of the future. Bush's essay is often cited as a point of origin, and he even delved a little into how it might work, using such pre-digital components as microfilm.

But Ted Nelson was the first person, to my knowledge, to describe how you could implement new kinds of media in digital form, share them and collaborate. Ted was working so early - from 1960 onwards - that he couldn't invoke basic notions such as storing images, and not just text, because computer graphics had not been described yet. (The computer scientist Ivan Sutherland saw to that shortly.)

Ted was a talker, a character, a Kerouac. He was more writer than hacker, and didn't always fit into the nerd milieu. Thin, lanky, with a sharp chin and always a smile, he looked good. He came from Hollywood parents and was determined to be an outsider because, in the ethics of the times, only the outsiders were "where it's at". He succeeded tragically, in that he is not as well known as he ought to be, and it's a great shame he was not better able to influence digital architecture directly. He lives today on a houseboat in Sausalito, California, one of the other luminous, numinous nodes of Bay Area geo-mythology.

The hippest thing in the late 1970s and early 1980s was to form a commune, or even a cult. I remember one around the Haight-Ashbury neighbourhood of San Francisco which fashioned itself as the Free Print Shop. Members printed lovely posters for "movement" events in the spectral, inebriated, neo-Victorian visual style of the time. (How strange it was to hear someone recommended as "part of the movement". This honorary title meant nothing beyond aesthetic sympathy, but there was an infantile gravity to the word "movement", as though our conspiracies were consequential. They never were, except when computers were involved, in which case they were more consequential than almost any others in history.)

The Free Print Shop made money doing odd jobs, it included women and it enacted a formal process for members to request sex with one another through intermediaries. This was the sort of thing that seemed the way of the future and beckoned to computer nerds: an algorithm leading reliably to sex! I remember how reverently dignitaries from the Free Print Shop were welcomed at a meeting of the Homebrew Club at Stanford and other such venues where computer hobbyists shared their creations.

Ted had a band of followers or collaborators; it would have been uncool to specify what they were. They sometimes lived in a house here or there, or vagabonded about. They broke up and reconciled repeatedly, and were perpetually on the verge of presenting the ultimate software project, Xanadu, in some formulation that would have been remembered as the first implementation of the internet. Xanadu was a manifesto that never quite manifested.

If my tone has not been consistently reverent, please know that I am not cynical when it comes to my praise of Ted Nelson's ideas. As the first person on the scene, he benefited from an uncluttered view. Our huge collective task in finding the best future for the internet will probably turn out to be like finding our way back to where Ted was at the start.

In his conception, each person would be a free agent in a universal online market. Instead of separate stores of the kind run by Apple or Amazon, there would be one universal store, and everyone would be a first-class citizen, both buyer and seller. You wouldn't have to keep separate passwords or accounts for different online stores. That's a pain, and it guarantees that there can't be too many stores, thereby re-creating the kind of centralisation that shouldn't be inherited from physical reality.

This is an example of how thinking in terms of a network can strain intuition. It might seem as though having only one store would reduce diversity, yet it would increase it. When culture is privatised, as has happened recently online, you end up with a few giant players - the Googles and Amazons. It's better to put up with the rancour and pain of a single community, of some form of democracy, than to live in a world overseen by a few forces you hope will be benevolent. The stress of accommodation opens cracks from which brilliance emerges.

Ah, there it is - my idealism, still in your face after all these years. Silicon Valley remains idealistic, if sometimes narcissistic. We refer to uprisings in the Middle East as "Facebook revolutions" as if it's all about us. And yet, look. We code and scheme through the night, and then genuinely change the whole world within a few short years, over and over again. What other bunch of oddballs can say that?

Much has changed. Silicon Valley now belongs to the world. In a typical nerd cabal you will find recently arrived Indians, Chinese, Brits, Israelis and Russians. What is strangest in the recent waves of young arrivals in Silicon Valley is that they tend no longer to be downtrodden geniuses rejected in the playing of social status games, but sterling alpha males. Legions of perfect specimens seem to have grown up in manicured childhoods, nothing scrappy about them. When children started to be raised perfectly in the 1990s, chauffeured from one play date to the next, I wondered what world they would want as adults. Socialism? Facebook and similar designs seem to me continuations of the artificial order we gave children during the boom years.

Now we are entering a period of diminishing middle classes and economic dimming. What will Silicon Valley make of this? Poorly conceived computer networks played central roles in many of our more recent troubles, particularly the 2008 financial crisis. Such tactics as high-frequency trading just pluck money out of the system using pure computation, without giving anything back.

Can we adjust the world, make it happier, merely by reprogramming computers? Perhaps. We continue to twiddle with human patterns from our weird suburb. Maybe, if we are able to echo the ancient idealism of those early days, we will do some good as the software grows.

Jaron Lanier is the author of "You Are Not a Gadget: a Manifesto" (Penguin, £9.99)

This article first appeared in the 15 August 2011 issue of the New Statesman, The coming anarchy


A century ago, the Spanish flu killed 100 million people. Is a new pandemic on the way?

Our leaders need to act like the outbreak has already started – because for all we know it may have.

It is hard not to have a sneaking envy of the virus. As complex creatures, we are distracted by myriad demands on our attention; we will never know the dead-eyed focus of the viral world. It is akin to the psychopath: a cold, purposeful drive to achieve its own agenda, coupled with the skills and resourcefulness to succeed. In a world threatened by nuclear war and devastating climate change, it may actually be the virus that we should fear most.

This is the centenary year of the Spanish flu outbreak, when a virus killed between 50 and 100 million people in a matter of months. The devastation was worldwide; it is only known as Spanish flu because Spain, neutral in the ongoing hostilities of World War One, was the only country without press restrictions. Across Europe, people assumed their own outbreaks originated in the only place reporting on the disaster.

A number of authors have lined up with a kind of grim celebration of influenza’s annus mirabilis. As well as chronicling the fatal reach of this organism, they all offer a warning about a follow-up pandemic that is overdue – and for which, it seems, we are largely unprepared. “Somewhere out there a dangerous virus is boiling up in the bloodstream of a bird, bat, monkey, or pig, preparing to jump to a human being,” says Jonathan Quick in The End of Epidemics. “It has the potential to wipe out millions of us, including my family and yours, over a matter of weeks or months.”

If that seems a little schlocky, you should know that Quick is no quack. He is a former director at the WHO, the current chair of the Global Health Council and a faculty member at Harvard Medical School. The book's blurb includes endorsements from the director of the London School of Hygiene and Tropical Medicine, the president of Médecins Sans Frontières, and the president of the Rockefeller Foundation.

The numbers Quick serves up are stupefying. Bill Gates, for instance, has said it is more likely than not that he will live to see a viral outbreak kill over 10 million people in a year. In Gates’s nightmare scenario, outlined by computer simulations created with disease-modelling experts, 33 million people die within 200 days of the first human infection. The potential for exponential spread means a death toll of 300 million is possible in the first year. “We would be in a world where scrappy, ravaged survivors struggle for life in a zombie-movie wasteland,” Quick tells us in his informed, cogent and – honestly – frightening book.

If you can’t imagine what that is like, you could try asking the Yupik people of Alaska, who were devastated by the 1918 Spanish flu. You might not get an answer, however, because they remain traumatised, and have made a pact not to speak about the pandemic that shattered their ancient culture. (A pandemic is a disease that spreads across continents; an epidemic is usually contained within a country or continent.) They aren’t the only long-term sufferers. The Vanuatu archipelago suffered 90 per cent mortality and 20 of its local languages went extinct. Those in the womb in 1918 were also affected. A baby born in 1919 “was less likely to graduate and earn a reasonable wage, and more likely to go to prison, claim disability benefit, and suffer from heart disease,” reports Laura Spinney in Pale Rider.

Such arresting snippets of the flu’s legacy abound in Spinney’s thoughtful, coherent take on the 1918 outbreak. The book’s subtitle suggests that the Spanish flu changed the world, and Spinney certainly backs this up. Societies broke down and had to be rebuilt; recovering populations were reinvigorated by the simple calculus of Darwin’s “survival of the fittest”; public health provisions were first imagined and then brought into reality; artists and writers responded to a new global mood by establishing new movements.

Not every outcome could be spun as a positive. Scientists, for instance, were humiliated by their inability to halt the flu’s progress, creating an opportunity for quack medicines to arise and establish themselves. Some of our greatest writers lived through the trauma, but could never bring themselves to discuss it in their stories. Virginia Woolf noted that it was “strange indeed that illness has not taken its place with love and battle and jealousy among the prime themes of literature”.

Spinney’s background as a science writer shines through: her handling of the workings of the flu is detailed and deft. She brings both the influenza A virus (the only type responsible for pandemics) and the human immune system to life, laying out the biochemical processes that kill and cure with clarity and care. She exposes the chilling roots of often-used but seldom-explained viral names such as “H1N1” (Spanish flu) or “H5N1” (bird flu). H is for haemagglutinin, the lollipop-shaped appendage that allows a virus to break into a cell and take over the means of production. N is for neuraminidase, the “glass-cutter” structure that allows replicated viruses to break out again and unleash hell upon the host. So far, we know of 18 H’s and 11 N’s and they all have ever-evolving sub-types that make a long-lasting general vaccine against the flu an elusive dream: “Every flu pandemic of the 20th century was triggered by the emergence of a new H in influenza A,” says Spinney.

For all her technical expertise, Spinney has a light touch and a keen eye for the comic. She relates how a ferret sneezing in the face of a British researcher in 1933 exposed influenza’s ability to travel between biological species, for instance. She also excels with the bigger picture, detailing the century of scientific detective work that has allowed us to piece together the genetic elements of the 1918 virus and gain insights into its creation. It seems to have jumped to humans on a farm in Kansas, via domestic and wild birds indigenous to North America. There may have been some ingredients from pigs, too, but that’s not settled.

Spinney’s afterword questions whether our collective memory for such events ever reflects the truth of the moment. “When the story of the Spanish flu was told, it was told by those who got off most lightly: the white and well off,” she tells us. “With very few exceptions, the ones who bore the brunt of it, those living in ghettoes or at the rim, have yet to tell their tale. Some, such as the minorities whose languages died with them, never will.”

That said, Catharine Arnold has done a remarkable job of relating the tales of a diverse set of sufferers, crafting an arresting and intimate narrative of the 1918 pandemic. She pulls the accounts of hundreds of victims into a gripping tale that swoops down into the grisly detail, then soars up to give a broad view over the landscape of this calamitous moment in human history.

Arnold’s remembrances come from the unknown and from celebrities. A Margery Porter from south London emphasised that “we just couldn’t stand up. Your legs actually gave way, I can’t exaggerate that too much.” John Steinbeck described the experience of infection as almost spiritual. “I went down and down,” he said, “until the wingtips of angels brushed my eyes.”

The reality was, inevitably, less poetic. A local surgeon removed one of Steinbeck’s ribs so that he could gain access to the author’s infected lung. Most victims’ bodies turned blue-black as they died. Healthcare workers reported appalling scenes, with delirious patients suffering horrific nosebleeds. “Sometimes the blood would just shoot across the room,” a navy nurse recalled. If their lungs punctured, the patients’ bodies would fill with air. “You would feel somebody and he would be bubbles… When their lungs collapsed, air was trapped beneath their skin. As we rolled the dead in winding sheets, their bodies crackled – an awful crackling noise which sounded like Rice Krispies when you pour milk over them.”

The killer in 1918 was often not the flu virus itself but the “cytokine storm” of an immune system overreacting to the infection. Strong, fit young people, with their efficient immune systems, were thus particularly at risk, their bodies effectively shutting themselves down. Then there were the ravages of opportunistic bacteria that would lodge in the devastated tissue, causing pneumonia and other fatal complications. Arnold paints a grim but vivid picture of exhausted gravediggers and opportunistic funeral directors cannily upping their prices. The morgues were overflowing, and morticians worked day and night. In the end, mass graves were the only answer for the poverty-stricken workers attempting to bury their loved ones before they, too, succumbed.

No one was spared from grief or suffering at the hands of the “Spanish Lady”, as the flu came to be known. Louis Brownlow, the city commissioner for Washington DC, reported nursing his stricken wife while answering telephone calls from desperate citizens. One woman called to say that of the three girls she shared a room with, two had died, and the third was on her way out. Brownlow sent a police officer to the house. A few hours later, the sergeant reported back from the scene: “Four girls dead.”

Some of the other stories Arnold has unearthed are equally heartbreaking. A Brooklyn boy called Michael Wind wrote of the moment his mother died after less than a day of being ill. He and his five siblings were at her bedside, as was their father, “head in hands, sobbing bitterly”. The following morning, knowing that he was soon to die too, their father took the three youngest children to the orphanage.

Arnold writes beautifully, and starkly, of the tragedy that unfolded in the autumn months of 1918: “the Spanish Lady played out her death march, killing without compunction. She did not discriminate between statesmen, painters, soldiers, poets, writers or brides.” She chronicles the Lady’s path from the United States and Canada through Europe, Africa and Asia, culminating in New Zealand’s “Black November”. The book is utterly absorbing. But how do we respond to its horrors and tragedies? What are we to do with our collective memories of such visceral, world-shattering events? Learn from them – and fast, argues Jonathan Quick.

Unlike Arnold and Spinney, Quick is not content to be a chronicler or a bystander. He is, he says, both terrified at the looming disaster and furious at the lack of high-level reaction to its threat. He is determined to create a movement that will instigate change, mimicking the way activists forced change from governments paralysed by, and pharmaceutical companies profiteering from, the Aids pandemic. Quick has channelled his fury: The End of Epidemics is, at heart, a call to arms against influenza, Ebola, Zika and the many other threats before us.


So what are we to do? First, our leaders need to act like the outbreak has already started – because for all we know it may have. We must strengthen our public health systems, and create robust agencies and NGOs ready to monitor and deal with the threat. We must educate citizens and implement surveillance, prevention and response mechanisms, while fighting misinformation and scaremongering. Governments must step up (and fund) research.

We can’t develop a vaccine until the threat is manifest, but we can prepare technology for fast large-scale production. We can also invest in methods of early diagnoses and virus identification. Invest $1 per person per year for 20 years and the threat will be largely neutralised, Quick suggests. Finally – and most importantly – there is an urgent need to create grass-roots support for these measures: citizen groups and other organisations that will hold their leaders to account and prevent death on a scale that no one alive has ever experienced. Is this achievable? Traumatised readers of Quick’s book will be left hoping that it is.

For all the advances of the last century, there are many unknowns. Scientists don’t know, for instance, which microbe will bring the next pandemic, where it will come from, or whether it will be transmitted through the air, by touch, through body fluids or through a combination of routes.

While there is considerable attention focused on communities in West Africa, East Asia or South America as the most likely source of the next outbreak, it’s worth remembering that most scientists now believe the 1918 influenza outbreak began on a farm in Kansas. Quick suggests the next pandemic might have a similar geographical origin, thanks to the industrialised livestock facilities beloved by American food giants.

Viruses naturally mutate and evolve rapidly, taking up stray bits of genetic material wherever they can be found. But it’s the various flu strains that live inside animals that bring sleepless nights to those in the know. They can exist inside a pig, bat or chicken without provoking symptoms, but prove devastating if (when) they make the jump to humans. As more and more humans live in close proximity to domesticated animals, encroach on the territories inhabited by wild animals, and grow their food on unprecedented scales, our chances of an uncontrollable epidemic increase.

The meat factories known as “Concentrated Animal Feeding Operations” (CAFOs) are particularly problematic. They provide cheap meat, poultry, dairy and eggs from animals kept in what Quick terms “concentration camp conditions”, simultaneously creating the perfect breeding ground for new and dangerous pathogens. Pigs, he points out, eat almost everything, so their guts are the perfect mixing bowls for a new and deadly influenza strain. “CAFOs were the birthplace of swine flu, and they could very likely be the birthplace of the next killer pandemic,” Quick warns.

There are other possibilities, though – bioterror, for instance. Bill Gates is among those who have warned that terrorist groups are looking into the possibility of releasing the smallpox virus in a crowded market, or on a plane. Then there is the possibility of a scientist’s mistake. In 1978 a woman died after smallpox was released from a laboratory at the University of Birmingham, UK. In 2004 two Chinese researchers accidentally infected themselves with the SARS virus and spread it to seven other people, one of whom died. In 2014, a cardboard box full of forgotten vials of smallpox was found in a National Institutes of Health facility in Bethesda, Maryland. A year later, the US military accidentally shipped live anthrax spores to labs in the US and a military base in South Korea. It’s not impossible that human error could strike again – with catastrophic results.

Such possibilities lie behind our discomfort with what scientists have to do to further our understanding. Researchers in Rotterdam, for instance, wanted to know whether the deadly H5N1 bird flu could develop a capacity for airborne transmission like the common cold virus. Having failed to modify its genetics to achieve this, they began to pass an infection between ferrets, the animals whose response to the virus most mimics that of humans. Ten ferrets later, healthy animals were catching the virus from the cage next door. Knowing how easily H5N1 can become airborne is exactly the kind of discovery that will bolster our vigilance. It is, after all, many times more fatal than the H1N1 strain that caused the Spanish flu. At the same time, there was a huge – but understandable – furore over whether the research should be published, and thus be available to potential bioterrorists.

We might have to live with such dilemmas, because it is important to be ready to challenge the killer virus when it arrives. As we have seen with Aids and the common cold, developing vaccines takes time, and there is no guarantee of success, even with a concerted research effort.

****

Will we be ready? Quick suggests that our best chance lies in the world’s business leaders realising what’s at stake: economies would be devastated by the next pandemic. In 1918, Arnold points out, the British government was telling citizens it was their patriotic duty to “carry on” and make sure the wheels of industry kept turning. The result was a perfect environment for mass infection. Political leaders made similar mistakes across the Atlantic: on 12 October President Wilson led a gathering of 25,000 New Yorkers down the “Avenue of the Allies”. “That same week,” Arnold reports, “2,100 New Yorkers died of influenza.”

It’s worth noting that Spanish flu did not abate because we outsmarted it. The pandemic ended because the virus ran out of people it could infect. Of those who didn’t die, some survived through a chance natural immunity, and some were lucky enough to have maintained a physical separation from those carrying the invisible threat. The virus simply failed to kill the rest, enabling their bodies to develop the antibodies required to repel a further attack. A generation or two later, when the antibody-equipped immune systems were in the grave, and humans were immunologically vulnerable (and complacent) once again, H1N1 virus re-emerged, causing the 2009 swine flu outbreak.

As these books make clear, this is a history that could repeat all too easily in our time. Of the three, Pale Rider is perhaps the most satisfying. It has greater complexity and nuance than Arnold’s collection of harrowing tales, fascinating though they are. Spinney’s analysis is more circumspect and thus less paralysing than Quick’s masterful exposition of our precarious situation. But the truth is we need all these perspectives, and probably more, if we are to avoid sleepwalking into the next pandemic. Unlike our nemesis, humans lack focus – and it could be our undoing. 

Michael Brooks’s most recent book is “The Quantum Astrologer’s Handbook” (Scribe)

Pale Rider: The Spanish Flu of 1918 and How it Changed the World
Laura Spinney
Vintage, 352pp, £25

Pandemic 1918: The Story of the Deadliest Influenza in History
Catharine Arnold
Michael O’Mara, 368pp, £20

The End of Epidemics
Jonathan D Quick with Bronwyn Fryer
Scribe, 288pp, £14.99

