How quantum computing will change the world

We are on the cusp of a new era of computing, with Google, IBM and other tech companies using a theory launched by Planck and Einstein to build machines capable of tackling seemingly impossible tasks.

In 1972, at the age of ten, I spent a week somewhere near Windsor – it’s hazy now – learning how to program a computer. This involved writing out instructions by hand and sending the pages to unseen technicians who converted them into stacks of cards punched with holes. The cards were fed overnight into a device that we were only once taken to see. It filled a room; magnetic tape spooled behind glass panels in big, grey, wardrobe-sized boxes. The next morning, we’d receive a printout of the results and the day would be spent finding the programming faults that had derailed our calculations of pi to the nth decimal place.

There was awed talk of computer experts who worked at an even rawer level of abstraction, compiling programs (no one called it coding then) in the opaque, hieroglyphic notation of “machine code”. Those were the days when you had to work close to the guts of the machine: you thought in terms of central processing units, circuit diagrams, binary logic. If you wanted to play games, you had to write them yourself – by the 1980s, on a BBC Micro or Sinclair ZX Spectrum with less graphical sophistication than an ATM.

I was reminded of those clunky, makeshift early days of public access to computers when, in September, I saw one of IBM’s quantum computers at the company’s research labs in Rüschlikon, a suburb of Zurich. On a hill overlooking Lake Zurich, in the early autumn sunshine, the labs have a laid-back air that is more Californian than Swiss. In the past several decades, they have been the incubator of Nobel Prize-winning scientific innovations. Things grow here that affect the world.

This computer has the improvised appearance of a work in progress. It’s a sturdy metal cylinder the size and shape of a domestic water-heater immersion tank, suspended on a frame of aluminium beams reaching to the ceiling and brought to life by a dense tangle of wires that lead to a bank of off-the-shelf microwave oscillators. The “brain” – the component in which binary ones and zeros of data are crunched from input to output – sits deep inside this leviathan, on a microchip the size of a baby’s fingernail.

The last time I visited IBM’s Zurich centre, in 2012, its head of science and technology, Walter Riess, talked about the company’s plans for an imminent “post-silicon” era, after the silicon-chip technology of today’s computers had reached the physical limits of its ability to offer more computing power. Back then, quantum computing seemed like a far-off and speculative option for meeting that challenge.

Now it’s real. This is what computing felt like in the 1950s, Riess told me in September as he introduced me to the new device. It has become routine to juxtapose images of these room-filling quantum machines with the first prototype digital computers, such as the valve-driven ENIAC (or “Electronic Numerical Integrator and Computer”) at the University of Pennsylvania, used for ballistics calculations by the US military. If this is where quantum computing is now, such pictures imply, just try to imagine what’s coming.

Quantum computing certainly sounds like the future. It’s the technology of choice for sci-fi film-makers who want their artificial intelligence networks to have unlimited potential. But what is it really about, and what might it do?


This “quantum information technology” is often presented as more of the same, but better. We have become so accustomed to advances in computing being reflected in slimmer, faster laptops and bigger memories that quantum computing is often envisaged in the same terms. It shouldn’t be.

It represents the first major shift in how computing is done since electronic computing devices were invented in the vacuum-tube-powered, steam-punk 1940s. Digital computers manipulate information encoded in binary form as sequences of ones and zeros; for example, as pulses of electrical current. The circuits contain “logic gates”, which produce binary outputs that depend in well-defined ways on the inputs: a NOT gate, say, simply inverts the input, converting a one to a zero and vice versa.
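
For readers who like to see the idea spelled out, a NOT gate and an AND gate can be mimicked in a few lines of Python. This is purely illustrative – real gates are etched into silicon, not written in a scripting language:

```python
# Classical logic gates acting on bits (0 or 1) - illustrative only.
def NOT(a):
    return 1 - a        # inverts the input: 1 -> 0, 0 -> 1

def AND(a, b):
    return a & b        # outputs 1 only if both inputs are 1

assert NOT(1) == 0 and NOT(0) == 1
assert AND(1, 1) == 1 and AND(1, 0) == 0
```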

All the rest is software, whether that involves converting keystrokes or mouse movements into images, or taking numbers and feeding them into an equation to work out the answer. It doesn’t much matter what the hardware is – transistors replaced vacuum tubes in the 1950s and have since been shrunk to far smaller than the size of bacteria – so long as it can perform these transformations on binary data.

Quantum computers are no different, except in one crucial respect. In a conventional (“classical”) computer, one bit of binary data can have one of just two values: one or zero. Think of it as transistors acting like the light switches in your house: they are set to either on or off. But in a quantum computer, these switches, called quantum bits or qubits (pronounced “cue-bits”), have more options, because they are governed by the laws of quantum theory.

This notoriously recondite theory started to take shape at the beginning of the 20th century through the work of Max Planck and Albert Einstein. In the 1920s, scientists such as Werner Heisenberg and Erwin Schrödinger devised mathematical tools to describe the kinds of phenomena that Einstein and Planck had revealed. Quantum mechanics seemed to say that, at the minuscule level of atoms, the world behaves very differently from the classical mechanics that had been used for centuries to describe how objects exist and move. Atoms and subatomic particles, previously envisaged as tiny grains of matter, seemed sometimes to show behaviour associated instead with waves – which are not concentrated at a single point in space but are spread throughout it (think of sound waves, for example).

What’s more, the properties of such quantum objects did not seem to be confined to single, fixed values, in the way that a tossed coin has to be either heads or tails or a glove has to be left- or right-handed. It was as if they could be a mixture of both states at once, called a superposition.

That “as if” is crucial. No one can say what the “true nature” of quantum objects is – for the simple yet perplexing reason that quantum theory doesn’t tell us. It just provides a means of predicting what a measurement we make will reveal. A quantum coin can be placed in a superposition of heads and tails: once it is tossed, either state remains a possible outcome when we look. It’s not that we don’t know which it is until we look; rather, the outcome isn’t fixed, even with the quantum coin lying flat on the table covered by our palm, until we look. Is the coin “both heads and tails” before we look? You could say it that way, but quantum mechanics declines to say anything about it.

Thanks to superposition, qubits can, in effect, encode one and zero at the same time. As a result, quantum computers can represent many more possible states of binary ones and zeros. How many more? A classical bit can represent two states: zero and one. Add a bit (an extra transistor, say) to your computer’s processor and you can encode one more piece of binary information. Yet if a group of qubits are placed in a joint superposition, called an entangled state, each additional qubit doubles the encoding capacity. By the time you get to 300 qubits – as opposed to the billions of classical bits in the dense ranks of transistors in your laptop’s microprocessors – you have 2^300 options. That’s more than the number of atoms in the known universe.
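
The arithmetic behind that claim is easy to check. Here is a back-of-the-envelope comparison in Python, taking the conventional order-of-magnitude estimate of roughly 10^80 atoms in the observable universe:

```python
# Rough arithmetic behind the "300 qubits" claim - illustrative only.
states_300_qubits = 2 ** 300     # distinct basis states of 300 entangled qubits
atoms_in_universe = 10 ** 80     # standard order-of-magnitude estimate

print(f"{states_300_qubits:.2e}")                # about 2.04e+90
print(states_300_qubits > atoms_in_universe)     # True
```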

You can only access this huge range of options, though, if all the qubits are mutually dependent: in a collective or “coherent” state, which, crudely speaking, means that if we do something to one of them (say, flip a one to a zero), all the others “feel” it. Generally, this requires all the qubits to be placed and maintained in an entangled state.

ENIAC, one of the world’s first digital computers, at the University of Pennsylvania

The difficulty of making a quantum computer mostly involves making and sustaining these coherent states of many qubits. Quantum effects such as superposition and entanglement are delicate and easily disrupted. The jangling atomic motions caused by heat can wash them away. So, to be coherently entangled, qubits must be cooled to extremely low temperatures – we’re typically talking less than a degree above absolute zero (-273° C) – and kept well isolated from the laboratory environment: that is, from the very equipment used to manipulate and measure them. That’s partly why the IBM quantum computer I saw is so bulky: much of it consists of cooling equipment and insulation from the lab environment.

Because of the fragility of entanglement, it has so far been possible only to create quantum computers with a handful of qubits. With more than a few dozen, keeping them all entangled stretches current quantum technologies to the limit. Even then, the qubits remain in a coherent state for just a fraction of a second. You have only that long to carry out your entire quantum computation, because once the coherence starts to decay, errors infect and derail the calculation.

There’s a consensus that silicon transistors are the best bits for conventional computers, but there’s no such agreement about what to make qubits from. The prototype devices that exist so far outside academic laboratories – at Google, IBM and the Canadian company D-Wave, for example – use miniaturised electronic circuits based on superconducting devices. Here, the quantum behaviour stems from an exotic effect called superconductivity, in which metals at very low temperatures conduct electricity without any resistance. The advantage of such qubits is that the well-developed microfabrication technologies used for making silicon circuitry can be adapted to make them on silicon chips, and the qubits can be designed to order.

But other researchers are placing their money on encoding data into the quantum energy states of individual atoms or ions, trapped in an orderly array (such as a simple row) using electric or magnetic fields. One advantage of atomic qubits is that they are all identical. Information can be written into, read out from and manipulated within them using laser beams and microwaves. One leading team, headed by Chris Monroe of the University of Maryland, has created a start-up company called IonQ to get the technology ready for the market.


Quantum computers have largely been advertised on the promise that they will be vastly faster at crunching through calculations than even the most powerful of today’s supercomputers. This speed-up – immensely attractive to scientists and analysts solving complex equations or handling massive data sets – was made explicit in 1994 when the American mathematician Peter Shor showed in theory that a computer juggling coherent qubits would be able to factor large numbers much more efficiently than classical computers. Reducing numbers to their simplest factors – decomposing 12 to “two times two times three”, for example – is an exercise in elementary arithmetic, yet it becomes extremely hard for large numbers because there’s no shortcut to trying out all the possible factors in turn. Factorising a 300-digit number would take current supercomputers hundreds of thousands of years, working flat out.
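
That “no shortcut” is the crux. A naive classical factoriser simply tries candidate divisors one after another, as in this Python sketch; it handles 12 instantly, but its running time explodes as the number of digits grows, which is why 300-digit numbers are out of reach:

```python
# Naive factorisation by trial division - fine for small numbers,
# hopeless for the 300-digit numbers used in cryptography.
def factorise(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factorise(12))    # [2, 2, 3]
```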

For this reason, a lot of data encryption – such as when your credit card details are sent to verify an online purchase – uses codes based on factors of large numbers, which no known computer can crack. Yet Shor showed that a quantum factorisation algorithm could find factors much more efficiently than a classical one can.

How it does so is hard to say. The usual explanation is that, because of all those entangled qubits, a quantum computer can carry out many calculations in parallel that a classical computer would do only one after the other. The Oxford physicist David Deutsch, who pioneered the theory of quantum computing in the 1980s, argues that the quantum computer must be regarded as lots of classical computers working at the same time in parallel universes – a picture derived from Deutsch’s belief in the controversial “many-worlds” interpretation of quantum mechanics, which holds that every possible measurement outcome on a superposition is realised in separate universes. Deutsch’s view isn’t the common one, but even the milder notion of parallel calculations doesn’t satisfy many researchers. It’s closer to the truth to say that entanglement lets qubits share information efficiently, so that quantum logic operations somehow “count for more” than classical ones.

Theorists tend then to speak of quantum computation as drawing on some “resource” that is not available to classical machines. The exact nature of that resource is a little vague and probably somewhat different for different types of quantum computation. As Daniel Gottesman of the Perimeter Institute in Waterloo, Canada, told me, “If you have ‘enough’ quantum mechanics available, in some sense, then you have a speed-up, and if not, you don’t.”

Regardless of exactly how quantum speed-up works, it is generally agreed to be real – and worth seeking. All the same (although rarely acknowledged outside the field), there’s no reason to believe that it’s available for all computations. Aside from the difficulty of building these devices, one of the biggest challenges for the field is to figure out what to do with them: to understand how and when a quantum computation will better a classical one. As well as factorisation, quantum computation should be able to speed up database searches – and there’s no question how useful that would be, for example in combing through the masses of data generated in biomedical research on genomes. But there are currently precious few other concrete applications.

One of the big problems is dealing with errors. Given the difficulty of keeping qubits coherent and stable, these seem inevitable: qubits are sure to flip accidentally now and again, such as a one changing to a zero or getting randomised. Dealing with errors in classical computers is straightforward: you just keep several copies of the same data, so that faulty bits show up as the odd one out. But this approach won’t work for quantum computing, because it’s a fundamental and deep property of quantum mechanics that making copies of unknown quantum states (such as the states of qubits over the course of a computation) is impossible. Developing methods for handling quantum errors has kept an army of researchers busy over the past two decades. It can be done, but a single error-resistant qubit will need to be made from many individual physical qubits, placing even more demands on the engineering.
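
The classical remedy mentioned above – keep several copies and let the majority out-vote a faulty one – is a simple repetition code, sketched below in Python. Quantum error correction has to achieve the same protection without ever copying the unknown qubit states, which is what makes it so much harder:

```python
import random

# A toy classical repetition code: store three copies of each bit and
# take a majority vote, so a single flipped copy shows up as the odd one out.
def encode(bit):
    return [bit, bit, bit]

def add_noise(copies, flip_probability=0.1):
    return [b ^ (random.random() < flip_probability) for b in copies]

def decode(copies):
    return 1 if sum(copies) >= 2 else 0

received = add_noise(encode(1))
print(received, "->", decode(received))
```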

And while it is tempting to imagine that more powerful quantum computers mean more qubits, the equation is more complex. What matters, too, is how many logical steps in a computation you can carry out on a system of qubits while they remain coherent and relatively error-free – what quantum scientists call the depth of the calculation. If each step is too slow, having more qubits won’t help you. “It’s not the number of qubits that matters, but their quality,” John Martinis, who heads Google’s quantum-computing initiative in their Santa Barbara research centre, told me.


One of the likely first big applications of quantum computing isn’t going to set the world of personal computing alight, but it could transform an important area of basic science. Computers operating with quantum rules were first proposed in 1982 by the American physicist Richard Feynman. He wasn’t concerned with speeding up computers, but with improving scientists’ ability to predict how atoms, molecules and materials behave using computer simulations. Atoms observe quantum rules, but classical computers can only approximate these in cumbersome ways: predicting the properties of a large drug molecule accurately, for example, requires a state-of-the-art supercomputer.

Quantum computers could hugely reduce the time and cost of these calculations. In September, researchers at IBM used the company’s prototype quantum computer to simulate a small molecule called beryllium dihydride. A classical computer could, it’s true, do that job without much trouble – but the quantum computer doing it had just six qubits. With 50 or so qubits, these devices would already be able to do things beyond the means of classical computers.

That feat – performing calculations impossible by classical means – is a holy grail of quantum computing: a demonstration of “quantum supremacy”. If you asked experts a few years ago when they expected this to happen, they’d have been likely to say in one or two decades. Earlier this year, some experts I polled had revised their forecast to within two to five years. But Martinis’s team at Google recently announced that they hope to achieve quantum supremacy by the end of this year.

That’s the kind of pace the research has suddenly gathered. Not so long ago, the question “When will I have a quantum computer?” had to be answered with: “Don’t hold your breath.” But you have one now. And I do mean you. Anyone worldwide can register online to use IBM’s five-qubit quantum computer called IBM Q Experience, housed at the company’s research lab in Yorktown Heights, New York, and accessible via a cloud-based system.

During my Zurich visit, two young researchers, Daniel Egger and Marc Ganzhorn, walked me through the software. You configure a circuit from just five qubits – as easily as arranging notes on a musical stave – so that it embodies the algorithm (the sequence of logical steps) that carries out your quantum computation. Then you place your job in a queue, and in due course the answers arrive by email.
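
The same kind of job can now also be written directly in Qiskit, IBM’s open-source Python toolkit for its quantum machines. A minimal sketch – a two-qubit circuit that entangles the pair into a Bell state – might look like this (submitting it to real hardware still means joining a queue, much as described above):

```python
from qiskit import QuantumCircuit

# A two-qubit circuit: put one qubit into superposition, then entangle
# the second with it, producing a Bell state before measurement.
circuit = QuantumCircuit(2, 2)
circuit.h(0)            # Hadamard gate: superposition of 0 and 1
circuit.cx(0, 1)        # controlled-NOT: entangles qubit 1 with qubit 0
circuit.measure([0, 1], [0, 1])

print(circuit.draw())   # text diagram of the circuit
```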

Of course, you need to understand something about quantum computing to use this resource, just as you needed to know about classical computing to program a BBC Micro to play Pong. But kids can learn this, just as we did back then. In effect, a child can now conduct quantum experiments that several decades ago were the preserve of the most hi-tech labs in the world, and that were for Schrödinger’s generation nothing but hypothetical thought experiments. So far, more than 60,000 users have registered to conduct half a million quantum-computing experiments. Some are researchers, some are students or interested tinkerers. IBM Q has been used in classrooms – why just teach quantum theory when you can learn by quantum experiments? The British composer Alexis Kirke has used the device to create “quantum music”, including “infinite chords” of superposed notes.

IBM has just announced that it is making a 20-qubit device available to its corporate clients, too – a leap perhaps comparable to going from a BBC Micro to an Intel-powered laptop. Opening up these resources to the world is an act of enlightened self-interest. If the technology is going to flourish, says the company’s communications manager, Chris Sciacca, it needs to nurture an ecosystem: a community of people familiar with the concepts and methods, who between them will develop the algorithms, languages and ultimately plug-in apps other users will depend on.

It’s a view shared across the nascent quantum computing industry. From Zurich, I travelled to Heidelberg to chair a panel on the topic that included Martinis. It was only with the advent of personal computing in the 1970s, he told the audience, that information technology began to take off, as a generation of computer geeks started to create and share tools such as the Linux operating system – and eventually the Google search engine, Facebook and all the rest.

There’s no shortage of people eager to get involved. Sciacca says that IBM’s researchers can barely get their work done, so inundated are they with requests from businesses. Every company director wants to know, “When can I get one?” While online access is all very well, for reasons of security and prestige you can be sure that companies will want their own quantum machine, just as they have their own web and email servers. Right now, these are hardly off-the-shelf devices and the cloud-based model seems the logical way forward. But no one is likely to forget the apocryphal comment attributed to IBM’s long-time chief Thomas J Watson in the 1940s that about five computers should be enough for the entire world.


The commercial potential is immense, and already the giants such as Google and IBM have stiff competition from young upstarts including Rigetti Computing in California, founded in 2013 by the former IBM scientist Chad Rigetti. The company has developed the world’s first commercial fabrication facility for quantum circuitry, and it’s in a bullish mood.

“We expect commercially valuable uses of quantum computing within the next five years,” Madhav Thattai, Rigetti’s chief strategy officer, told me, adding: “In ten to 15 years, every major organisation will use this technology.” What the chances are of us all having portable personal devices that rely on liquid-helium cooling is another matter.

This commercialisation will be accelerated by piggybacking on the infrastructure of conventional computing, such as access to cloud- and web-based resources. It’s likely that quantum computers won’t be needed – and perhaps won’t even be best suited – to do every aspect of a computational problem, but will work in partnership with classical computers, brought in to do only what they do well.

Martinis and his colleagues are testing a 22-qubit quantum computer at Google and have a 49-qubit model in development. IBM has recently announced that it has tested a 50-qubit device, too. Achieving even these numbers is a daunting task – but Martinis has already turned the Google lab into a world leader in astonishingly short order. Given the precipitous recent progress after decades of talk and theory, the challenge is to balance the extraordinary, transformative potential of the field against the risk of hype. It’s as if we’re in the 1950s, trying to predict the future of computing: no one had any idea where it would lead.

In Heidelberg, I asked Martinis how he feels about it all. Is he pinching himself and thinking, “My God, we have real quantum computers now, and we’re already simulating molecules on them”? Or is he fretting about how on Earth he and his peers will surmount the technical challenges of getting beyond today’s “toy” devices to make the promised many-qubit machines that will change the world?

I should have seen the answer coming. A flicker of a smile played across his face as he replied, “I’m in a superposition of both.” 

This article first appeared in the 07 December 2017 issue of the New Statesman, Christmas special

Einstein’s monsters: what the Cold War films of the 1980s can teach us

Amid the paranoia of the eighties, film-makers attempted to convey the terrifying reality of a nuclear attack. Now, in this new age of anxiety, we are returning to their prophetic visions.

On 1 December 2017, Hawaii’s nuclear war siren network was tested for the first time since the Cold War. Then, on 13 January, a message was sent to that state’s mobile phone networks warning of an incoming ballistic attack (38 long minutes later, this was corrected). On 25 January, the Doomsday Clock was put forward to two minutes to midnight by the Bulletin of the Atomic Scientists, and on 2 February, the US Government published its Nuclear Posture Review, proposing a new arsenal of tactical weapons.

In the space of a few months, the West was transported back to a time that until recently seemed impossibly distant – a time when a new American president was expanding his military ambitions, and a British prime minister was doing anything in her power to galvanise that special relationship.

To grow up in the early 1980s was to grow up with a cloud, one that lifted suddenly into a toroidal fireball usually seen in stock footage or shuddery animation. It was also to grow up with a sound that had been familiar in Britain 40 years earlier: a low wail, rising and descending, like a wounded wolf’s howl. Another eerie sound lingers in the mind from this time: the calm, clipped vowels of a male announcer, advising how to build shelters, avoid fallout, and wrap up your dead loved ones in polythene, bury them, and tag their bodies.

These elements came together in Richard Taylor Cartoon Films’ Protect and Survive series, a collection of public information films made for the government’s Central Office of Information in 1975. They first leaked in 1980, inspiring two groundbreaking British films: a two-hour BBC docudrama that has only been shown three times by the broadcaster, Threads (1984), and a 90-minute animated film about an elderly couple following government advice before, during and after the bomb, called When The Wind Blows (1986).

Threads begins with a close-up of a spider weaving its web, and a voiceover telling us that “everything connects”. We cut to a young couple, middle-class Ruth and working-class Jimmy, heavy-petting in a car in the Peak District; she gets pregnant, and their families nervously meet. The warm hum of TV and radio news forms a comforting haze in the background, until its contents pulse through.

A schoolgirl slowly downs her milk and looks at her wireless. A pub landlord changes a TV channel but his punters want to hear more about Iran. A teenager runs into a shop to tell Mam to come home: the Russians and Americans have started fighting. Forty-six excruciatingly tense minutes into Mick Jackson and Barry Hines’s film, it comes: sirens, upturned buggies, urine down trouser legs, a soft swell of volatile gases above Sheffield. Blasts. Flames. Winds. Silence.

In January, a mass-watching of Threads, hashtagged #ThreadDread on Twitter, was led by Julie McDowall, a journalist and nuclear threat expert campaigning for the BBC to show it for the first time since 2003. The US secretary of state George Shultz saw the film when it aired on CNN in 1985, and it is alleged that it affected the Reagan government’s attitude to nuclear war. Jimmy Murakami’s adaptation of Raymond Briggs’ graphic novel When the Wind Blows was brought up by Lord Jenkins of Putney in the House of Lords: he asked Baroness Hooper for an assurance that it would not be banned from being shown in schools. The work of the visual imagination can be powerful; brutal enough to make a difference.

 The 1984 BBC film Threads was unflinching in its depiction of the horror caused by nuclear fallout after a bomb falls in Sheffield. Credit: AF archive/ Alamy

The Protect and Survive films that had a huge impact on popular culture were only shown twice on British TV: first on 10 March 1980, on the Panorama episode, “If The Bomb Drops” – and once again on a shop’s TV screens in the first section of Threads (the films were declassified in 2005, and are now available on DVD). “They have never been seen before and won’t be seen again until nuclear war is imminent,” explained Panorama’s fresh-faced 29-year-old presenter, Jeremy Paxman. “Their advice is intended to be reassuring.”

Reassurance was the reason that the veteran voiceover artist Patrick Allen was chosen to be their narrator; he was best known at the time for a Barratt Homes TV advert, where he is filmed grinning from a helicopter. (In 1984, he recorded less reassuring lines for a 12-inch mix of Frankie Goes to Hollywood’s No 1 hit “Two Tribes” in a pointed Protect and Survive style: “I am the last voice you will ever hear,” Allen says. “Do not be alarmed.”)

The BBC Radiophonic Workshop’s Roger Limb wrote the series’ electronic theme, which involved two melodies at high and low pitches, coming together – like people, he says. He handed over his tape to the films’ producer, Bruce Parsons, in an alley, such was the secrecy required. It is the films’ visual language, however, that remains their most haunting element. They feature a white, cardboard house against a wall of sky-blue, with two faceless parents holding their children’s hands for a logo. The animator Roger McIntosh, then 27, designed this and the films’ mushroom cloud, and remembers signing the Official Secrets Act. “Having a simple style was essential, so the films couldn’t be seen to be entertainment,” he says. “They had to be understood by all audiences, at all levels of education.”

There was a terrifying flipside to that innocent, familiar world. “Their instructions seemed absolutely pointless, to be honest with you,” McIntosh adds. “But, in the face of Armageddon… well, it was a job.”

The editor of Panorama in 1980, Roger Bolton, was shocked when he first saw the films. Now the presenter of Radio 4’s listener programme, Feedback, he remembers visiting the US in late 1979, and realising the impact expanding international defence programmes would have on the UK, which disbanded its civil defence corps in 1968. Panorama’s producer, David Darlow, convinced a local government commissioner to leak the Protect and Survive films to him; Bolton knew broadcasting them was a gamble. “But these films’ instructions were ludicrous. I knew the military would think them ludicrous. So I didn’t ask permission – I just put them out.”

 After broadcast, remarkably, there were few repercussions, although Darlow claims his name was blackened in intelligence circles. The Protect and Survive booklets, which the documentary claimed would take four weeks to produce in the immediate wake of a nuclear threat, were also printed up later that year, and sold, to those who could afford them, for 50p.

But attitudes towards the government were changing, Bolton says. “We have to remember this was only 35 years after the Second World War. People in government were older then, and still believed in the power of authority in wartime. But we were children of the Sixties. We knew we had to question everything.” The economic and political volatility of Britain in the 1970s contributed to this mood, and Bolton’s young team rode with the spirit of the times.

“We were very young, and doubtless very arrogant, back then. But with the BBC’s resources, as they were then, at our disposal, if the basic question, ‘Should we do this?’ came up…” He laughs. “Well, we did this.”

 Jim consults his Protect and Survive pamphlet in When the Wind Blows (1986). Credit: AF archive/ Alamy

Across the Atlantic, in his Los Angeles sunroom, Mick Jackson is remembering his days as a BBC documentary maker too. He reads the handwritten letter framed on its wall, dated 24 September 1984, from the then leader of the opposition, Neil Kinnock:

Dear Michael Jackson and Barry Hines,

I’d like to thank you and everyone involved in the making of Threads for your important and impressive work. The story must be told time and time again until the idea of using nuclear weapons is pushed into past history. Don’t, by the way, be troubled by the possibility that some people might be inured to the real thing by seeing horrifying films. The dangers of complacency are much greater than any risks of knowledge.

Neil Kinnock

“Great rhythmic phrase at the end,” Jackson says, proudly. “Very Kinnock-like.”

Now a Hollywood director – the Whitney Houston/Kevin Costner blockbuster The Bodyguard and Denial, the drama of the David Irving libel case, are on his CV – Jackson began his career making science programmes. An electronic engineering graduate who “changed his mind and then went to film school”, he joined the BBC in 1965, soon after it had decided not to broadcast Peter Watkins’s The War Game, the first film to depict brutally the effects of a nuclear bomb (it was shown in cinemas instead and won the 1966 Oscar for Best Documentary).

“There was a real sense of shame pervading the BBC about that decision,” says Jackson. It had wanted to share the responsibility for broadcasting the film with the Home Office, he explains; the Cabinet Secretary at the time, Burke Trend, said the government “would be relieved” if the BBC didn’t transmit. “That was a clever move. The War Game obviously had a political agenda. And that’s also a problem, obviously, for the BBC.”

After the Panorama special, however, the BBC had renewed confidence, and protest movements against nuclear programmes were also developing at pace (the first women’s peace camp at Greenham Common was established in late 1981, after the decision to site US cruise missiles at the base). Now working on a new BBC science series, QED, Jackson proposed a “scrupulously factual, unbiased” episode, “A Guide to Armageddon”, which coolly described the effects of a one-megaton blast.

Throughout it, images of ordinary life are juxtaposed with horror-movie detail: Jackson used a photo of his local butcher’s in Holland Park, then a close-up of animal fats burning from a pig’s leg, to show the effects of nuclear blast on human flesh. Couples are also seen building or buying shelters of various kinds: Joy and Eric build one under the stairs that will save them for 17 seconds. “I’d wanted to call it ‘A Consumer’s Guide to Armageddon’,” Jackson laughs. “For some reason, the BBC thought that unduly provocative. ‘But I am a scientist,’ I said. ‘Everything will be citable, provable.’” Jackson’s documentary was broadcast on 26 July 1982 and Threads went into pre-production the following year.

Filmed in 17 days in early 1984 on a budget of £250,000, Threads featured a cast of extras consisting mainly of CND supporters, loaned by Sheffield City Council (the area had recently declared itself a nuclear-free zone). Its script was by Barry Hines, best known for the uncompromising 1969 film Kes: he knew how to write Yorkshire because that’s where he was from. He battled ferociously with Jackson about Paul Vaughan’s intermittent, newsy voiceover, feeling that it smothered his drama, but Jackson knew a sui generis form for the film was essential to make it stand out.

This attitude hardened in November 1983 after Jackson saw the American post-apocalyptic TV movie, The Day After. Watched by 100 million people in the US, it told a similarly slow-burning series of real-life stories to Threads, but well-known actors such as Jason Robards and Steve Guttenberg prettied it up, and its setting was sanitised. “I mean, the hospital scene in it – the electricity was working!” Jackson rants. In Threads, amputations are delivered without anaesthetic; people bite on rags. Jackson says: “The idea of nuclear war informing a new species of made-for-TV disaster movies was the worst thing that could happen, to my mind. I wanted to show the full horror. I felt that was absolutely my responsibility.”

There were other motivations behind this attitude, he says. A day after Threads was broadcast, as part of a night that also featured a political debate, Jackson went on BBC One’s Pebble Mill with a beeper on his belt – his wife was due to have their first child. Her being pregnant throughout the filming of Threads puts three of its scenes in a particularly tough light: Ruth sees a woman rocking her dead baby, her eyes numb and wide; she herself gives birth in a rural barn, alone, biting through her daughter’s umbilical cord with her teeth; and her own daughter, Jane, gives birth ten years later. In the final scene, Jane is handed her baby, but we don’t see the child. Jane looks at it and she screams. “For Threads to work, I had to try to let images and emotion happen in people’s minds,” Jackson says. “Or rather in the extensions of their imaginations.”


Sheffield City Centre, January 2018. Around the corner from The Moor, the square in which we see the upturned buggies after the bomb, 75-year-old Rita May sits in BBC Sheffield’s reception. “When the bomb goes off, the camera’s on me!” she says, half-surprised – she watched Threads the day before for the first time in decades, seeing herself in a front room in her early forties, next to a window unprotected from the blast. “It’s dated a bit, I thought. But oh, that make-up. Bran flakes and gelatine. Horrible, it was.”

She played Mrs Kemp, the mother of Jimmy, a woman oblivious to the encroaching horror. Her character screams for the first time when she realises her youngest son, Michael, isn’t with her – then her skin is horrendously burned. She goes into the fallout minutes later with her husband, against all advice, and finds Michael’s blackened foot in the rubble.

May keeps her maroon anorak on while she talks, her manner all no-nonsense northern. After the bomb drops the film continues for an hour and seven minutes, covering another ten years. Backstage was a gala of cheap, terrifying special effects, she remembers. Racks of clothes were blowtorched daily on-set by the wardrobe team. Karen Meagher, who played Ruth Beckett, wore her cataract contact lenses while doing her supermarket shopping, in order to get used to them. And the umbilical cord Ruth chewed through? “Made of liquorice!” This cheapness is often apparent in the film, but other moments ensure it doesn’t matter: Mrs Kemp’s husband trying to find food while holding on to Michael’s favourite toy, a broken electronic game; Ruth carrying Jimmy’s old book of birds. Old threads being clung to, before they finally yield.

The subtle familiarity of the faces in Threads is a large part of its power today. May has played minor characters in Coronation Street, larger roles in BBC and Sky One sitcoms, and after Threads was in the ITV kids’ series Children’s Ward for years. This may explain why Threads had a disturbing effect on the generation who were aware of the nuclear threat as children, but only saw the films a little later. Recognisable faces made it more chilling.

May remembers a screening for the whole cast and extras just before the BBC broadcast. It was a Sunday, in Sheffield’s Fiesta Nightclub, the tables set in a cabaret style. “After it finished, no one could speak.” (Jackson recalls this event too: “These people had known what they were doing in the film, taken part in the crowd scenes, but the effect the whole thing had on them was extraordinary – all these people weeping.”)

May herself had a recurring dream afterwards, she says, in which she was standing by a window, just like Mrs Kemp had been. “My boys were young in it, playing outside, and then I saw a mushroom cloud behind them. Funny that, isn’t it?” It also made May think about her mother, who’d seen a doodlebug suddenly, one day in Sheffield, during the Second World War. “Apparently, it destroyed the house next door,” she says. May tugs her gold locket. “We forget what that fear feels like easily, don’t we?”


There is, however, an appetite to remember. On a late winter’s afternoon in London, the BFI Southbank’s NFT3 cinema is full of people ready to experience When the Wind Blows on a big screen. It begins gently: Jim Bloggs (John Mills) bumbling about the house, a Protect and Survive booklet in his hand acquired from his local library. He gazes out of his window in the countryside, seemingly so far away from danger. After the bomb drops, his wife, Hilda (Peggy Ashcroft), worries about trivial things: the filth on her cushions, her blackened, slashed curtains – then later, as reality hits her, the weals on her legs. At the end of the film Jim prays, his mind unravelling with sickness, as the couple tuck themselves up in the bags that become their forgotten coffins.

The film’s executive producer, Iain Harvey, talked to the BFI audience. He explained that it took three years to raise funds to make When the Wind Blows, despite it being developed after the success of another Raymond Briggs adaptation, The Snowman. Nuclear weapons policy had hardened, if anything, in Britain in the mid-1980s: as late as April 1986 Thatcher was writing her first open letter on the topic to her local paper, the Finchley Times. “Nuclear weapons have kept the peace for over 40 years,” she wrote. “Of course, in an ideal world there would be no weapons of mass destruction. But they exist, and they cannot be disinvented.” Fifteen days later, on 26 April, the No 4 reactor at Chernobyl Nuclear Power Plant exploded, sending clouds of radioactive caesium-137 slowly drifting westwards.

When the Wind Blows felt particularly vital at its world premiere just six months after Chernobyl. The film is dedicated to the children born to the relatively young cast and crew during its production: Harvey’s daughter, now 32, is in the audience today. Two women raise their hands, admitting that When the Wind Blows haunted them after they saw it as children. “We weren’t out to terrify you,” Harvey assures them. He tells me later how angry he would get when the film was criticised as being too party political. “After all,” he says, “what is party political about trying to ensure the world isn’t destroyed by nuclear war?”

A week later, Raymond Briggs calls me: now 84, he rarely ventures from his rural Sussex home. He also couldn’t stop watching When the Wind Blows the other day – but for different reasons. “That box separate to the telly – I couldn’t bloody switch it off.” He’s grumpy this morning and half-apologises; he’s softer recalling an old memory that inspired his anti-war stance.

“I remember standing at my window in Wimbledon Common, thinking of those ships on their way to Cuba. ‘All this out here,’ I remember thinking, ‘could be gone.’” He was 28 in 1962. “And now all this North Korea business. One bloke speaking off the cuff and the next day…” He tails off. “Thank God I’m 84, that’s all I can say.”

When the Wind Blows acknowledges how easy it is to become romantic about war. Briggs used his childhood experiences in the Second World War to address this nostalgia in the film, inserting his own Morrison shelter, covered with pin-ups, for Jim Bloggs’s, and taking inspiration from his own brief evacuation to a rural idyll far away from the bombs.

But as Threads and When the Wind Blows made clear, there is no rural idyll away from the bombs. And while modern dramas and documentaries have not confronted this reality, these older, bolder films still have a power to draw people together – on social media, in government, or even in smaller, more familiar ways. Mick Jackson’s father spent time in the Royal Army Medical Corps during the war. After he saw Threads, he started talking about what he’d seen for the first time. “That was absolutely what our work was about,” says Jackson, 34 years later. “To never forget, but to try, with the power we had, to change things.” 

“Threads” is released on DVD through Simply Media on 9 April; “When the Wind Blows” is out now on DVD, through the BFI

