How I didn't quite have the nerve to kill the millennium bug


Now it can be told: the tale of how I missed the biggest story of the century. I was editing the Independent on Sunday in 1995 when Charles Arthur, the assiduous technology correspondent for that paper and its daily sister, revealed the threat of the millennium bug: that, because computers were programmed to recognise years only by the last two digits ('68, '92, etc), millions of them would go haywire on 1 January 2000, thinking either that it was 1900 again or that history had simply come to an end (the latter being a perfectly reasonable conclusion for any computer that had spent much time listening to American local radio stations).
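The failure mode Arthur described can be sketched in a few lines. This is a purely illustrative modern example (the affected systems were mostly decades-old COBOL, not Python): storing years as two digits and naively expanding them as 19xx makes any calculation spanning the year 2000 run backwards.

```python
# Illustrative sketch of the millennium bug: years stored as two digits
# ('68, '92, etc) and expanded by assuming every year is 19xx, as many
# legacy systems did.

def expand_year(two_digit_year: int) -> int:
    """Naive legacy expansion: treat every two-digit year as 19xx."""
    return 1900 + two_digit_year

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Elapsed years computed from two-digit fields."""
    return expand_year(end_yy) - expand_year(start_yy)

# A record dated 1968 compared with 1 January 2000, stored as '00':
print(years_elapsed(68, 0))  # -68: to the program, it is 1900 again
```

Hence the two absurd outcomes in the paragraph above: the program either decides it is 1900, or chokes on an interval of negative length.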

When the copy arrived on my desk, I dismissed it as another of those preposterous scare stories that have given Sunday newspapers a bad reputation. One day we were being told that computers would soon be smart enough to rule the world, the next that they were too stupid to understand a simple date change. I suggested that Arthur find something more plausible, but he persisted. In the end, I ran the story, but presented it as a "just fancy that" kind of tale at the bottom of an inside news page where nobody noticed what could have been hailed as a major scoop. A few weeks later, the Daily Telegraph picked up the story and ran it across the top of its front page. The rest is . . . well, history of a sort. The story, as we say in the trade, had legs: over the next four years, we learnt that nuclear power stations would explode, aeroplanes fall from the sky, bank deposits disappear, food supplies collapse, stock markets crash, lifts leave their passengers stranded in mid-air. Journalistically, Charles Arthur was proved right and I wrong; no editor should take credit for underplaying a story that became a global talking-point for more than four years and which led governments and private companies to spend £430 billion.

Yet I never wavered in my lonely belief (I was aware only of a software professor at University College London who appeared to agree with me) that the millennium bug would prove to be the non-event of our lifetimes. The closer we got to 1 January 2000, the more convinced I became, remembering my late father's injunction never to trust experts. For them, the bug always struck me as a win-win situation: if the promised meltdown actually occurred, they could claim they hadn't been called early enough or paid enough money; if it didn't, they could claim they prevented it. In that sense, the bug story failed as science, because it failed the Popperian test of falsifiability. It also seemed to fail - like the Marxist theory of history or most modern dietary plans - on the largeness of its claims. Trouble with pension payments or bank accounts I could just about credit. But why should an aeroplane or a nuclear missile or an office lift need to know the date? As we approached the dreaded Y2K, more questions piled up. I had charge cards with expiry dates in 2000 and 2001; if computers thought these meant 1900 and 1901, why were my card purchases still accepted? Then there was 9.9.99. For some reason - computers seem expected to behave like superstitious yokels in a Thomas Hardy novel - this date too was deemed likely to cause mayhem. Nothing happened.

Though some sages persist in predicting chaos - wait for 29 February, they say desperately and implausibly - I now feel confident enough to draw some media lessons.

First, almost nobody over 30 understands computers. They can use spreadsheets, surf the net, play games and the rest, but they have no grasp at all of how they actually work; indeed, they are surprised that they work at all. Since all newspaper editors (like ministers, top civil servants and captains of industry) are over 30, they will believe anything they are told about these strange machines.

Second, the science and technology correspondents, like all newspaper specialists, have a vested interest in scare stories being true. Just as the computer experts cashed in on bug fears, so the newspaper technology writers got their stories on the front page (unfamiliar territory for them) and went on to earn rich freelance pickings.

Third, newspapers and journalists don't like sticking their necks out for a negative. Over the past year, I asked five separate journalists (no names, they know who they are) to write debunking pieces on the millennium bug for the NS. Only one of them delivered, and though the writer had himself offered to do it, he seemed to have lost his nerve while writing it. The trouble is that journalists can predict death and disaster to their hearts' content (I've done it myself) and readers forget and forgive. Some will call them alarmist, to be sure, but any red-blooded, testosterone-fuelled reporter (this is still a very masculine trade) prefers being called alarmist to being called complacent. Perhaps readers expect doom from their newspapers and automatically discount what they read, while they give special credence to the exceptional doom-denied story. Heaven knows how many reckless warnings of war and civil disorder have been issued by the British popular papers; but what millions never forgot (and some never forgave) was the Daily Express's statement in 1939 that "there will be no war in Europe this year".

I therefore put forward a modest proposal. It is fashionable for papers to run daily "corrections and clarifications" columns for factual inaccuracies. They could add "predictions" columns, testing the various forecasts made, say, one month previously. As Private Eye would put it: "We may have inadvertently given our readers the impression that they would now be crouching in their homes, shivering in darkness, nibbling on stale bread, choking on nuclear radiation. We now accept that there was not a shred of truth . . ."

Peter Wilby was editor of the Independent on Sunday from 1995 to 1996 and of the New Statesman from 1998 to 2005. He writes the weekly First Thoughts column for the NS.