Don't blame it on the wetware

When computers go wrong, you can bet it's the user's fault - so the IT experts cry. Terence Green wonders who's really to blame

Last month, on the hottest weekend of the year so far, thousands of air passengers across the UK were delayed when air traffic systems failed for the second time in a week. Richard Wright, a spokesman for National Air Traffic Services (Nats), told CNN that this was the first serious problem to hit air traffic control in 15 years and that the computer failure "was caused by a software problem, not a human error".

What a terrific excuse. It implies that the Nats software somehow emerges untainted by human hands, fully functional, with a mission to control the thousands of flights that pass over the UK each day. This is a nonsense. Human beings build the computers and write the programs. When things go wrong, the source of the problem is invariably traced back to wetware, otherwise known as the human chair/computer interface.

The online Risks Digest is the definitive source for the many and varied ways in which computer systems can be mismanaged. Founded in 1985 by Peter Neumann, a principal scientist at the SRI International Computer Science Laboratory in Menlo Park, California, Risks Digest explores the risks to the public of computers and related systems. By any measure, human error is the common factor. The October 1986 computer crash at the London ATC centre referred to by Richard Wright is covered in volume four, issue seven: the "unexpected flaw" that caught out staff loading a new version of the air traffic control software turned out to be simple human error.

It's no surprise that we tend to blame computers for mistakes we make. The simplest personal computer is a complex device and, unlike HAL in Stanley Kubrick's 2001: A Space Odyssey, can't answer back (yet). Because they're complex systems, computers offer easy targets for blame. Perhaps you've experienced the problem yourself. Your computer crashes, so you call the manufacturers. They blame the software. You call the software publishers. They blame the hardware. Result? You're left in limbo unless you can locate the proverbial needle in a haystack - a friendly, knowledgeable techie.

You may not find such a paragon in your organisation's IT department. The operative on the help desk - was there ever a more inappropriately named department? - is trained to speak slowly, as though to an idiot, leading you through a series of questions designed not so much to solve the problem as to expose you as completely unfit to operate a pocket calculator, let alone a computer. Some help-desk staffers have an unpleasant habit of referring to inquirers as "lusers", a derogatory blend of "loser" and "user" for the person sitting at the computer. The irony is, if they're so smart, how come they're stuck babysitting lusers?

To be fair, only a minority of help-desk staffers treat their users with contempt and, when they do, it may be from bitter experience. People lie shamelessly when they report faults. "Of course I haven't changed anything recently," you reply indignantly, convinced that the game you installed from a free CD mounted on the cover of a computer magazine couldn't possibly be the cause of your computer going pear-shaped.

The very best help-desk staff are inured to dealing with devious users, and have developed the patience and interpersonal skills of highly trained therapists. But they still can't work around human behaviour.

The "love bug" virus, which swept through computer systems worldwide in May and laid low the system in the House of Commons, is part of a new "computer problem" that actively exploits human behaviour. We all want to be loved, so we ignore the improbable and open an e-mail from a distant colleague expressing somewhat more than simple approval for a job well done. In the aftermath of the love bug, pundits were quick to blame those who opened the e-mail attachment that activated the virus. This completely misses the point. Even Microsoft, the very company whose software the virus targeted, was affected by the love bug because, like us, the people who run it are only human.

This is the underlying message of Risks Digest. Blaming computers for errors caused by humans is no more useful than blaming people for making mistakes. A blame culture deters us from developing an understanding of how errors occur and using that knowledge to build better computer systems.

It is ironic that June's air traffic control breakdown was widely reported to have been caused by a "computer glitch". Ironic because the origin of glitch - "a sudden irregularity or malfunction" - is, says the Concise Oxford Dictionary, unknown. Wonder if it was a computer error...

Terence Green began working with computers in 1977 and writing about their quirks in 1987. He spends far too much time evaluating software and hardware and wonders when the PC will emerge from its 20-year beta test cycle.


2000-07-10