Sense of duty: Martin Bromiley founded the Clinical Human Factors Group to bring change to the NHS. Photo: Muir Vidler

How mistakes can save lives: one man’s mission to revolutionise the NHS

After the death of his wife following a minor operation, airline pilot Martin Bromiley set out to change the way medicine is practised in the UK – by using his knowledge of plane crashes.

1.

Martin Bromiley is a modest man with an immodest ambition: to change the way medicine is practised in the UK.

I first met him in a Birmingham hotel, at a meeting of the Clinical Human Factors Group, or CHFG. Hospital chief executives, senior surgeons, experienced nurses and influential medical researchers met, debated and mingled. Keynote speakers included the former chief medical officer for England Sir Liam Donaldson. In the corridors and meeting rooms, rising above the NHS jargon and acronyms and low-level grumbling about government reforms, there floated a tangible sense of purpose and optimism. This was a meeting of believers.

A slow transformation in the way health care works is finally gaining traction. So far, it has gone largely unnoticed by the media or the public because it hasn’t been the result of government edict or executive order. But as Suren Arul, a consultant paediatric surgeon at Birmingham Children’s Hospital, put it to me: “We are undergoing a quiet revolution and Martin Bromiley will, one day, be recognised as the man who showed us the way.”

Although I knew whom to look for, Bromiley was hard to spot at first. He wasn’t on stage, and he didn’t address the full conference. He was, I discovered, sitting at a table at the edge of the hall, in the suburbs of the meeting. You would hardly have guessed that the CHFG was a group he’d founded, or that everyone at the meeting that day was there because of him. Bromiley doesn’t fit with our preconceived ideas of a natural leader. He speaks with a soft voice. He doesn’t command your attention, though you find yourself giving it.

Neither is he a doctor, or a health professional of any kind. Bromiley is an airline pilot. He is also a family man, with a terrible story to tell.

 

2.

Early on the morning of 29 March 2005, Martin Bromiley kissed his wife goodbye. Along with their two children, Victoria, then six, and Adam, five, he waved as she was wheeled into the operating theatre and she waved back.

Over Christmas, Elaine had suffered a swelling of her face, connected to sinus problems that had troubled her for years. She was advised by a consultant that the only way to deal with the problem once and for all was to undergo a minor operation to straighten the inside of her nose. Bromiley knew of colleagues who had undergone the operation – the sinuses of pilots take a beating from sharp changes in air pressure – so he didn’t feel overly concerned, that morning, as he drove Victoria and Adam back to the family home in a peaceful Buckinghamshire village.

At about 11 that same morning he received a call from the ear, nose and throat consultant. “Elaine isn’t waking up properly from the anaesthetic,” said the doctor. “Can you come back in?” At the hospital, Bromiley was met by the consultant, who explained that there had been a problem keeping Elaine’s airway open after she had been anaesthetised and her oxygen levels had fallen to dangerously low levels. A decision had been taken to move her to the intensive-care unit.

Grasping for medical knowledge from half-remembered episodes of Casualty, Bromiley asked if the doctors had attempted a tracheotomy – a cut to the throat to allow air in. They explained that the safer option had been to let her wake up naturally. He made his way to the intensive-care unit. When he got there, the first person to approach him was the consultant anaesthetist, who, without saying anything, gave him a hug. Bromiley found himself trying to console him. “I said, ‘I know these things happen.’”

He took a seat and waited for news. After ten minutes, two doctors emerged and took seats opposite him. In sombre tones, they told Bromiley that Elaine had been without oxygen for a long period of time and, as a consequence, had suffered severe brain damage. He could hardly process what they were saying. “I just thought, ‘Fuck. What? How?’ I was stunned. My whole world changed.”

An hour later, Bromiley was allowed to see his wife. “She didn’t look any different,” he told me. But she was different. After finally settling her oxygen levels, the doctors had put her into a coma to prevent her brain from swelling to the point where it crushed itself against the top of her spine.

It soon became apparent that it was a coma from which she would never recover. Days later, after a series of discussions with the doctors, he consented to having her life support switched off. The doctors were surprised at the strength of her heart, which continued to beat for another week until, on 11 April 2005, Elaine Bromiley died.

 

3.

How could this have happened? When he surfaced from the shock, that was the question to which Bromiley wanted an answer. At first, he accepted the word of the ENT consultant, who told him that the doctors had made all the right decisions but had simply come up against an emergency for which nobody could have planned: the exceptional difficulty of getting a tube down Elaine’s throat.

Still, he assumed that the next step would be an investigation – standard practice in the airline industry after every accident. “You get an independent team in. You investigate. You learn.” When he asked the head of the intensive-care unit about this, the doctor shook his head. “That’s not how we do things in the health service. Not unless somebody complains or sues.”

This doctor was privately sympathetic to Bromiley’s question, however. Shortly after Elaine’s death, he got in touch with Bromiley to say that he had asked a friend of his, Professor Michael Harmer, an eminent anaesthetist, if he would be prepared to lead an investigation. Harmer had said yes. After Bromiley gained the hospital’s consent, Harmer set to work, interviewing everyone involved, from the consultants to the nursing team.

In July that year, he submitted his report. As Bromiley read it, his mind went back to one of the last nights he had spent in the hospital during his wife’s coma, and to something the duty nurse had said to him: “It’s terrible. I can’t believe that happened.” With hindsight, that was a hint.

Harmer’s minute-by-minute narrative of the operation revealed a different story from the one Bromiley had heard when he spoke with the ENT surgeon. The truth was that Elaine had died at the hands of highly accomplished, technically proficient doctors with 60 years of experience between them, in a fine, well-equipped modern hospital, because of a simple error.

4.

Doctors make mistakes. A woman undergoing surgery for an ectopic pregnancy had the wrong tube removed, rendering her infertile. Another had her Fallopian tube removed instead of her appendix. A cardiac operation was performed on the wrong patient. Some 69 patients left surgery with needles, swabs or, in one case, a glove left inside them. These are just some of the incidents that occurred in English hospitals in the six months between April and September 2013.

Naturally, we respect and admire doctors. We believe that health care is scientific. We think of hospitals as places of safety. For all these reasons, it comes as something of a shock to realise that errors still play such a significant role in whether we leave a hospital better or worse, alive or dead.

The National Audit Office estimates that there may be 34,000 deaths annually as a result of patient safety incidents. When he was chief medical officer, Liam Donaldson warned that the chances of dying as a result of a clinical error in hospital are 33,000 times higher than those of dying in an air crash. This isn’t a problem peculiar to our health-care system. In the United States, medical errors are estimated to be the third most common cause of death, after cancer and heart disease. Globally, there is a one-in-ten chance that, owing to preventable mistakes or oversights, a patient will leave a hospital in a worse state than when she entered it.

There are other industries where mistakes carry grave consequences, but the mistakes of doctors carry a particular moral charge because their job is to make us better, and we place infinite trust in the expectation they will do so. When you think about it, it’s extraordinary we’re prepared to give a virtual stranger permission to cut us open with a knife and rearrange our insides as we sleep.

Perhaps because of the almost superstitious faith we need to place in surgeons, we hate to think of them as fallible; to think that they perform worse when they are tired, or that some are much better at the job than others, or that hands can slip because of nerves, or that bad decisions get taken because of overconfidence, or stress, or poor communication. But all of these things happen, because doctors are human.

 

5.

Within two minutes of Elaine Bromiley’s operation beginning, the anaesthetic consultant realised that the patient’s airway had collapsed, hindering her supply of oxygen. After repeatedly trying and failing to ventilate the airway, he issued a call for help. An ENT surgeon answered it, as did another senior anaesthetist. The three consultants struggled to get a tube down Elaine’s throat, a procedure known as intubation, but encountered a mysterious blockage. So they tried again.

“Can’t ventilate, can’t intubate” is a recognised emergency in anaesthetic practice, for which there are published guidelines. The first instruction in one version of the guidelines is this: “Do not waste time trying to intubate when the priority is oxygenation.” Deprived of oxygen, our brains soon find it hard to function, our hearts to beat: ten minutes is about the longest we can suffer such a shortage before irreversible damage is done. The recommended solution is to carry out a form of tracheotomy, puncturing the windpipe to allow air in. Do not waste time trying to intubate.

Twenty minutes after Elaine’s airway collapsed, the doctors were still trying to get a tube down her throat. The monitors indicated that her brain was starved of oxygen and her heart had slowed to a dangerously low rate. Her face was blue. Her arms periodically shot up to her face, a sign that her brain tissue was being irritated. Yet the doctors ploughed on. After 25 minutes, they had finally intubated their patient. But that was too late for Elaine.

If the severity of Elaine’s condition in those crucial minutes wasn’t registered by the doctors, it was noticed by others in the room. The nurses saw Elaine’s erratic breathing; the blueness of her face; the swings in her blood pressure; the lowness of her oxygen levels and the convulsions of her body. They later said that they had been surprised when the doctors didn’t attempt to gain access to the trachea, but felt unable to broach the subject. Not directly, anyway: one nurse located a tracheotomy set and presented it to the doctors, who didn’t even acknowledge her. Another nurse phoned the intensive-care unit and told them to prepare a bed immediately. When she informed the doctors of her action they looked at her, she said later, as if she was overreacting.

Reading this, you may be incredulous and angry that the doctors could have been so stupid, or so careless. But when the person closest to this event, Martin Bromiley, read Harmer’s report, he responded very differently. His main sensation wasn’t shock, or fury. It was recognition.

 

6.

Shortly after 5pm on the clear-skied evening of 28 December 1978, United Airlines Flight 173 began its descent to Portland International Airport. The plane had taken off from New York that morning and, after making a scheduled stop in Denver, was approaching its final destination with 189 souls on board.

As the landing gear was lowered there was a loud thump and the aircraft yawed slightly to the right. The flight crew noticed that one of the green landing gear indicator lights wasn’t lit. The captain radioed air-traffic control at Portland, telling them, “We’ve got a gear problem.”

Portland’s control agreed that the plane would orbit the airport while the captain, first officer and second officer worked out what to do. The passengers were told that there would be a delay. The cabin crew began to carry out checks. The flight attendants were instructed to check the visual indicators on the wings, which suggested that the landing gear was locked down.

Nearly half an hour after the captain told Portland about the landing gear problem, he contacted the United Airlines maintenance centre, informing the staff there that he intended to continue the holding pattern for another 15 or 20 minutes. He reported 7,000lbs of fuel aboard, down from 13,000 when he had first spoken to Portland.

United’s controller sounded a mild note of concern. “You estimate that you’ll make a landing about five minutes past the hour. Is that OK?” The captain’s response was ostentatiously relaxed: “Yeah, that’s a good ball park. I’m not gonna hurry the girls [the cabin crew].” United 173 had 30 minutes of fuel left.

The captain and his two officers continued to debate the question of whether the landing gear was down. The captain asked his crew how much fuel they would have left after another 15 minutes of flying. The flight engineer responded, “Not enough. Fifteen minutes is gonna – really run us low on fuel here.” At 18.07 one of the plane’s engines lost power. Six minutes later, the flight engineer reported that both engines were gone. The captain, as if waking up to the situation for the first time, said: “They’re all going. We can’t make Troutdale [a small airport on the approach route to Portland].” “We can’t make anything,” said the first officer. At 18.13, the first officer sent the plane’s final message to air-traffic control: “We’re going down. We’re not going to be able to make the airport.”

 

7.

This story of United 173 is known to every airline pilot, because it is studied by every trainee. To the great credit of the aviation industry, it became one of the most influential disasters in history. Galvanised by it and a handful of other crashes from the same era, the industry transformed its training and safety practices, instituting a set of principles and procedures known as CRM: crew resource management.

It worked. Although we usually notice only the high-profile exceptions, crashes are at the lowest level they have ever been, and flying is now one of the safest ways you can spend your time. As they are fond of saying in aviation, these days the most dangerous part of a flight is the journey to the airport.

CRM was born of a realisation that in the late 20th century the most frequent cause of crashes wasn’t technical failure, but human error. Its roots go back to the Second World War, when the US army assigned a psychologist called Alphonse Chapanis to investigate a curious phenomenon. B-17 bombers kept crashing on to the runway on landing, even though there was no apparent mechanical problem with the planes. Rather than blaming the pilots, Chapanis pointed to the instrument panel. The lever to control the landing gear and the lever that operated the flaps were next to each other. Pilots, weary after long flights, were confusing the two, retracting the wheels and causing the crashes. Chapanis suggested attaching a wheel to the handle of the landing-gear lever and a triangle to the flaps lever, making each easily distinguishable by touch alone. Problem solved.

Chapanis had recognised that human beings’ propensity to make mistakes when they are tired is much harder to fix than the design of levers. His deeper insight was that people have limits, and many of their mistakes are predictable effects of those limits. That is why the architects of CRM defined its aim as the reduction of human error, rather than pilot error. Rather than trying to hire or train perfect pilots, it is better to design systems that minimise or mitigate inevitable human mistakes.

In the 1990s, a cognitive psychologist called James Reason turned this principle into a theory of how accidents happen in large organisations. When a space shuttle crashes or an oil tanker leaks, our instinct is to look for a single, “root” cause. This often leads us to the operator: the person who triggered the disaster by pulling the wrong lever or entering the wrong line of code. But the operator is at the end of a long chain of decisions, some of them taken that day, some taken long in the past, all contributing to the accident; like achievements, accidents are a team effort. Reason proposed a “Swiss cheese” model: accidents happen when a concatenation of factors occurs in unpredictable ways, like the holes in stacked slices of cheese lining up.
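One way to feel the force of Reason’s model is to treat each defence as a filter that usually, but not always, catches an error, and to multiply the failure rates together. The sketch below is illustrative only: the layers and their failure rates are invented numbers, not data from any hospital or airline.

```python
# Illustrative sketch of James Reason's "Swiss cheese" model.
# The defensive layers and their failure rates are invented numbers.
defences = {
    "pre-operative checklist missed": 0.01,
    "monitor alarm ignored":          0.01,
    "junior colleague stays silent":  0.10,
    "senior review unavailable":      0.05,
}

p_harm = 1.0
for layer, p_failure in defences.items():
    p_harm *= p_failure  # harm reaches the patient only if every layer fails

print(f"Chance of all {len(defences)} holes lining up: 1 in {1 / p_harm:,.0f}")
# Prints "1 in 2,000,000" -- and removing any single defence
# multiplies the remaining risk by that layer's catch rate.
```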

James Reason’s underlying message was that because human beings are fallible and will always make operational mistakes, it is the responsibility of managers to ensure that those mistakes are anticipated, planned for and learned from. Without seeking to do away altogether with the notion of culpability, he shifted the emphasis from the flaws of individuals to flaws in organisation, from the person to the environment, and from blame to learning.

The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”

 

8.

It’s a miracle that only ten people were killed after Flight 173 crashed into an area of woodland in suburban Portland; but the crash needn’t have happened at all. Had the captain attempted to land, the plane would have touched down safely: the subsequent investigation found that the landing gear had been down the whole time. But the captain and officers of Flight 173 grew so engrossed in one puzzle that they became blind to the more urgent problem: fuel shortage. This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else. It’s an affliction to which even the most skilled and experienced professionals are prone.

Imagine a stalled car, stuck on a level crossing as a distant train bears down on it. Panic rising, the driver starts and restarts the engine rather than getting out of the car and running. The three doctors bent over Elaine Bromiley’s throat were intent on finding a way to intubate, just as the three pilots in the cockpit of United 173 were determined to establish the status of the landing gear. In neither case did these seasoned professionals look up and register the oncoming train: in the case of Elaine, her oxygen levels, and in the case of United 173, its fuel levels.

When people are fixating, their perception of time becomes highly erratic; minutes stretch and elongate. One of the most striking aspects of the transcript of United 173’s last minutes is the way the captain seems to be under the impression that he has plenty of time, right up until the moment the engines cut out. It’s not that he didn’t have the correct information; it’s that his brain was running to a different clock. Similarly, it’s not that the doctors weren’t aware that Elaine Bromiley’s oxygen supply was a problem; it’s that their sense of how long she had been without it was distorted. When Harmer interviewed him, the anaesthetic consultant confessed that he had no idea how much time had passed.

Imagine, for a moment, being one of those doctors. You have a patient who has stopped breathing. The clock is ticking. The standard procedure isn’t working, but you have employed it dozens of times before and you know it works. Each of the senior colleagues around you is experiencing the same difficulty, which reassures you. You cling to the belief that, between the three of you, you will solve the problem, if it is soluble at all. You vaguely register nurses coming into the room and saying things but you don’t really hear what they say. Perhaps it occurs to you to step back from the patient and demand a rethink, but you don’t want your peers to see you as panicky or naive. So you focus on the one thing you can control: the procedure. You repeat it over and over, hoping for a different result. It is madness, but it is comprehensible madness.

Team trauma: British Midland Flight 92 came down near the M1 at Kegworth after a breakdown in communication among the crew

 

9.

In the months after Elaine’s death, as Bromiley tried to rebuild his family life, he couldn’t stop wondering about the difference between the way people in health care treated accidents and the way his industry dealt with them. So he would phone people in and around the National Health Service and ask them about it.

He discovered that many others – an anaesthetist in Scotland, a medical researcher in London – had been wondering the same thing. Eventually, he accumulated a long list of like-minded people, none of whom was talking to any of the others. So he booked a room in a hotel, called a meeting and invited them all, along with experts from other industries and academics, including James Reason. Everyone agreed that when it came to safety, health care was languishing in the Dark Ages. Hospitals more or less pretended that mistakes didn’t happen, failed to learn from them and, as a result, repeated them. If we don’t like to think that doctors make mistakes, doctors like to think about it even less.

One of the biggest problems identified was the unwritten but entrenched hierarchy of hospitals. Bromiley, who has worked with experts from various “safety-critical” industries, including the military, told me that the hospital is by far the most hierarchical workplace he has come across. At the top of the tree are consultant surgeons, the rock stars of the hospital corridors: highly driven, competitive, mostly male and not the kind who enjoy confessing to uncertainty. Then come anaesthetists, often quieter of disposition and warier of risk. Further down are nurses, valued for their hard work but not for their brains.

A key principle of human factors is that it is the unspoken rules of who can say what and when that often lead to crucial things going unsaid. The most painful part of the transcript of Flight 173’s final hour is the flight engineer’s interjections. You can sense his concern about the fuel situation, and his hesitancy about expressing it. Fifteen minutes is gonna – really run us low on fuel here. Perhaps he’s assuming the captain and his officers know the urgency of their predicament. Perhaps he’s worried about being seen to speak out of turn. Whatever it is, he doesn’t say what he feels: This is an emergency. We need to get this plane on the ground – NOW. Similarly, the nurses who could see the urgency of Elaine Bromiley’s condition didn’t feel able to tell the doctors that they were on the verge of committing a grave error. So they made tentative suggestions that were easy to ignore.

John Pickles, an ENT surgeon and former medical director of Luton and Dunstable Hospital NHS Foundation Trust, told me that usually when an operation is carried out on the wrong part of the body (a class of error known as “wrong-site surgery”), there is at least one person in the room who knows or suspects a mistake is being made. He recalled the case of a patient in South Wales who had the wrong kidney removed. A (female) medical student had pointed out the impending error but the two (male) surgeons ignored her and carried on. The patient, who was 70 years old, was left with one diseased kidney, and died six weeks later. In other cases nobody spoke up at all.

The pioneers of crew resource management knew that merely warning pilots about fixation error was not sufficient. It is too powerful an instinct to be repressed entirely even when you know about it. The answer lay with the crew. Because even the most experienced captains are prone to human error, the entire aircraft crew needed to act as a collective intelligence, vigilant for problems and responsible for solutions. “It’s the people at the edge of the room, standing back from the situation, who can often see it best,” Bromiley said to me.

He recalled the case of British Midland Flight 92, which had just taken off for its flight from London to Belfast on 8 January 1989 when the pilots discovered one of the engines was on fire. Believing the fault lay in the right engine, they shut it down. Over the PA, the captain explained that because of a problem with the right engine he was making an emergency landing. The cabin staff, who – like the passengers, but unlike the cockpit crew – could see smoke and flames coming from the left engine, didn’t pass this information on to the cockpit. Having shut down their only functioning engine, the pilots could not prevent British Midland 92 crashing into the embankment of the M1 motorway near Kegworth in Leicestershire. Forty-seven of the 126 people on board died; 74 sustained serious injuries.

The airline industry pinpointed a major block to communication among members of the cockpit crew: the captain. The rank of captain retained the aura of imperial command it inherited from the military and from the early days of flying, when pilots such as Chuck Yeager, immortalised in Tom Wolfe’s book The Right Stuff, were celebrated as audacious mavericks. The pioneers of CRM realised that, in the age of mass air travel, charismatic heroism was precisely the wrong stuff. The industry needed team players. The captain’s aura was a force field, stopping other crew members from speaking their mind at critical moments. It wasn’t just the instrument panel that had to change: it was the culture of the cockpit.

Long before they started doing more good than harm, surgeons were revered as men of genius. In the 18th and 19th centuries, surgical superstars performed operations in packed amphitheatres before hushed, admiring audiences. A great surgeon was a virtuoso performer with the hands of a god. His nurses and assistants were present merely to follow the great man’s commands, much as the planets in an orrery revolve around the sun. The advent of medical science gave this myth a grounding in reality: at least we can be confident that doctors today make people better, most of the time. But it reinforced a mystique that makes doctors, and especially surgeons (who, of course, still perform in operating theatres), hard to question, by either patients or staff.

Better safety involves bringing doctors off their pedestal or, rather, inviting them to step down from it. Modern medicine is more reliant than ever on teamwork. As operations become more complex, more people and procedures are involved. Operating rooms swarm with people; various specialists pronounce judgement or perform procedures, and then leave. Surgical teams are often composed of individuals who know each other only vaguely, if at all. It is a simple but unavoidable truth that the more people are involved in something, and the less well they know each other, the more likely it is that someone will make an error.

The most significant human factors innovation in health care in recent years is surprisingly prosaic: the checklist. Borrowed from the airline industry, the checklist is a standardised list of procedures to follow for every operation, and for every eventuality. Checklists compensate for the inbuilt tendency of human beings under stress to forget or ignore what is important, including the most basic things (the first item on one aviation checklist is FLY THE AIRPLANE). They also empower the people at the edges of the room: before the operation and at key moments during it, the whole team goes through each point in turn, including emergencies, which gives a cue to more reserved members of the team to speak up.

Checklists are most effective in an atmosphere of informality and openness: it has been shown that simply using the first name of the other team members improves communication, and that giving people a chance to say something at the beginning of a case makes them more likely to speak up during the operation itself.

Naturally, this spirit of openness entails a diminishment of the surgeon’s power – or a dispersal of that power around the team. Some doctors don’t mind this – indeed, they welcome it, because they realise that their team can save them from career-ruining mistakes. Others are more resistant, particularly those who treasure their independence; mavericks don’t do checklists. Even those who see themselves as evolved team players may overestimate their openness. J Bryan Sexton, a psychologist at Johns Hopkins University in the US, has conducted global surveys of operating-room staff. He found that while 64 per cent of surgeons rated their operations as having high levels of teamwork, only 28 per cent of nurses agreed.

The lessons of human factors go far beyond the status of surgeons. From his earliest conversations with insiders, Bromiley realised that the NHS needed to undergo a profound cultural change if it was to reach the level of the aviation industry in terms of safety. Hospitals gave little or no thought to how their teams functioned. Doctors underestimated the effects of tiredness on their own performance. Medical schools taught doctors that technical excellence trumped everything else and spent little or no time teaching communication or team management skills. Specialists saw their job as fixing parts of the body, rather than helping a person (at this year’s Clinical Human Factors Group conference, Peter Jaye, a consultant surgeon at Guy’s Hospital in London, remarked: “At medical school I was trained to think of a patient as a pair of kidneys”). There was little or no data on which hospitals and doctors were making mistakes, and therefore which required the most urgent improvement.

Safety risks were routinely misperceived. The “can’t ventilate, can’t intubate” emergency arises about once in every 20,000 operations, which anaesthetists consider a remote possibility. Yet as Bromiley told me: “In aviation, when we find out there’s a one-in-a-million chance of an engine failing, we worry. To me, one in 20,000 means a regular occurrence.”
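The arithmetic behind that difference of perception is simple enough to multiply out. The caseload figures in this sketch are round-number assumptions, not NHS data, used only to show how a risk that looks remote to one clinician becomes routine across a whole system:

```python
# Why "1 in 20,000" feels remote to an individual anaesthetist but is a
# regular occurrence system-wide. Caseloads are assumed round numbers.
p_event = 1 / 20_000             # "can't ventilate, can't intubate" rate

career_cases = 300 * 40          # ~300 anaesthetics a year for a 40-year career
national_cases_per_year = 3_000_000  # assumed annual anaesthetics nationwide

print(f"Expected in one career:      {career_cases * p_event:.1f}")        # 0.6
print(f"Expected nationally, yearly: {national_cases_per_year * p_event:.0f}")  # 150
```

On these assumed numbers, an individual anaesthetist may never see the emergency at all, while the system as a whole meets it every few days.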

As James Reason showed, mistakes arise out of coincidence. Suren Arul, the consultant paediatric surgeon in Birmingham, told me that “when mistakes happen, it’s almost never one person’s fault. It’s usually a whole series of things, some of them tiny.” Bromiley asks hospital boards to consider their procurement of marker pens, used to mark the part of the body about to be operated on (best practice is for the surgeon to make the mark and to sign it with his initials). “I tell them, ‘I understand your need to cut budgets. But do you realise that because you didn’t buy those marker pens you’ve just trebled your likelihood of having a wrong-site surgery case?’”

There is now greater awareness of the complexity of safety than ever. The body that Bromiley founded, the Clinical Human Factors Group, has no official status within the NHS, but its influence has been felt right across that sprawling and multifarious institution. At the meeting I attended, everyone to whom I spoke seemed to believe that things are moving, albeit too slowly, in the right direction.

10.

It was a trauma situation: an 18-month-old baby boy had fallen down the steps at Euston Square station, smashing his head and injuring his leg. Through a one-way mirror, I watched as a young doctor entered the operating room and was greeted by the baby’s distressed mother. The woman sitting next to me turned to her team. “Why don’t you play the orthopaedic surgeon this time, Dave?” she said. “Clare, can you be the anaesthetic consultant?”

In recent years, simulations have started to become part of the training of doctors. Sarah Chieveley-Williams, the consultant anaesthetist who is director of clinical simulation at University College London Hospitals (UCLH), had invited me to watch junior paediatricians being put through their paces. The room before us was a near-perfect replica of an operating theatre, with an anaesthesia machine, various equipment, monitors and a stock of medication. The dummy baby had pupils that dilate, a heartbeat and a noisy cry.

Chieveley-Williams and her team were interested in the doctor’s ability to identify the right priorities: first, stabilise the baby’s condition by anaesthetising it; second, get a neurological consultant to look at its head injuries, in order to prevent or minimise brain damage. On our side of the glass, one of the team sat by a computer from where she could manipulate the baby’s vital signs. She slowed down its heart rate, ratcheting up the urgency of the situation.

Chieveley-Williams turned to another colleague: “Dave, see if you can get her fixated on the leg.” Dave left us and a moment later reappeared in the operating room wearing a white coat. After examining the patient, he proposed, with the air of someone used to being agreed with, that an X-ray be taken of the injured leg. Chieveley-Williams watched intently. In a quiet but firm voice, the young doctor said, “Right now, the priority is his head injury. The leg will have to wait.”

Chieveley-Williams turned to me with a grin. “That told him,” she said.

Over the next 20 minutes, a succession of people entered and left the room. Specialists were summoned, medications ordered and procedures arranged. At times the impression was one of near-chaos. A trauma incident, Chieveley-Williams explained to me, presents an acute management challenge, as well as a medical one. Because it often involves injuries to different parts of the body, many specialists come into the treatment room, each with his or her own agenda. “The doctor needs to establish leadership and keep everyone focused on the big picture – the patient’s health.”

In the case of Elaine Bromiley, there was too much hierarchy and too little. On the one hand, the nurses didn’t assert themselves. On the other hand, nobody was taking ultimate responsibility for the patient’s safety. As John Pickles remarks, “You had three very senior people in the room and no one in charge.”

Hierarchy, in the sense of clear leadership, is a good thing, as long as the leaders are confident enough to confess uncertainty. A common problem, Chieveley-Williams said, is young doctors being reluctant to say they don’t know what the answer is because they are so eager to project competence. A member of her team told me, “We tell them that when you’re stuck, ask everyone in the team for their view. One of them probably has the answer, but until you speak up they’ll assume you have it, too.”

These sorts of lessons weren’t being given ten years ago. Like UCLH, Great Ormond Street children’s hospital in London is at the forefront of the new thinking about patient safety, and is absorbing lessons from other industries. The transfer of patients from surgery to the intensive-care unit is a complex process that has to be accomplished at speed, and involves several people. Unsurprisingly, it is a well-known danger zone: things get dropped, tubes are left unattached, and patients suffer. In collaboration with the human factors researcher Ken Catchpole, the hospital studied Formula 1 pit stops, learning the importance of allocating every member of the team a precisely defined task. Mistakes fell.

 

11.

Martin Bromiley has rebuilt his life. Happily remarried, he is stepdad to his second wife’s two children, as well as still dad to Victoria and Adam. He is not haunted by the tragedy of Elaine’s death but driven by it. Between flying commitments, he talks to doctors, nurses, researchers and NHS boards, connecting the like-minded; telling his story to those, whether managers or medical students, who most need to hear it. It is a heavy workload. I wondered if he was ever tempted to leave it behind, now that the CHFG has its own momentum. He shook his head. “This is a duty.”

Improving the safety of patients in health care doesn’t necessarily require spending on expensive new technologies, or complex structural reorganisation. It requires forethought, empathy, humility and a willingness to learn from mistakes. Which, after all, is a duty to those who have suffered from them. Bromiley insisted that the Harmer report be made public, as happens with air accident reports, and he chose for it an epigraph borrowed from aviation: “So that others may learn, and even more may live.” All of the medical staff involved in Elaine’s operation are back at work. That, says Bromiley, is exactly what he wanted, because they will be better clinicians for their experience, and advocates for the cause.

There are two approaches to reforming a large institution. You can impose change from outside by invoking the will, or the wrath, of the public – or you can persuade those inside to let you in and to listen to your message. Both can work. When Julie Bailey exposed the gross malpractices of staff at Stafford Hospital, she shook the health service from top to bottom. Bromiley greatly admires what Bailey has achieved, but he has taken a different path. Rather than using his story as a club – and nobody would have blamed him for doing that – he has deployed it as you would wish a surgeon to apply the knife to someone you love: with skill, subtlety and precision.

“From the moment something went wrong with Elaine, it was different, because they knew my profession,” he says. Responding to his calmness and extraordinary ability to empathise with their situation, the team rose to their best. As Elaine lay in a coma, they involved him in every decision they took, right up until the last one. It was proper teamwork, and a model for the long campaign that followed. “I’m an outsider who is also an insider,” he says.

Martin Bromiley has reminded clinicians that not everything is or should be clinical. His legacy, says Professor Jane Reid, a researcher in nursing at Queen Mary’s Hospital in south London, is “a new safety culture” in the NHS. He has no desire to take up any official position. “I’m not an expert on medical practice,” he told me. “I’m just a guy who flies planes.” 

Ian Leslie is the author of Curious: The Desire to Know and Why Your Future Depends on It (Quercus, £10.99)


This article first appeared in the 28 May 2014 issue of the New Statesman, The elites vs the people


"I teach dirty tricks": the explosives expert who shows armies how to deal with terrorists

Sidney Alford used to blow things up in his garage. Now his expertise is helping save lives.

“I’ll fetch the hammer,” says Sidney Alford, leaving me in a laboratory filled with mysteriously named drawers and small bottles with skulls on their labels. When he has fetched it – “it’s a jeweller’s hammer, given to me in Paris by a friend of Salvador Dali” – the 82-year-old plans to tap gently on a small mound of white powder called triacetone triperoxide, or TATP, better known as the explosive favoured by Isis in their suicide belts and homemade bombs. Because of its instability and destructive power, its nickname is “Mother of Satan”.

Tapping it with a hammer is enough to make it go bang.

Directing me to stand by the door, he searches for ear plugs before stuffing some paper in his ears – “I’m quite deaf, you know,” were almost his first words to me that morning – and begins to tap the Mother of Satan. On the fourth tap, it explodes in a genteel fashion with a flash and a pop. Its sensitivity to percussion is one of the reasons that jihadi bomb-makers suffer so many workplace accidents. “See,” Alford says. “You’d be OK walking, just don’t fall over or get shot.”

I have wanted to meet Sidney Alford ever since I heard about him from the investigative journalist Meirion Jones, who once uncovered a British man who sold £50m-worth of fake bomb detectors in Iraq and other countries. (The fraudster, James McCormick, was jailed for ten years in 2013.)

Giving a presentation to students, Jones mentioned that he could prove the gadgets were useless – just black boxes with radio aerials sticking out of them – because he had taken them “to a guy the BBC uses for explosives, who has a quarry in Somerset where he blows things up”. I decided then and there that I was very interested in being in a quarry in Somerset where someone blew things up. Maybe I would even get to press the button.

There was a less childish reason for visiting, too. Sidney Alford’s life story is interwoven with one of the technologies that defines the modern world: explosives. We fear explosives – suicide bombs, car bombs, bombs on aircraft – but we also need them, for everything from realistic film scenes to demolition. (Alford has a letter from Stanley Kubrick thanking him for his help on Full Metal Jacket.) Surprisingly, the best way to defuse an explosive is often with another explosive, something that Sidney’s company, Alford Technologies, has pioneered.

In other words, if you want to make something go bang – or, just as importantly, stop something going bang – he is the man to talk to. Quite loudly.

***

The first explosive materials Alford ever saw were fragments of bombs and V2 rockets left over from the German bombardment of London. Born in 1935 in the suburb of Ilford, he moved with his family to Bournemouth when the Second World War broke out. When he returned, he found rich pickings in his battered neighbourhood in the form of magnesium incendiary bombs, which he filed down and turned into fireworks.

I ask him if, like my own father, he ever frightened his teachers with nitrogen triiodide, an unstable explosive compound that schoolchildren used to make themselves and set off in lessons to terrify unwary members of staff in the era before health and safety. “Oh yes,” he says. “I put it under my French teacher’s chair.” A pause. “He’d been in the army, so he didn’t make a fuss.”

Alford went to a grammar school, where he was an undistinguished pupil, angry that the headmaster wouldn’t let him learn German (rather than Latin) so he could speak to the Jewish child refugees he knew. But he was always interested in chemistry, and “by the fifth form, I’d recruit classmates to make bigger bangs”.

A chemistry degree came next, followed by a series of odd jobs, including diet research and studying the brain, an MSc in the science of environmental pollution, and two business associations with men he now characterises as “bad sorts”, who ripped him off.

By this time, he had moved to Ham, in west London, and had begun to take his chemistry experiments more seriously. It was the early 1970s, and the IRA’s bombing campaign had come to England. How, Alford wondered, could these weapons be neutralised? Was it better to encase suspect packages in “blast containers”, or use shaped charges – typically, small cones that focus explosive energy into a point – to disrupt their ability to go off?

A brief digression on explosives is necessary here. When you think of something going bang in a spectacular fashion, that’s a detonation. “Detonare,” says Alford at one point during my tour of the quarry, relishing the Latin. “Like thunder.”

High explosives such as TNT, nitroglycerin or Semtex can be detonated by administering a violent shock to the main charge using a small amount of relatively sensitive and violent material in a metal capsule. This creates a hot shock wave, which sweeps through the substance faster than the speed of sound.

Old-fashioned gunpowder, house fires and your car’s internal combustion engine go through a different process, known as “deflagration”, where the chemical reaction moves through the molecules much more slowly. This burning is usually less dramatic and easier to manage. (Alford hates the term “controlled explosion”, reasoning that an expert should always control their explosions. If they fail, it’s a cock-up.)

The theory goes, then, that if you attack a munition just hard enough to ignite its contents but without causing a violent shock wave, it will deflagrate but, on a good day, it will not detonate. “Yes, it might make a massive fireball, but I’ve done it in jungles under a tree,” says Alford. “[With deflagration] the tree may lose most of its leaves, but with detonation, there is no tree.”
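The practical difference between the two processes is, above all, one of speed. The figures in this short sketch are rough textbook orders of magnitude, not measurements from Alford’s quarry:

```python
# Rough contrast between detonation and deflagration.
# Speeds are order-of-magnitude textbook figures, for illustration only.
tnt_detonation_m_per_s = 6_900  # shock wave racing through the charge,
                                # faster than sound moves in the material
deflagration_m_per_s = 10       # a flame front creeping from layer to layer
                                # (rough figure; real burn rates vary widely)

ratio = tnt_detonation_m_per_s / deflagration_m_per_s
print(f"A detonation front travels roughly {ratio:.0f}x faster than a flame.")
```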

In the 1970s, he set up a makeshift laboratory in his suburban garage. There, he would experiment with making explosive charges, using measured quantities of material in different casings. He would leave his car engine running so any bangs could be plausibly written off as backfiring.

This cover story clearly didn’t wash with the neighbours, though, as first the police and then MI5 – “the most gentlemanly man” – came round to see why exactly a chemistry graduate they had never heard of was blowing stuff up in his suburban garage. When he explained himself to the security services, they put him in touch with the Ministry of Defence, and he was offered a contract.

***

Alford Technologies has a slogan: “For when you can’t afford to fail”. It also has an office in a business park outside Trowbridge in Wiltshire, but the real action happens at its testing ground, a former quarry amid the rolling hills of the Mendips, not far outside Bath. It feels like a cross between a scrapyard and a building site. “Here’s the bottom half of a Soviet mine, which we use as a brazier,” says Alford at one point, prodding it with a toecap.

Soldiers from various armies come here to learn about explosives and how to render them harmless. It’s vital work: last year in Iraq and Syria there were dozens of car bombs, with a single one in Baghdad claiming 250 lives. In Manchester this year an Isis-inspired jihadi killed 22 concert-goers and injured 250 with a backpack bomb apparently built from instructions found on the internet.

Learning to counter such threats means understanding them; jihadists and other terrorists might have access only to basic materials, but many also display great ingenuity. When I ask why Alford has a packet of Tampax in his lab, he says the tampons can be dipped in liquid explosives and turned into cartridges: “I teach dirty tricks so they don’t get caught out by them.”

Sidney Alford’s contributions to the world of explosives rest on an unlikely substance: water. When he first began tinkering in his garage in the 1970s, engineers had already worked out a rough-and-ready way of disabling improvised explosive devices (IEDs). They used a gun barrel loaded with a blank cartridge to fire a jet of water that broke through the explosive’s casing and disrupted it. However, a sufficiently strong casing – say, one made of steel – could defeat this method.

In a low outbuilding in the quarry, Alford shows me his answer to this problem. Within a shaped charge, the force of a small explosion collapses a metal cone, turning it inside out and extruding it into a long, thin rod that shoots out at high velocity, about five times faster than a bullet.

The young chemist had an idea: why not combine the water from the older gun-barrel method with the accuracy and force of the metal jet in a shaped charge? In Alford inventions such as the Vulcan and the Pluton, the explosive charge shoots a targeted jet of water at high speed and with incredible accuracy.

Ho ho, you’re thinking. Water! Very scary. This is broadly what I thought until I saw one of Alford’s smaller shaped charges in action. After the demonstration with the hammer, he put on a pair of sturdy boots instead of brogues and we hopped into a small four-by-four to get to the base of the quarry. “Should I take my safety glasses?” I asked, even though we would be inside an old reinforced lookout hut salvaged from the Maze prison in Northern Ireland. “Oh no,” replied Alford. “If it goes wrong, it will kill you. No need to waste a perfectly good pair of glasses.”

The Vulcan is about six inches long, with a case of grey plastic, and loaded with 30g of plastic explosive; a cone of water is held in front of it. The explosive is “about two toasts’ worth of butter,” said Alford’s project manager, Matt Eades, who served in the Royal Engineers for 25 years.

Alford placed the charge above a 10mm-thick steel plate using the aluminium-wire legs as a tripod, inserted an electric detonator into the Vulcan, and we retired to the hut, whose thick, double-glazed windows gave a good, if smeary, view of the sandpit. “If you write a nice, ingratiating article about me you can press the button,” said Alford.

I pressed the button.

There was a significant bang, making me glad of my ear defenders, but the plume went straight upwards. When we ventured out to the sandpit, Alford practically skipped up the side and fished out the metal plate, now with a clean-edged circular hole punched straight through it.

This practical demonstration had followed a whirlwind tour of the various Alford Technologies products and a brisk explanation of the theory of explosives. Alford clearly enjoys naming his creations: the Vulcan sits in his display alongside the Krakatoa and the Vesuvius, which can also be used for bomb disposal and demolition. The BootBanger is so called because “it bangs car boots” while the Van Trepan cuts a neat, round hole in the top of a larger vehicle. The Bottler is not only shaped like a bottle, but named for the Australian slang “that’s a bottler”, which Alford translates as “the cat’s whiskers”.

Even the Dioplex, a linear charge that creates a chopping blade, has a story attached: “I thought it was a do-it-yourself device, but I thought ‘do it oneself’ sounded better. So: ‘Do It Oneself Plastic Explosive’.”

One of the things a trip to the quarry teaches me is that the ways in which humans try to kill and maim each other are nothing if not inventive. The company sells a version of a Bangalore torpedo, an old invention used by Alford’s own father when he fought in the First World War. This is a modular tube you can push underneath barbed wire, blowing it apart to clear a path for infantry. A stronger version was needed, Alford says, because of the advent of razor wire. “Barbed wire was soft steel, designed to keep in cows. Razor wire was designed to cut you.” The new Alford Bangalore Blade torpedoes through the wire coils, severing them using four aluminium cutters and creating an unobstructed 10m route through.

The Breacher’s Boot is a door-shaped panel filled with water, used to punch through walls in hostage situations. “It gives a ‘kick’ to the wall, so bits of it will fall down. You don’t want to use shaped charges then,” he says. “If there’s a person on the other side of the wall, you’d cut them in half. And if you simply used a mass of high explosive, the concrete would fly almost horizontally.”

A similar idea lies behind the Alford Strip, a sticky rope of explosives and tamping material used in terror arrests, where the police would once have used a sledgehammer to open a door, but are now much more worried about booby traps. You run the 25mm or 42mm plastic extrusion down a door, window or wall and then lay a length of det cord far enough away from it to put service personnel at a safer distance.

Down in the quarry, having punched through one square steel plate, we now try ten taped together versus a 40g load of explosives and a copper cone. The result: a 2m-high flash and the same clean hole – although the jet doesn’t make it through all ten plates. It stops at seven.

This isn’t an error: the shaped charges can use copper, water, aluminium or magnesium, depending on the force and space needed. Magnesium is incendiary; water and aluminium might be chosen because they lose velocity very quickly. You cut through what you want to cut through, without damaging either the structural integrity of the object surrounding it or innocent bystanders.

This precision is particularly important in demolition work. Last year, Alford Technologies took over the contract to break up Didcot Power Station, slicing through steel beams to dismantle the decommissioned building. It was called in after a terrible accident on 23 February 2016, when four workers employed by a respected firm, Coleman and Company, were killed while trying to lay charges inside the structure. “There was this crash – I looked over my shoulder and saw the boiler coming down,” one of the survivors, Mathew Mowat, told the Birmingham Mail. “We ran in self-preservation – then there was a loud bang and a massive cloud of dust, we couldn’t see much for a few minutes.”

It took months to recover the bodies of all four missing men, who had to be identified from dental records and tattoos.

***

Over an Eccles cake in the main office, Alford tells me about some of his other jobs, including cutting up sunken ships in the Persian Gulf during the “Tanker War” of the mid-1980s, between Iran and Iraq, and joining a mission to retrieve £40m in gold bars from HMS Edinburgh, which sank in 1942 off the coast of Norway. (It was carrying 4,570kg of Russian bullion destined for the western allies.) The ship had been designated a war grave to stop it being plundered, and an air of mystery hung over the whole salvage project. Alford was told not to mention that he was an explosives expert.

Perhaps unsurprisingly, his work – and his anti-authoritarian streak – has caused conflict. “I’m doing things government departments ought to be doing,” he tells me in the car on the way to the quarry. “I’m in the anomalous position of someone who is quite admired, but also quite despised. Civil servants hate my guts.” When he was 40, he says, he asked for a formal job working with the department of defence, “and was told I was too old to have new ideas”. He set up Alford Technologies in 1985, and it now employs six people. The latest set of accounts at Companies House value the firm’s net worth at £2.3m.

Although Alford is scrupulously careful when handling explosives, he loathes health-and-safety culture. As we tramp round the quarry, he indicates a sign next to a pond, reading “Deep Water”, and tuts theatrically. He voted for Brexit to give the establishment a kick, not thinking it would actually happen.

It is a source of great chagrin that the government breathes down his neck, regulating what compounds he can keep and how he can keep them. “You have to have a licence for every substance,” he tells me in the car. “I’ve got them all. Well, it might be different if I wanted to go nuclear.”

 In 1996, he decided to make a stand against the pettifogging bureaucracy that, as he saw it, interfered with his work. Spooked by the thought of Irish republican terrorism, the regulators had insisted that he had to put a lock on his explosives store. “I told them that if the IRA really wanted to get my explosives, they would kidnap one of my family.” (He has two sons with his Japanese-born wife, Itsuko; the elder, 46-year-old Roland, now runs the business.) Besides which, he didn’t see why he should put an alarm on his few kilos of various explosives when the farmer next door had tonnes of ammonium nitrate fertiliser, a key ingredient in the IRA’s bomb-making.

The stand-off broke when his request to renew his explosives licence was turned down; soon after, the police came to raid his stores. He had tipped off a friendly journalist, however, and the visit was captured on camera and written up first in the local paper and then the Daily Mail, where Christopher Booker took up the cause of an Englishman’s inalienable right to keep high explosives in his shed. “I felt morally obliged to be prosecuted,” he says now.

The court case, documented in the newspaper clippings, sounds like a mixture of deadening legal procedure and high farce. At the magistrates’ court, Alford and a friend pursued and rearrested the next defendant, who tried to do a runner; when his case was kicked upwards to Swindon Crown Court, he turned up in an armoured Daimler Ferret, posing for photographs with his head poking out of the top, white hair tucked into a helmet. He was eventually charged with possessing explosives without a licence and fined £750, with £250 costs. The judge ordered the police to give him his licence back, but ticked him off for using the court system for political purposes.

Listening to this story, it becomes clearer why Alford never ended up in the warm embrace of an official government role. He offered his ideas to the Ministry of Defence, but he shows me a letter from April 1977, where an unlucky official reveals that he is “regarding your correspondence with diminishing enthusiasm”. Still, he is sanguine. “Most of my enemies have now gone to the laboratory in the sky, or retired,” he says. “I’m glad I didn’t work for them. Would I have fitted in? Probably not.” In any case, he has had some official recognition, receiving an OBE in 2015.

***

Alford’s work is used in war zones including Afghanistan, but also places like Cambodia, which are still riddled with unexploded ordnance from previous ground wars. Over the years, he has visited that country and Laos several times to practise new ways of dealing with old bombs. (The company produces a more affordable version of the Vulcan for non-military use.) He first went to Vietnam during the war; the last person, he says, to get a Japanese tourist visa into the country in the 1950s. The company’s brochures show smiling locals posing next to the sleeping monsters they have had to live alongside for decades.

But Iraq, too, is in dire need of methods to deal with cheap, homemade explosives. After Matt the Ex-Army Guy and Alford have demonstrated how to blow a door off its hinges, cut through a 50mm steel bar, and turn a fire extinguisher inside out – “that is unzipped in all known directions, it is a former IED,” says Alford, Pythonesquely – they show me the Bottler and the BootBanger.

They drag beer kegs into the boot of an old blue Nissan Almera, explaining that these were a favoured IRA device: who questions a few beer kegs in the street? First, they stick a Bottler between the front seats, showing how you would disrupt any electronics without setting the vehicle on fire – which would destroy forensic evidence. “They’d usually use a robot,” explains Matt. “And the robot usually leaves [the area], because they’re expensive.” A six-wheeler bomb disposal robot costs around £750,000.

We retreat again to the hut. I must be looking increasingly nervous, because Alford tries to reassure me about the building’s structural integrity: “If it tips over, it will take two weeks to get you out. But they’ll know where to find your body.”

As promised, the explosion is focused – and controlled, in the Alford-approved sense of the word. The windscreen is peeled back, lying on the roof, but the fuel tank didn’t ignite and the back windows are intact. “I know it might look like a mess,” says Matt, “but this would be classified as a result. You use a smaller bit of explosive to get rid of a larger one.”

Finally, it’s time for the big one. Matt slides the BootBanger, shaped like a suitcase, under the back end of the car. Through its middle runs a curved sheet of 400g of plastic explosive, sandwiched by water on both sides and encased in nondescript grey plastic.

Now this is a bigger bang. I suddenly see the point of all those “Blasting!” warning signs that surround the quarry. If you drove past and heard this, you’d think the Russians had invaded. As an orange-red flame flashes and a deep, throaty boom fills the quarry, the beer kegs are fired out of the back of the car, pinwheeling 20 feet in the air and coming to rest yards away. Debris rains down on the roof of the hut. I swear I can hear the plinking sound of metal cooling. The car is now missing its back windscreen, and is, it’s fair to say, probably never going to pass another MOT. Nevertheless, it is still recognisably car-shaped; the skeleton is undisturbed.

Unfazed, Alford hurries to the car, and plucks a piece of paper from the boot, clearly left there by a previous owner. It is undamaged.

And then it’s time to rejoin the real world. As he drives me back to Bath, I ask Alford what it feels like to do what he does. He has saved possibly hundreds, maybe thousands of lives. “Yes, but in an already over-populated world,” he sighs.

I know he doesn’t mean it callously; he just doesn’t want credit for what, in his eyes, is barely a job at all. The schoolboy who wanted to make a bigger bang got his wish. 

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.
