The New Statesman Essay - Which doctor?

Why we need measurement in the NHS

Just under a year ago, my wife was in agony in a hospital corridor, trying not to give birth to our fourth son. She had chosen this hospital from three offered by the GP. He asked her which she would prefer.

"I don't know. Which is the best?"

"I hear good and bad about all of them."

She picked at random. Now she was furious. A sheepish midwife was telling her she might have to go into labour right there in the corridor because all the birth suites were busy.

"It's been a very busy night," she said. "Try to hang on."

"I've had three children already and never have I been told I might have to give birth in a corridor." She lifted herself up and forced her way on to the labour ward. In the end, they found a bed.

A few days later, she came up with a simple idea - a Good Hospital Guide, something that might have helped her make a more informed decision. But was such a thing possible - an objective measure of the quality of a hospital, its departments and doctors?

Currently, if you are lucky, you can ring a friend of a friend who knows a consultant. But why shouldn't there be a record accessible to everyone, well connected or not?

For most of the past year, I and my colleagues have been working to create the first guide ever published in Britain that allows ordinary people, as well as their GPs, to compare the standards of local hospitals and make informed choices. It is more important than people realise. Research conducted for us by Imperial College shows that the NHS does its job increasingly well - overall death rates are declining. But it also reveals wide variations in standards of hospital care. New Labour has declared war on variable quality in the NHS, and it is right to do so.

According to our research, the same person with the same condition could be sent unknowing into one hospital and find he or she is at more than 50 per cent greater risk of dying during the course of treatment than in another. Some hospitals, it emerges, have six times as many doctors per bed as others. The existence of this lottery is well known in the health business, though it has never before been quantified. "We have established that where you go or whom you see affects quality far more than who you are," concluded a recent report by the Nuffield Trust.

We have become familiar with the idea of measuring the performance of public services. Performance is one of the watchwords of this consumer age. But can you really measure a hospital - or a school or police force - and, if you can, is the exercise of any real benefit to the taxpayer? When the government first started publishing indicators on school performance in the mid-1990s, there was widespread concern and criticism because the indicators did not reveal what parents most want to know: whether a school teaches well. The simple publication of exam results does not allow for variations in a school's intake of pupils. A school that seems average may actually be very good, because it has boosted the performance of children whose achievement was low on entry.

The same problems apply to the measurement of police performance. We have plenty of statistics for the different forces (clear-up rates, the use of stop and search, for example), but nothing that takes account of demographic factors - unemployment, age profile, social class - in the areas they police. Now, however, as Channel 4's programme The Crime List shows this month, academics at Leeds University have filled the gap and produced more plausible performance tables for the police.

The government takes the view that transparency is vital to healthy public services. It has created a new Statistics Commission to improve the quality of information collected (and to end arguments about "fiddling" figures). Those who argue against a crime list, or against our "Dr Foster Hospital Guide", are in effect arguing for inaction. Once a problem is identified, something can be done about it; without comparative information, nothing can be.

In 1859, Florence Nightingale published the first study of hospital death rates in the world. The Ladybird history tends to miss this out, but the Victorian reformer's best work was conducted far away from the battlefields of the Crimea. Nightingale was the architect of the modern hospital - and identified immediately the overriding importance of measuring performance.

"It may seem a strange principle to enunciate as the very first requirement in a hospital that it should do the sick no harm," she wrote in Notes on Hospitals. "It is quite necessary nevertheless to lay down such a principle, because the actual mortality in hospitals, especially those of large crowded cities, is very much higher than any calculation founded on the mortality of the same class of patient treated out of hospital would lead us to expect." Nightingale, the first female Fellow of the Statistical Society, demonstrated that high death rates in large hospitals were preventable.

Sir Brian Jarman, emeritus professor of primary healthcare at Imperial College, is today's Florence Nightingale. Like her, he came to medicine late in life. His PhD was in seismology and he had an early career prospecting for oil in Libya. Later, seeking a different challenge, he retrained as a GP. Then he discovered information technology and developed a computer system to assess patient eligibility for benefits. It is now routinely used by GPs and has been adopted by the Department of Health and some social service departments. However, it is his current work that may prove his enduring legacy.

Jarman established a way to calculate death rates as a measure of hospital performance. Part of his purpose was to identify those hospitals that are "outliers" - the ones that perform particularly well, or particularly badly. Most of the rest are much of a muchness, not statistically distinguishable from one another. The next step is to break down mortality rates further to examine unit performance, where possible. Jarman believes that comprehensive analysis of this kind helps to identify mortality anomalies such as those at the children's heart surgery unit at Bristol Royal Infirmary (he is a member of the current inquiry).
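To make the idea concrete, here is a minimal, purely illustrative sketch in Python of the kind of calculation involved - not Jarman's actual model, and with invented hospitals and figures. A hospital's ratio is its observed deaths divided by the deaths a case-mix model would expect, scaled so that 100 is the average; it is flagged as an outlier only when the gap from 100 is larger than chance alone could plausibly explain.

# Purely illustrative sketch, not Jarman's actual model. Hospital names and
# figures are invented. A ratio of 100 means deaths match the case-mix
# expectation; "outliers" are those whose confidence interval excludes 100.
import math

hospitals = {
    "Hospital A": {"observed": 480, "expected": 520.0},
    "Hospital B": {"observed": 610, "expected": 505.0},
    "Hospital C": {"observed": 495, "expected": 500.0},
}

for name, h in hospitals.items():
    ratio = 100.0 * h["observed"] / h["expected"]
    # Approximate 95% confidence interval, treating observed deaths as Poisson.
    se = 100.0 * math.sqrt(h["observed"]) / h["expected"]
    low, high = ratio - 1.96 * se, ratio + 1.96 * se
    status = "outlier" if low > 100 or high < 100 else "within the expected range"
    print(f"{name}: ratio {ratio:.0f} (95% CI {low:.0f}-{high:.0f}) - {status}")

In this invented example, Hospital B's ratio of about 121 sits well above the band that chance could explain, so it would be flagged for further scrutiny, while the other two are indistinguishable from the average.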

Jarman tried in the past to have these figures published, but failed to get permission from the Department of Health. His requests were refused at ministerial level. Now the department has allowed us to publish his research - a symptom of the change in public, as well as political, attitudes to accountability.

However, there remains some paranoia about comparative statistics in the middle orders of the NHS. It is seriously argued in some quarters that patients will not benefit from greater information, and that they should not become involved in decisions that are complex beyond their comprehension. Hospital managers, straining under budgetary pressures, say that transparency threatens staff morale when they already have recruitment problems.

The most telling rebuttal of this argument comes from across the Atlantic. A decade ago, Newsday went to court under the US Freedom of Information Act to force the health commissioner to publish the death rates for cardiac surgery in New York state. It was a landmark case. Newsday won. The death rates - for coronary artery bypass grafts - were risk-adjusted for social factors. The effect has been nothing short of astonishing. When the figures were first published, there was wide variation between the hospitals. Six years later, that variation had narrowed significantly. Most impressive of all, overall mortality rates fell by 41 per cent, compared with an estimated 18 per cent drop over the same period nationwide.

"There has been a tendency to identify this trend with the publication of data," says Arthur Levin, the director of the Centre for Medical Consumers. "Doctors and hospitals do not like to look bad compared to their peer colleagues or institutions."

Publishing the information "shamed" the worst hospitals into looking at their own work. They then copied the example of the best institutions, improving their mortality rate. "The first time the figures were published, one hospital actually got rid of the surgeon who was dragging down their average," adds Levin. New York is now working on publishing figures for other medical procedures.

There are many factors other than death that matter in judging the performance of a hospital - how well people are treated by staff, how clean the wards are, whether individual departments meet clinical best practice standards, the level of patient satisfaction.

More than 100 indicators of this kind for every district general hospital in the NHS have been collected for the Dr Foster guide - most provided by the hospitals themselves (only five trusts refused to co-operate). Mortality may be the most robust benchmark we currently have, but it cannot be taken in isolation.

Measuring hospitals is a young science that will take time to perfect. But it is clear that publication of any kind of performance data helps to improve quality. The Department of Health already publishes a large amount of information on hospitals - and it helps. Hospitals with long waiting lists make an effort to shorten them. Managers do not like to run hospitals that come bottom of the lists.

This is just the beginning. Imagine a world in which you could tap the name of a consultant, GP or dentist into a computer and find out if he or she was actually any good.

Dealing with deficiencies in the NHS is now an absolute political imperative. It requires a top-to-bottom regime of performance measurement, so that the incompetents (and worse) can be identified and removed. This is the way to bolster trust in a public service that has never been more criticised in its half-century of existence. Margaret Thatcher's internal market, it is now widely agreed, led to more disruption than improvement. There is a better way to identify and promote good practice - through monitoring and publishing information under independent regulators.

It is also a matter of cost. A recent report by the chief medical officer estimated that there are 850,000 "adverse incidents" in the NHS each year, which cost in excess of £2bn. This could be a substantial underestimate. A separate report by the US Institute of Medicine estimated that between 2.9 and 3.7 per cent of hospitalisations involved medical errors, and that between 8.8 and 13.6 per cent of these "adverse incidents" led to death. Brian Jarman has pointed out that if the American projections were applied to hospital admissions in England, it would imply that, annually, there are more than 25,000 deaths from medical accidents.
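The arithmetic behind that projection can be sketched as follows. The admissions figure used here is an assumption for illustration only - the article does not say what number Jarman applied - so the totals are indicative rather than his.

# Back-of-envelope reproduction of the projection above. The admissions
# figure is assumed for illustration; the error and death rates are the
# Institute of Medicine ranges quoted in the text.
admissions = 8_500_000  # assumed annual hospital admissions in England

for error_rate, death_rate in [(0.029, 0.088), (0.037, 0.136)]:
    adverse = admissions * error_rate   # admissions involving a medical error
    deaths = adverse * death_rate       # of those, the errors that prove fatal
    print(f"{error_rate:.1%} errors, {death_rate:.1%} fatal: "
          f"about {deaths:,.0f} deaths a year")

# The resulting range, roughly 22,000 to 43,000 deaths a year, is consistent
# with the figure of more than 25,000 quoted above.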

"More people die each year as a result of medical errors in US hospitals than die from motor vehicle accidents, breast cancer or Aids," said Professor Alan Maynard, the director of the health policy unit at the University of York. "Many of these events can be avoided at low cost. For instance, hospital-acquired infection costs the NHS over £1bn, and much of it could be avoided if doctors and nurses could be persuaded to wash their hands."

Earlier this year, the Nuffield Trust commissioned a report called Dying to Know, which assessed the benefits, or otherwise, of releasing information about quality in healthcare to the general public. It predicted: "We would expect a new movement to emerge. The movement involves public disclosure of information about quality at the level of a named doctor, a named hospital, primary care organisation or a named health authority."

The government has already started to take this on board. The NHS plan, launched by Alan Milburn in June, is a blueprint for the most patient-centred health service in the world, and the most transparent. Hospitals will now have to listen more and record what patients say, improve advisory and complaint services and meet new clinical standards. The government has already established the Commission for Health Improvement, which is responsible for enforcing standards in hospitals; the National Institute for Clinical Excellence, whose objective is to end so-called "postcode prescribing"; and the National Service Framework standards, clinical targets that hospitals must meet by 2002. This month, in the wake of the Harold Shipman case, ministers announced the creation of a National Clinical Assessment Authority to monitor the performance of individual doctors.

Milburn has also initiated a new relationship with the private sector, so that NHS patients can be treated in private hospitals to speed up waiting times. If this is to work, the government will have to monitor private-sector performance. Private hospitals are aware of this and, as a first step, all have provided information about services and emergency back-up for the Dr Foster hospital guide. And insurance companies are already demanding better tools of measurement, particularly on individual consultants, so that they do not accredit poor performers.

This revolution in the health sector, a quite sudden commitment to measurement, has almost passed the consumer by. When performance information is published by the government, it is presented in a way that renders it all but meaningless to the non-medical reader. People don't know how to use it because nobody has ever told them. Who doesn't know how to use a bank or a supermarket? But who knows how to use a hospital and make sure they are getting the best out of it?

This is one of the main functions of the hospital guide - to explain why you are better taking an injured child to a hospital that has 24-hour paediatric cover, or why it is useful to know that a hospital performs more than 500 cardiac catheterisations a year if you have a history of chest pain.

Give people better information and they will use it. They can sit with their GP and explore treatment options that may take them out of their local area. Does this mean that people will start moving house to be near the best-performing hospitals? That didn't happen in America. All that happened was that the less good hospitals got better.

For that reason most of all, individual consumers need to learn to be more demanding of their local hospitals.

Tim Kelsey is chief executive of Dr Foster. Additional research by Peter Apps

"The Dr Foster Guide to Hospitals in England, Scotland, Wales, Northern Ireland and the Republic of Ireland" is published in the Sunday Times on 14 and 21 January and over the internet on 21 January at www.drfoster.co.uk