It's wrong to publish league tables

New scores tell parents nothing about schools

This month saw the publication of the annual league tables for secondary schools in England, showing GCSE and A-level results. All the posh papers publish them, usually accompanied by a dramatic news headline (this year, we were informed, 500,000 pupils were attending "failing schools" that had fallen short of a new government target) and by features on "top-performing" or "most improved" schools. But the main point, to borrow the Times headline, is to allow parents to "find the best school in your area".

Should these tables be published at all? This may seem a peculiar question; we hacks usually try to reveal information, not suppress it. Schools are financed from taxation and, if the government kept details of their performance secret (or didn't collect them at all), newspapers would write indignant editorials. Yet league tables are not published centrally in Scotland, Wales and Northern Ireland, nor in most other European countries.

The main argument against publication is that the tables don't show the "best" schools, even if we accept exam results as a measure of quality. Schools that recruit the brightest and most advantaged pupils - which means fee-charging and grammar schools particularly - are bound to do best. The government has tried to address this problem by publishing, beside the "raw" exam results, "contextual value-added" scores (CVAs). These take account not only of children's prior attainment when they enter secondary school at 11, but also of background influences, such as eligibility for free meals or lack of spoken English at home. The final score tells us whether a school is doing better or worse than expected, given the pupil intake.

The CVAs throw up some pleasing outcomes. Fully comprehensive London boroughs such as Lewisham, Lambeth and Islington get considerably higher scores than authorities, such as Buckinghamshire, which still have grammar schools.

As a tool for parents trying to identify the best local schools, however, the CVAs are close to useless. Even if the background variables are collected and recorded accurately and even if they include the full range of external influences - both of which I doubt - the numbers of pupils in each school are mostly too small to produce statistically reliable comparisons.

As with all statistics, there is a range of doubt, expressed as a "confidence interval": the same pupils, taking their exams on a different day, might have performed differently. If an average-sized school has a score of 1,000 (denoting that, given its intake, it performs no better and no worse than it should), all that tells you is that the "true" score could be anywhere between 992 and 1,008. In Islington, for example, the highest CVA is Highbury Fields's 1,021.6. But the "true" score could be as low as 1,011.0, and the "true" scores for six of the other nine schools in Islington could be higher than that.
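The logic here can be sketched in a few lines of Python. This is a minimal illustration, not the government's actual method: the half-width of 10.6 points is inferred from the article's Highbury Fields figures, and the rival school's score is hypothetical.

```python
def interval(score, half_width):
    """Return the (lower, upper) confidence interval around a CVA score."""
    return (score - half_width, score + half_width)

def distinguishable(a, b):
    """Two schools can be ranked with confidence only if their
    confidence intervals do not overlap at all."""
    return a[1] < b[0] or b[1] < a[0]

# Figures from the article: Highbury Fields scores 1,021.6, with a "true"
# score possibly as low as 1,011.0 - implying a half-width of about 10.6.
highbury = interval(1021.6, 10.6)

# A hypothetical rival school scoring 1,012.0 with the same half-width.
rival = interval(1012.0, 10.6)

print(distinguishable(highbury, rival))  # False: the intervals overlap
```

Because the two intervals overlap, the apparent 9.6-point gap between the schools tells a parent nothing reliable about which is better.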

A Bristol University study published this month - Accurate Performance Measure But Meaningless Ranking Exercise? by Deborah Wilson and Anete Piebalga (Centre for Market and Public Organisation) - finds that, using CVAs, only 52 per cent of secondary schools in England can be judged significantly better or worse than the national average. Among primary schools - for which CVAs are also published, on the basis of tests taken at 11 - the proportion will be much smaller, as their average cohort size is about a quarter that of the average secondary school. Moreover, they note, CVAs tell parents nothing about schools' effectiveness with children of different abilities.

All this explodes the case for publishing league tables. By inventing CVAs - which I doubt one parent in a hundred studies, never mind interprets correctly - the government tacitly admits the "raw" results are misleading. But the CVAs cannot adequately discriminate between most of the country's schools, and they exclude fee-charging schools, which do not collect the necessary background data on their pupils.

Among the many other arguments against publication are the perverse incentives the tables create. Many schools discourage pupils from taking "hard" subjects, such as foreign languages and sciences, because they fear depressing the proportion achieving GCSE grades A-C, which is what appears in the league tables. Others concentrate excessively on "borderline" pupils, who might just scrape a C grade, at the expense of those striving for A grades and those who might manage a D but not a C.

I agree information of this sort should not be kept secret. But it need not be published centrally. Each school should hold standardised information, more detailed than is now published, and give it to parents of prospective pupils on request. Ministers object that journalists will collect all the results and publish less reliable league tables. So be it. All school league tables mislead, and official publication gives them an undeserved authority.

Peter Wilby was editor of the Independent on Sunday from 1995 to 1996 and of the New Statesman from 1998 to 2005. He writes the weekly First Thoughts column for the NS.

This article first appeared in the 21 January 2008 issue of the New Statesman, Art is the new activism