The Work Programme destroyed a job for every £4,600 it spent

Not a paragon of efficiency.

The government is now trying to spin the Work Programme figures, as expected, by focusing on the initiative's "cost effectiveness". The BBC's Nick Robinson, for instance, writes:

Ministers claim that they are meeting their "off benefit targets" and that they are saving money too. The cost of every job secured under their Work Programme is, they say, just over £2,000 compared with a cost of almost £7,500 under Labour's [Flexible] New Deal because the contractors are only paid 60% of their fee once someone is in a sustainable job: ie for six months.

It's certainly the case that Labour's programmes were more expensive than the coalition's replacements. But what this spin demonstrates is a serious failure to control for background noise. The Work Programme is, so far, worse than nothing at ensuring "job outcomes" – that is, people in unsubsidised work six months after they leave the programme. In the first fourteen months, 3.5 per cent of participants achieved job outcomes, but for people not on the programme, 5 per cent were expected to get jobs, according to Labour's shadow minister Liam Byrne.

(The news shouldn't be hugely surprising – one very effective way to get a job is to spend all day every day applying for jobs. Any training programme has to overcome that hurdle.)

Some quick back-of-the-envelope maths here. The full data is simply not available, but if ministers are saying that the Work Programme cost £2,000 per job, and we know that there have been 32,310 job outcomes, then presumably they are claiming a budget to date of roughly £65m.

Given that 5 per cent background rate, we can work backwards: if 32,310 job outcomes represent 3.5 per cent of participants, there were roughly 920,000 people on the programme. Had the Work Programme never been instituted, 5 per cent of them – around 46,000 – would have been expected to find jobs in the normal process: 14,000 more than the programme actually delivered.

In other words, the Work Programme did not cost £2,000 per job. Instead, for every £4,600 it spent, it destroyed one participant's chance of employment.
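The back-of-envelope figures above can be reproduced in a few lines. This is a sketch: only the 32,310 job outcomes, the claimed £2,000 per job, the 3.5 per cent outcome rate and the 5 per cent background rate come from the article; the budget and participant count are inferred from them, not published data.

```python
# Back-of-the-envelope check of the article's figures.
job_outcomes = 32_310            # published job outcomes
claimed_cost_per_job = 2_000     # ministers' claimed cost, in £
outcome_rate = 0.035             # outcome rate, first fourteen months
background_rate = 0.05           # expected rate off the programme

budget = job_outcomes * claimed_cost_per_job       # ≈ £65m
participants = job_outcomes / outcome_rate         # ≈ 920,000 people
expected_jobs = participants * background_rate     # ≈ 46,000 jobs
jobs_lost = expected_jobs - job_outcomes           # ≈ 14,000 jobs
cost_per_job_lost = budget / jobs_lost             # ≈ £4,600 per job
```

The rounding in the article (£65m divided by 14,000) gives the headline £4,600 figure.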

Updated: The effect of the work programme was on the 14,000 job difference, and so the effect is one job destroyed for every £4,600, not for every £1,400. 3.5 per cent is the result for the first fourteen months, not the first year. Clarified the source of the 5 per cent figure.


Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


A small dose of facts could transform Britain's immigration debate

While "myth-busting" doesn't always work, there is an appetite for a better informed conversation than the one we're having now. 

For some time opinion polls have shown that the public sees immigration as one of the most important issues facing Britain. At the same time, public understanding of the economic and social impacts of immigration is poor and strongly influenced by the media: people consistently over-estimate the proportion of the population born outside the UK and know little about policy measures such as the cap on skilled non-EU migration. The public gets it wrong on other issues too – on teenage pregnancy, the Muslim population of the UK and benefit fraud, to name just three.

However, in the case of immigration, the strength of public opinion has led governments and political parties to reformulate policies and rules. Theresa May said she was cracking down on “health tourists” not because of any evidence they exist but because of public “feeling”. Immigration was of course a key factor in David Cameron’s decision to call a referendum on the UK’s membership of the EU and has been central to his current renegotiations.

Do immigration facts always make us more stubborn and confused?

The question of how both to improve public understanding and to raise the low quality of the immigration debate has been exercising the minds of those with a policy and research interest in the issue. Could the use of facts address misconceptions, improve the abysmally low quality of the debate and bring evidence to policy making? The respected think tank British Future rightly warns of the dangers associated with excessive reliance on statistical and economic evidence: its own research finds that it leaves people hardened and confused. Where does that leave those of us who believe in informed debate and evidence-based policy? Can a more limited use of facts help improve understanding and raise the quality of the debate?

My colleagues Jonathan Portes and Nathan Hudson-Sharp and I set out to look at whether attitudes towards immigration can be influenced by evidence, presented in a simple and straightforward way. We scripted a short video animation in a cartoon format conveying some statistics and simple messages taken from research findings on the economic and social impacts of immigration.

Targeted at a wide audience, we framed the video within a ‘cost-benefit’ narrative, showing the economic benefits through migrants’ skills and taxes and the (limited) impact on services. A pilot was shown to focus groups attended separately by the general public, school pupils studying ‘A’ level economics and employers.

Some statistics are useful

To some extent our findings confirm that the public is not very interested in big statistics, such as the number of migrants in the UK. But our respondents did find some statistics useful. These included rates of benefit claims among migrants, effects on wages, effects on jobs and the economic contribution of migrants through taxes. They also wanted more information from which to answer their own questions about immigration. These related to a number of current narratives around selective migration versus free movement, ‘welfare tourism’ and the idea that our services are under strain.

Our research suggests that statistics can play a useful role in the immigration debate when linked closely to specific issues that are of direct concern to the public. There is a role for careful and accurate explanation of the evidence, and indeed there is considerable demand for this among people who are interested in immigration but do not have strong preconceptions. At the same time, there was a clear message from the focus groups that statistics should be kept simple. Participants also wanted to be sure that the statistics they were given were from credible and unbiased sources.

The public is ready for a more sophisticated public debate on immigration

The appetite for facts and interest in having an informed debate was clear, but can views be changed through fact-based evidence? We found that when situated within a facts-based discussion, our participants questioned some common misconceptions about the impact of immigration on jobs, pay and services. Participants saw the ‘costs and benefits’ narrative of the video as meaningful, responding particularly to the message that immigrants contribute towards their costs through paying taxes. They also talked of a range of other economic, social and cultural contributions. But they also felt that those impacts were not the full story. They were also concerned about the perceived impact of immigration on communities, where issues become too complex, subjective and intangible for statistics to be used in a meaningful way.

Opinion poll findings are often taken as proof that the public cannot have a sensible discussion on immigration and the debate is frequently described as ‘toxic’. But our research suggests that behind headline figures showing concern for its scale there may be both a more nuanced set of views and a real appetite for informed discussion. A small dose of statistics might just help to detoxify the debate. With immigration a deciding factor in how people cast their vote in the forthcoming referendum there can be no better time to try.