The horsemeat scandal, alone amongst food scares, is not about health. Why the hell is it so huge?

Martha Gill's Irrational Animals column.

Over the weekend I caught up with an American cousin. His questions started out friendly enough, but when I confessed that I was “still a journalist”, they took a turn for the patronising. “I’ve been in England two weeks now and every time I switch on the news it’s just horse meat, horse meat, horse meat,” he said. “Does nothing happen in this country?”

Nothing does, but even I can see that our media’s horsemeat content is nigh on indigestible at the moment. The industry reaction has been huge too. Tesco has dropped €360m in market value. European leaders have called emergency meetings in Brussels. Now, large-scale (in other words, extremely expensive) DNA testing is being talked about.

What makes the scale of this food scare particularly odd is that it isn’t even a health scare. All our previous food scares have been health scares: BSE, salmonella, listeria. This one is mostly about surprise. Looks like beef, tastes like beef, sold as beef, actually horse(!) (Let’s ignore the murmurings about bute, by the way, the horse analgesic that “may” have entered the food chain. Even if treated horses had ended up in some burgers, the estimated dose would be too low to have any effect, and as the drug is also used therapeutically in human beings, the effects would be fairly innocuous.)

No: the scale of the reaction here, I’d argue, is all about BSE – another food scandal involving dangerous cost-cutting, regulatory failures and beef – but that time with fatal consequences. An important difference, you might think – yet almost every comment piece on the recent scandal has linked the two. Google “BSE horsemeat”, for example, and you get 182,000 results. BSE is Horsegate’s nearest relation, and the scale of that crisis is dictating this one.

Our mistake here is an example of “anchoring”, taking an early piece of information and leaning on it too heavily as a reference point. We’re all vulnerable to the error. Here's how it works: ask someone the following two questions:

1) Was Gandhi more or less than 144 years old when he died?

2) How old was Gandhi when he died?

...and now ask someone else these questions:

1) Was Gandhi more or less than nine years old when he died?

2) How old was Gandhi when he died?

Absurd as the two openers are, they will still affect the answers you get. In group testing, the first set of questions had Gandhi dying at an average age of 67 and the second at an average age of 50.

The trick is simple and effective, but it can also be dangerous. When reacting to a developing crisis we are particularly susceptible: with limited information available, we cling all the harder to what we have.

In July 2011, when a bomb went off in Oslo, the world’s media instantly assumed that it was the work of jihadist terror, before the real identity of the perpetrator – the far-right extremist Anders Behring Breivik – was revealed. Nearly a year later, when a killer went on the rampage in France, the same media outlets pointed the finger at the far right until they discovered the murderer was an Islamist named Mohamed Merah.

We can’t avoid such mistakes entirely, but we can deal with them if we know they can happen. As panic starts to die down, we must reassess our evidence and start to piece together the real story.


Martha Gill writes the weekly Irrational Animals column. You can follow her on Twitter here: @Martha_Gill.

This article first appeared in the 25 February 2013 issue of the New Statesman, The cheap food delusion


A small dose of facts could transform Britain's immigration debate

While "myth-busting" doesn't always work, there is an appetite for a better informed conversation than the one we're having now. 

For some time opinion polls have shown that the public sees immigration as one of the most important issues facing Britain. At the same time, public understanding of the economic and social impacts of immigration is poor and strongly influenced by the media: people consistently overestimate the proportion of the population born outside the UK and know little about policy measures such as the cap on skilled non-EU migration. The public gets it wrong on other issues too – teenage pregnancy, the Muslim population of the UK and benefit fraud, to name just three. However, in the case of immigration, the strength of public opinion has led governments and political parties to reformulate policies and rules. Theresa May said she was cracking down on “health tourists” not because of any evidence that they exist but because of public “feeling”. Immigration was of course a key factor in David Cameron’s decision to call a referendum on the UK’s membership of the EU and has been central to his current renegotiations.

Do immigration facts always make us more stubborn and confused?

The question of how to both improve public understanding and raise the quality of the immigration debate has been exercising the minds of those with a policy and research interest in the issue. Could the use of facts address misconceptions, lift the abysmally low quality of the debate and bring evidence into policy-making? The respected think tank British Future rightly warns of the dangers of excessive reliance on statistical and economic evidence: its own research finds that it leaves people hardened and confused. Where does that leave those of us who believe in informed debate and evidence-based policy? Can a more limited use of facts help improve understanding and raise the quality of the debate?

My colleagues Jonathan Portes and Nathan Hudson-Sharp and I set out to look at whether attitudes towards immigration can be influenced by evidence presented in a simple and straightforward way. We scripted a short animated video in a cartoon format, conveying some statistics and simple messages taken from research findings on the economic and social impacts of immigration.

Targeted at a wide audience, the video was framed within a ‘cost-benefit’ narrative, showing the economic benefits delivered through migrants’ skills and taxes and the (limited) impact on services. A pilot was shown to focus groups attended separately by members of the general public, school pupils studying A-level economics, and employers.

Some statistics are useful

To some extent our findings confirm that the public is not very interested in big statistics, such as the number of migrants in the UK. But our respondents did find some statistics useful: rates of benefit claims among migrants, effects on wages and jobs, and the economic contribution migrants make through taxes. They also wanted more information to help them answer their own questions about immigration – questions related to current narratives around selective migration versus free movement, ‘welfare tourism’ and the idea that our services are under strain.

Our research suggests that statistics can play a useful role in the immigration debate when linked closely to specific issues that are of direct concern to the public. There is a role for careful and accurate explanation of the evidence, and indeed there is considerable demand for this among people who are interested in immigration but do not have strong preconceptions. At the same time, there was a clear message from the focus groups that statistics should be kept simple. Participants also wanted to be sure that the statistics they were given were from credible and unbiased sources.

The public is ready for a more sophisticated public debate on immigration

The appetite for facts and the interest in having an informed debate were clear, but can views be changed by fact-based evidence? We found that, when the discussion was grounded in facts, our participants questioned some common misconceptions about the impact of immigration on jobs, pay and services. Participants saw the video’s ‘costs and benefits’ narrative as meaningful, responding particularly to the message that immigrants contribute towards their costs by paying taxes. They also talked of a range of other economic, social and cultural contributions. But they felt that these impacts were not the full story: they were also concerned about the perceived impact of immigration on communities, where the issues become too complex, subjective and intangible for statistics to be used in a meaningful way.

Opinion poll findings are often taken as proof that the public cannot have a sensible discussion on immigration, and the debate is frequently described as ‘toxic’. But our research suggests that behind headline figures showing concern about its scale there may be both a more nuanced set of views and a real appetite for informed discussion. A small dose of statistics might just help to detoxify the debate. With immigration a deciding factor in how people will cast their vote in the forthcoming referendum, there can be no better time to try.