Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued that global warming has stopped.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which exhibits a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others, and some are cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool because of the eruption of Mount Pinatubo, a large volcano in the Philippines.

‘Climate’ is defined by averaging out all this variability over the longer term. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.
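To see the distinction in miniature, here is a minimal sketch in Python using entirely synthetic numbers – an assumed warming trend of 0.02 degrees per year plus random interannual noise, not real temperature records – showing how individual years routinely come in cooler than their predecessors even while the decadal averages climb:

```python
# A minimal illustration with synthetic data (NOT real temperature records):
# an assumed steady trend of 0.02 C/year hidden inside year-to-year noise.
import random

random.seed(42)
years = list(range(1980, 2008))
anomalies = [0.02 * (y - 1980) + random.gauss(0, 0.15) for y in years]

# Count year-on-year drops: even with a steady underlying trend,
# many years come in cooler than the year before.
drops = sum(1 for a, b in zip(anomalies, anomalies[1:]) if b < a)
print(f"{drops} of {len(anomalies) - 1} years were cooler than the year before")

# ...yet the decadal averages, the 'climate', climb steadily.
for start in (1980, 1990, 2000):
    decade = [t for y, t in zip(years, anomalies) if start <= y < start + 10]
    print(f"{start}s mean anomaly: {sum(decade) / len(decade):+.2f} C")
```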

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These differ slightly due to varying assumptions and methodology, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These, helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article, show 8-year trend lines – what the temperature trend is for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
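The 8-year trend-line exercise can be reproduced in the same spirit. A sketch, again on the synthetic series rather than the real dataset (regenerated here so the snippet stands alone): fit an ordinary least-squares slope to every 8-year window, and see how many ‘stops’ can be manufactured simply by choosing a convenient start year:

```python
# Fit a least-squares slope to every 8-year window of a synthetic series
# (illustrative numbers only) to show how start-year choice creates 'pauses'.
import random

random.seed(42)
years = list(range(1980, 2008))
anomalies = [0.02 * (y - 1980) + random.gauss(0, 0.15) for y in years]

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

WINDOW = 8
for i in range(len(years) - WINDOW + 1):
    w_years, w_temps = years[i:i + WINDOW], anomalies[i:i + WINDOW]
    slope = ols_slope(w_years, w_temps)
    flag = "  <- warming 'stopped'?" if slope < 0 else ""
    print(f"{w_years[0]}-{w_years[-1]}: {slope:+.3f} C/yr{flag}")
```

With noise of this size, a fair share of the windows will typically slope downwards despite the built-in warming – exactly the cherry-picker’s opportunity described above.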

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse, and his fellow contrarians, are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both a reality, and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one which is repeated in the Whitehouse article – that because we don’t know everything, therefore we know nothing, and therefore we should do nothing. Using that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift, and as a close follower of the scientific debate on this subject I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of man-made global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and distorted misrepresentations in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.

Mark Lynas is an environmental activist and climate change specialist. His books on the subject include High Tide: News from a Warming World and Six Degrees: Our Future on a Hotter Planet.

Breaking the consensus

Even IMF researchers are calling time on free market dogma and the neoliberal orthodoxies of the past 30 years.

What has come over the International Monetary Fund? Not content with playing the good cop to Europe’s bad in the ongoing Greek crisis – in which it has been arguing for debt relief and less austerity – the fund has just published an article in its in-house magazine by three of its leading researchers entitled “Neoliberalism: Oversold?”. Their answer is “yes”.

The article takes aim at two of the most important aspects of the neoliberal economic agenda that has been so influential since the early 1980s. The first is the removal of restrictions on the movement of capital across international borders – so-called capital account liberalisation. Readers of a certain age will recall that 40 years ago there were strict limits on the amount of foreign currency one could buy before going abroad on holiday and companies had to show evidence of the need to import supplies to gain access to the foreign exchange market. Such restrictions were even harsher for international investment – making it almost impossible for institutions in one country to invest in the equity and bond markets of another.

Neoliberal theorists decried this situation as absurd. Rich countries have abundant capital, so the rate of return on it is relatively low, they argued. Poor ones are capital-scarce, so the returns on investment are high. Erecting artificial barriers preventing capital from flowing from rich countries to poor ones was therefore like stopping water from flowing downhill: an unhelpful intervention in the natural order of things, with detrimental consequences for all. During the 1980s and 1990s, international capital controls were thus dismantled worldwide – and often as a precondition for IMF assistance. The scale of private cross-border capital flows rocketed and soon eclipsed those of public-sector lenders, such as the World Bank and the IMF itself. 

But while these private capital flows were large, it quickly became obvious that they could also be extremely erratic. Throughout the 1990s, a succession of big developing countries enjoyed huge inflows of money to finance government spending and infrastructure development. But in each case, the new sources of funding turned out to be fickle, as private investors proved far less tolerant of heterodox economic policy than official funders had been. The result was a succession of crises – in Mexico in 1994, in east Asia in 1997, in Russia in 1998, in Argentina in 2001 – as the newly discovered rivers of capital suddenly began flowing the other way.

The IMF became well known at the time for insisting that these occasional stunning crashes should not derail liberalisation: they were just the price of reforms not fully complete. The new IMF article, in the June edition of Finance & Development magazine, disagrees. After nearly 30 years, it argues, the growing pains have not stopped. Open capital accounts have indeed increased developing countries’ access to capital for development but, strikingly, there is little evidence that this has raised growth rates. And there is no question that it has exaggerated the boom-bust business cycle, increased inequality and raised the odds of periodic financial crises.

Couched as it is in the equivocal language of cost-benefit analysis, this change of tune might sound inconsequential. It is not. Twenty years ago, Malaysia’s prime minister, Mahathir Mohamad, was branded an international pariah for reimposing capital controls to insulate his country from the east Asian financial crisis. The new IMF article concludes that such measures are “a viable, and sometimes the only, option”.

The second plank of the neoliberal agenda at which the IMF article takes aim will be even more familiar to UK readers: curbing the size of the state. In the 1980s and 1990s, the main emphasis on this front was on privatisation. As that agenda began to run its course, the emphasis shifted to constraining governments’ ability to run excessive deficits of spending over revenues – and to rules preventing the accumulation of too much public debt. The Maastricht rules introduced by the eurozone countries in 1993, which mandated annual deficits of no more than 3 per cent of GDP and public debt of no more than 60 per cent, were perhaps the most prominent example.
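Those two Maastricht numbers are not independent, incidentally: with a constant deficit of d per cent of GDP and nominal growth of g per cent, the debt ratio is diluted by growth each year and topped up by the deficit, and it converges to roughly d over g – and 3 per cent against 5 per cent nominal growth gives 60 per cent. A minimal sketch of that arithmetic (the 5 per cent growth figure is an assumption of this illustration, not a number from the treaty):

```python
# Debt-ratio dynamics, illustrative only: each year debt/GDP is diluted by
# nominal growth g and topped up by the deficit ratio d, so
#     b_next = b / (1 + g) + d
# which converges to d * (1 + g) / g no matter where it starts.
d = 0.03   # deficit: the Maastricht 3% ceiling
g = 0.05   # nominal GDP growth: an assumed figure for this sketch

b = 1.00   # start from 100% debt/GDP and watch growth erode it
for year in range(51):
    if year % 10 == 0:
        print(f"year {year:2d}: debt/GDP = {b:.1%}")
    b = b / (1 + g) + d

print(f"long-run limit: {d * (1 + g) / g:.1%}")  # about 63%, near the 60% ceiling
```

The declining path from 100 per cent in this sketch is also the arithmetic behind letting debt ratios “decline organically through growth”, a point the IMF article returns to below.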

For most of the 2000s, such self-denying ordinances seemed to be costless virtues. Then, in 2007, the global economic crisis hit. After a brief flirtation with increased state spending when confronted with the steep recessions of 2008-09, the governments of the eurozone and the UK were converted again to the crucial importance of shrinking public debt and cutting spending. The notion that cutting spending can boost growth, or is even necessary for it – “expansionary fiscal contraction” – came roaring back into fashion.

***

The IMF broached its dissent early in the post-crisis period, with its economists expressing scepticism over the pace and timing of austerity in Europe. Christine Lagarde, the fund’s managing director, and Olivier Blanchard, its chief economist, argued for relaxing spending constraints and turning a blind eye to debt burdens until depressed economies were solidly recovering. 

Gossip-mongers at the World Economic Forum in Davos put it down to the fact that they are both French, and therefore constitutional backsliders on matters of fiscal prudence; policymakers preferred to pick up on pseudo-scientific economic sound bites such as the idea of a public debt tipping-point at 90 per cent of GDP. In reality, however, the IMF was merely stating the clear conclusions of conventional economic models – models that appear to have been proved largely correct by the vast difference since 2009 between the recovery of the US, which did not opt for austerity, and that of Europe, which did.

The new IMF article drives home the point. The “short-run costs of lower output and welfare and higher unemployment”, it concludes, “have been underplayed, and the desirability . . . of simply living with high debt and allowing debt ratios to decline organically through growth is underappreciated”. Austerity is often self-defeating and debt limits by themselves are meaningless.

Is this two-part mea culpa on both capital flows and the size of the state a major landmark in the evolution of the IMF’s thinking – and could this be important in practice, given the intellectual heft that the Washington institutions bring to the international policy debate? It is, and it could.

Will it rehabilitate the IMF as an institution among the populations of the countries it is meant to serve? Here I am more sceptical. There is no question that there was disagreement on policy in east Asia in 1997, for example. But the real problem with the IMF’s intervention had to do not with the correctness of its prescriptions but their legitimacy. The single most enduring image of that painful period was the photo of the then managing director of the IMF, Michel Camdessus, arms folded and frowning like a schoolmaster giving detention, watching over President Suharto of Indonesia as, humiliatingly, Suharto bowed to the inevitable and signed up to the fund’s financing plan.

In many developing countries, memories of unjust colonial domination are raw and if the IMF is to help resolve the growing dissatisfaction of populations with policymaking elites, it will need to do more than just make improvements to its advice – no matter how sincere and welcome such improvements may be. The reality that, in effect, power over its assistance belongs exclusively to a handful of rich economies will have to change. Reforming its governance to give developing countries more control is the place to start.

In the UK, meanwhile, we can have no such complaints. We have no one to blame for taking neoliberalism’s crazier ideas too seriously but ourselves.

Felix Martin is the author of “Money: the Unauthorised Biography” (Vintage)


This article first appeared in the 02 June 2016 issue of the New Statesman, How men got left behind