Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued global warming has stopped.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which expresses a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others, and some cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool, because of the eruption of a large volcano in the Philippines called Mount Pinatubo.

‘Climate’ is defined by averaging out all this variability over the longer term. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.
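
To make the distinction concrete, here is a minimal sketch (in Python, using synthetic numbers invented purely for illustration, not the real temperature record) of how a steady warming trend can coexist with plenty of years that are cooler than the one before, and how averaging over a decade exposes the trend that year-on-year comparisons hide:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical anomalies: a steady trend of +0.02C per year,
# plus interannual 'weather' noise of comparable size.
years = np.arange(1980, 2008)
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, size=years.size)

# Year-on-year comparisons are dominated by the noise: many years
# come out cooler than their predecessor even though the underlying
# climate is warming throughout.
cooler = np.sum(np.diff(temps) < 0)
print(f"{cooler} of {years.size - 1} years were cooler than the year before")

# Averaging over a decade suppresses the noise and reveals the trend.
for start in (1980, 1990):
    decade = (years >= start) & (years < start + 10)
    print(f"{start}s mean anomaly: {temps[decade].mean():+.2f}C")
```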

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These differ slightly owing to varying assumptions and methodologies, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These, helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article, show 8-year trend lines – what the temperature trend is for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
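
The start-year sensitivity is easy to demonstrate numerically. Below is a hedged sketch along the same lines as the RealClimate exercise, again using an invented series rather than the actual GISTEMP data: it fits a least-squares line to every 8-year window and shows that windows beginning on a warm spike can slope downwards even though the full-period trend is firmly upward.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic anomalies: +0.02C/yr trend plus noise, with an artificial
# warm spike in 1998 standing in for the El Nino year.
years = np.arange(1983, 2008)
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.08, size=years.size)
temps[years == 1998] += 0.2  # hypothetical El Nino boost

# Slope of the least-squares fit over every 8-year window: the sign
# depends heavily on which year you start with.
for start in range(int(years[0]), int(years[-1]) - 6):
    window = (years >= start) & (years < start + 8)
    slope = np.polyfit(years[window], temps[window], 1)[0]
    print(f"{start}-{start + 7}: {slope:+.3f}C/yr")

# The trend over the whole period is unambiguous.
print(f"Full period: {np.polyfit(years, temps, 1)[0]:+.3f}C/yr")
```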

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse and his fellow contrarians are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both a reality and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one which is repeated in the Whitehouse article – that because we don’t know everything, therefore we know nothing, and therefore we should do nothing. Using that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift, and as a close follower of the scientific debate on this subject I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of man-made global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and distorted misrepresentations in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.
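
For readers who want the logic spelt out, this is a rough expected-cost comparison. A minimal sketch follows, in which the costs are made-up placeholders purely for illustration (the article attaches no numbers beyond the 99%/1% split); the point is only that acting wins under any remotely similar assumptions.

```python
# Hypothetical payoff matrix: cost units are illustrative placeholders,
# not estimates drawn from any study.
p_experts_right = 0.99

cost = {
    ("act", "experts right"): 1,      # transition costs, catastrophe averted
    ("act", "experts wrong"): 1,      # transition costs incurred needlessly
    ("ignore", "experts right"): 50,  # catastrophic climate damages
    ("ignore", "experts wrong"): 0,   # no cost at all
}

for choice in ("act", "ignore"):
    expected = (p_experts_right * cost[(choice, "experts right")]
                + (1 - p_experts_right) * cost[(choice, "experts wrong")])
    print(f"{choice}: expected cost {expected:.2f}")
# act: 1.00, ignore: 49.50 - hence the 'no-brainer'.
```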

Mark Lynas is an environmental activist and a climate change specialist. His books on the subject include High Tide: News from a Warming World and Six Degrees: Our Future on a Hotter Planet.

The return of big history: the long past is the antidote to short-termism

Historians Jo Guldi and David Armitage have created a powerful, ambitious rebuttal to "the spectre of the short term".

Photo Op (2006) by Peter Kennard and Cat Phillipps

“There never has been a time when . . . except in the most general sense, a study of history provides so little instruction for our present day,” Prime Minister Tony Blair declared in a speech to the US Congress in July 2003. Nowadays Blair is not exactly deemed a voice of authority, but the opinions he expressed are still widely shared. In an era when technology has revolutionised our daily existence – even the nature of life itself – understanding the past may seem irrelevant when planning the future. But history does matter. And many academics are anxious to explain why.

A striking contribution comes from the historians Jo Guldi and David Armitage. At a mere 165 pages, their book The History Manifesto is modest in scale but not in ambition: its first sentence mimics the opening of the Communist Manifesto: “A spectre is haunting our time: the spectre of the short term.” Guldi, who teaches at Brown, and Armitage, a British-born professor at Harvard, point to politicians trapped in the electoral cycle, business leaders fixated on profit returns and bureaucrats obsessed by performance targets. Academics, one might add, have also been sucked into the vortex, with the rigid six-year cycle of the Research Excellence Framework deterring big historical projects that take time to mature.

Yet Guldi and Armitage insist that historical writing can provide the answer to short-termism, if properly conceived and delivered. In the last quarter of the 20th century, they argue, most historians produced scholarly monographs or doctoral dissertations about narrow periods and specific topics, or they indulged in microhistories of “exceptionally normal” episodes from everyday life, such as Robert Darnton’s investigation of a bizarre cat massacre in 18th-century Paris. There seemed little appetite to explore the longue durée, a term popularised in the 1950s by Fernand Braudel and other scholars associated with the French journal Annales.

This obsession with the miniature reflected the increasing professionalisation of historical writing. In contrast to earlier centuries, when the historian’s craft had been the preserve of amateurs such as Gibbon and Macaulay, the 20th century was the era when history professionals emerged – men and women who earned their living from teaching and writing history as employees of universities. Like other professionals, they sought advancement by becoming unquestioned masters of a small terrain, fenced off by their command of specialist archives. The explosion since the 1970s of new subdisciplines – including social history, women’s history and cultural history – encouraged further balkanisation of the subject. Academic historians seemed to be saying more and more about less and less.

In consequence, Guldi and Armitage lament, the big debates of our day lack the benefit of historical perspective. They spotlight a trio of vital contemporary questions – climate change, international governance and socio-economic inequality – that have been addressed mostly by economists and other social scientists, often using data and assumptions that are rooted in the short term. Yet these subjects cry out for a longue durée approach. And Guldi and Armitage show how historians have started to respond over the past decade, exploiting the mass of information that can now be marshalled thanks to the digitisation of archives and other databases, combined with the ubiquity of keyword searching. In the age of IT, research into social problems on a scale previously beyond the grasp of a large research group is feasible for a lone but digitally smart scholar. And so, The History Manifesto proclaims, big history is once again possible, thanks to big data.
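
To give a flavour of the kind of tool Guldi and Armitage have in mind, here is a minimal, hypothetical sketch of keyword searching across a digitised archive: the corpus and keyword below are invented stand-ins, where a real project would query millions of scanned documents.

```python
import re
from collections import Counter

# Hypothetical corpus: (year, text) pairs standing in for a digitised
# archive; a real longue-duree project would load millions of documents.
corpus = [
    (1851, "the inequality of wealth troubles the parish"),
    (1923, "wages, rents and the question of inequality"),
    (1998, "globalisation has sharpened inequality everywhere"),
    (2011, "inequality now dominates the economic debate"),
]

keyword = "inequality"
hits_per_decade = Counter()

# Count occurrences of the keyword, bucketed by decade, to trace how
# prominent a theme is across the long term.
for year, text in corpus:
    decade = (year // 10) * 10
    hits_per_decade[decade] += len(re.findall(rf"\b{keyword}\b", text.lower()))

for decade in sorted(hits_per_decade):
    print(f"{decade}s: {hits_per_decade[decade]} mention(s)")
```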

Guldi and Armitage write with brio and passion and their ambition should be applauded. Yet their supposedly universal panacea is in many ways very American. The Manifesto offers a reworking for historians of a tradition of “big” thinking that has characterised American intellectual life since the Second World War. “Big science” led the way (in projects such as the Bomb, mainframe computers and the transistor), followed by big social science (through bodies such as the Ford and Rockefeller Foundations and the RAND Corporation) – all closely harnessed to the needs of the federal government. Big history, now much in fashion in leading US history departments such as Harvard’s, is another facet of that academic-governmental nexus: the cover of the Manifesto proclaims a desire to “speak truth to power”.

And yet, like many programmatic writings, The History Manifesto seems strangely indifferent to practicalities. It does not make clear how these big historical projects would grab the attention of people in power. Simply addressing topical issues such as climate change is not enough: as Guldi and Armitage acknowledge, politicians are creatures of the short term who prefer to ignore big problems that cannot be solved, or at least visibly ameliorated, within an electoral cycle. They are also busy people who do not have time for lengthy reading and reflection. All this shows that big historical truths must be served up in politically digestible, bite-sized chunks.

A more user-centred approach is exemplified by the work of Richard Neustadt and Ernest May – Harvard academics, now sadly deceased – who for many years taught a course on the uses of history to American politicians, officials and senior military. The book that grew out of it, Thinking in Time, was published way back in 1986, and The History Manifesto makes no reference to it. Yet Neustadt and May offer an instructive alternative response to the curse of short-termism in high places.

Their main injunction derives from Avram Goldberg, the chief executive of a New England grocery chain. Whenever a manager came to him in a flap, he wouldn’t ask, “What’s the problem?” but say, “Tell me the story.” That way, Goldberg said, “I find out what the problem really is.” His maxim became the premise of the book by Neustadt and May. Rather than focus on the crisis at hand (while already straining for a quick-fix solution), one should stand back and ask, “How did we get into this mess?” That is the first step to seeing a way out.

Telling the story requires identifying critical events and turning points, asking what happened when. This basic chronology then has to be fleshed out by addressing “who” and “why” questions about personalities and motivations: what Neustadt and May call “journalists’ questions”. Digging out this kind of human detail is as much a historical activity as constructing a chronology. It requires probing into the past of a person or a country, just the sort of thing that Blair, Bush and their aides did not do properly before the invasion of Iraq.

Asking “What’s the story?” may seem a strange way to define the practice of history. Our normal definition is content-based – the names-and-dates regime that destroyed any feel for the subject among millions of schoolchildren and that still features in the UK citizenship test. Nor does “What’s the story?” chime with the idea that history provides a stock of useful analogies, such as the “lessons of appeasement” that have seduced many political leaders, from Anthony Eden in 1956 to Blair and Bush in 2003. Instead of history as a body of facts or a toolkit of lessons, Neustadt and May presented it as a way of thinking: thinking in the stream of time.

Actually, that is not such an alien idea: it’s what we do every evening, constructing a narrative of what has happened during the day by highlighting some events and downplaying others within an arc of what seems, with hindsight, to be significant. Thinking in Time essentially urged policymakers to apply the same narrative mode of thinking more systematically when making decisions that relate to government.

Neustadt and May’s prescriptions still seem to me apt and perceptive. They are rooted in the recognition that human beings fundamentally are historical animals and they provide simple, practical advice about how people in power can be their own historians. But the Achilles heel of Thinking in Time in 1986 was how would-be practitioners could speedily obtain the essential historical information to put flesh on the bare bones of their narrative timelines. Neustadt and May suggested a range of useful books, articles and bibliographies, but it seemed implausible that most busy policymakers, or even their aides, would have time to do the necessary research.

Nearly 30 years on, however, the IT-age tools that Guldi and Armitage identify can also help the policymaker who wants to become historically literate. There is now a profusion of information out there, available at a few clicks of a mouse. The new problem is quality control: identifying the information that is reliable and that rises above mere WikiHistory.

One answer comes from History & Policy, a web-based think tank run jointly from Cambridge and King’s College London. It posts short papers of 2,500 to 3,000 words, each offering a historically informed view on issues of current concern. To date, nearly 200 papers have appeared, covering a wide range of issues; recent topics include power-sharing in Northern Ireland, the London airport debate, treatment of the mentally ill and the Ukraine crisis. The organisation also runs specialist seminars targeted at specific interests, with the aim of providing the busy politician, civil servant or business person with a broader perspective but in succinct, manageable form. Although each paper suggests further reading, it is assumed that most users won’t have the time for a long academic tutorial. The aim here is not big history but applied history, useful at the point of decision-making.

For some traditionalist scholars, this search for relevance threatens a core value of professional history – the recognition of the past as a foreign country. But, as John Tosh has insisted in his book Why History Matters (2008), what we need is “a critical applied history”, one that is attentive to both continuity and difference. Neustadt and May developed the same point: “the future has nowhere to come from but the past”, yet “what matters for the future in the present is departures from the past” – hence the predictive capacity and also the potential pitfalls of historical analysis. Those departures may be slight and subtle but recognising them is essential when trying to anticipate the future.

Public awareness of the interconnection of past, present and future has been particularly keen at moments of dramatic rupture or transition. The end of the Second World War, with the total collapse of Hitler’s European empire and the horrific exposure of his “Final Solution”, constituted one such moment; another was the end of the cold war in 1989-91, when the “Iron Curtain” disintegrated and the Soviet Union fell apart. Such evidently “historic” moments have kindled an interest in “contemporary history”, or Zeitgeschichte, as the Germans call it. In this area, too, historical awareness has relevance for political debate, by helping us to locate our contemporary problems in the longer sweep of events.

Definitions of the appropriate time span for “contemporary history” lack precision: surveying various writers, Kristina Spohr of the London School of Economics suggests that the term has generally been employed to signify the history of “one’s own time”. She quotes Geoffrey Barraclough, an exponent in the 1960s: “Contemporary history begins when the problems which are actual in the world today first take visible shape.” When exactly that was will vary from case to case and is a matter of judgement for individual historians, requiring them to construct narratives on the Neustadt-May model but over the longue durée.

To Eric Hobsbawm, a lifelong Marxist, his own time was naturally defined by the rise and fall of the Soviet state and he framed his Age of Extremes around the dates 1914 and 1991. Hobsbawm’s book has become a classic, but in the 20 years since it first appeared our sense of the “contemporary” has moved on from the cold war. In an era preoccupied by globalisation, historians, when trying to discern how today’s problems took visible shape, have looked back to moments and markers that differ from Hobsbawm’s.

One significant trend is the vogue for “transnational” history, transcending the conventional western focus on the evolution of nation states: what the Harvard scholar Charles Maier calls the principle of “territoriality”. One of these new frameworks for understanding contemporary history is the cultural “clash of civilisations”, attractive to many American conservatives preoccupied with Islamic fundamentalism and the rise of China. Another framework is the emergence of supranational structures such as the European Union, intended to break out of the cycle of ruinous nationalist wars between France and Germany and to escape the perpetual “bloodlands” of eastern Europe. If European integration is indeed the trajectory of our own time, it implies a very different way of telling modern history from the conventional narratives about territorial nation states.

This approach is, of course, unlikely to have much appeal in our dis-United Kingdom. A political class trapped between the erosion of a once-solid state based on shared Britishness and a Continental behemoth depicted as the embodiment of alien “European” values does not seem in any mood to venture beyond territoriality. However, for those who are inclined to escape the bunker of Britishness, asking “What’s the story?” has utility in this larger sense. It invites us to interrogate the grand narratives we tell ourselves as a country about where we have come from and where we might be going.

Big history, thinking in time, applied history, alternative narratives: these are just a few ways that those who study the past are engaging with the present. That pioneer of “contemporary history”, Thucydides, writing 24 centuries ago, presented his account of the Peloponnesian war as a warning for future decision-makers – for those who, as he put it, “want to understand clearly the events which happened in the past and (human nature being what it is) will at some time or other and in much the same ways be repeated in the future”.

He described how an ill-conceived foreign adventure – the disastrous attack on Syracuse – triggered the climactic phase of a long power struggle that not only destroyed Athenian democracy but also sapped the power of the Greek city states, laying the peninsula open to foreign domination. In our own day, after a year of national mourning for the men who marched away in 1914, we might raise our eyes to take in the bigger historical picture and the haunting parallels with the lost grandeur of Greece: an international conflict that exploded out of the blue in 37 days, which was sustained for four blood-soaked years by the intransigence of national leaders and from whose suicidal destruction Europe never recovered. We may not share Thucydides’s idea of a universal “human nature”, but his proclamation that history matters still has resonance today.

David Reynolds is Professor of International History at Cambridge. His latest book is “The Long Shadow: the Great War and the 20th Century” (Simon & Schuster)

This article first appeared in the 23 January 2015 issue of the New Statesman, Christianity in the Middle East