Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued that global warming has stopped.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which exhibits a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others, and some are cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool, because of the eruption of a large volcano in the Philippines called Mount Pinatubo.

‘Climate’ is defined by averaging out all this variability over the longer term. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.
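To make the distinction concrete, here is a minimal sketch in Python – using made-up numbers, not the NASA data – of a temperature series with an assumed steady warming trend of roughly 0.2°C per decade buried under large year-to-year noise. Plenty of individual years come out cooler than the one before, yet the long-term trend is unmistakably upward.

```python
# Sketch only: synthetic "temperatures" with an assumed warming trend plus
# interannual noise, to show why single years can cool while the long-term
# average rises. Not the GISTEMP analysis itself.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2008)                    # 28 synthetic "years"
trend = 0.02 * (years - years[0])                # assumed ~0.2 C per decade warming
noise = rng.normal(0.0, 0.1, years.size)         # interannual variability (El Nino, volcanoes, etc.)
temps = trend + noise

year_on_year = np.diff(temps)                    # change from each year to the next
slope_per_decade = np.polyfit(years, temps, 1)[0] * 10

print(f"{(year_on_year < 0).sum()} of {year_on_year.size} years were cooler than the year before")
print(f"long-term trend: {slope_per_decade:+.2f} C per decade")
```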

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These fluctuate slightly due to varying assumptions and methodology, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These, helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article, show 8-year trend lines – what the temperature trend is for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
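The RealClimate exercise is easy to reproduce in outline. The sketch below – again run on a synthetic series with an assumed underlying warming trend, not the real GISTEMP record – fits an 8-year trend line starting from every possible year. Depending on which start year you pick, some of the fitted slopes come out negative even though the underlying warming never changes, which is exactly the cherry-picking trap.

```python
# Sketch only: fit an 8-year trend line from every possible start year of a
# synthetic warming series and see how the sign depends on the start year.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2008)
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

window = 8                                       # 8-year trend lines, as in the RealClimate graph
for start in range(years.size - window + 1):
    y = years[start:start + window]
    t = temps[start:start + window]
    slope = np.polyfit(y, t, 1)[0] * 10          # fitted slope, in degrees C per decade
    direction = "down" if slope < 0 else "up"
    print(f"{y[0]}-{y[-1]}: {slope:+.2f} C/decade ({direction})")
```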

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse, and his fellow contrarians, are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both a reality, and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one which is repeated in the Whitehouse article – that because we don’t know everything, therefore we know nothing, and therefore we should do nothing. Using that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift, and as a close follower of the scientific debate on this subject I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of manmade global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and distorted misrepresentations in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.

Mark Lynas is an environmental activist and a climate change specialist. His books on the subject include High Tide: News from a warming world and Six Degrees: Our future on a hotter planet.

The English question

The political community that is England is neither stable nor settled. But something is stirring among Chesterton’s secret people.

From the late 18th century to the early 20th, Britain’s political class wrestled with an Irish Question: how could the British state govern “John Bull’s Other Island” in a way that kept the native Irish quiescent, without jeopardising its own security? When Ireland was partitioned in 1921 the question disappeared from the British political agenda – only to reappear in another guise during the Troubles in Northern Ireland half a century later. It was not laid to rest until the Belfast Agreement of 1998. More recently, politicians and commentators on both sides of the border have had to come to terms with an increasingly intractable Scottish Question: how should the ancient and once independent Scottish nation relate to the other nations of the United Kingdom and to the Westminster parliament? As the convoluted debate provoked by the coming EU referendum shows, a more nebulous English Question now looms in the wings.

Like the Irish and Scottish Questions, it is the child of a complex history. England became a united kingdom in Anglo-Saxon times. It faced external enemies, notably invading Danes, but its kings ruled their own territory with an iron hand. The Norman Conquest substituted francophone rulers and a francophone nobility for these Anglo-Saxon kings; the new elite spoke French, sent their sons to France to be educated and polished and, in many cases, owned territory in France. Simon de Montfort, once credited with founding the English parliament, was a French nobleman as well as an English one. But the kingdom remained united. The Celtic people who had once inhabited what is now England were driven out by the Anglo-Saxons; Lloegr, the Welsh word for England, means “the lost land”. It stayed lost after the Conquest; and indeed, the Norman rulers of England pushed further into Wales than their Anglo-Saxon predecessors had done.

United did not mean peaceful or stable. Henry II, William the Conqueror’s great-grandson, ruled a vast Continental empire stretching from the English Channel to the Pyrenees, as well as England. Inept kings, uppity barons, an aggressive church, restive peasants, a century-long war with France and bitter dynastic rivalries undermined his achievement. But there was no English equivalent to the powerful, de facto independent duchies of Burgundy or Aquitaine in what is now France, or to the medley of principalities, city states and bishoprics that divided Germans and Italians from each other until well into the 19th century. That was still true after the Welshman Henry Tudor defeated Richard III at the Battle of Bosworth in 1485 and seized the English crown as Henry VII. His son (who became Henry VIII) was not content with keeping England united. Having broken with the Catholic Church when the Pope refused to annul his first marriage, he made himself head of the Church in England and proclaimed that the realm of England was an “empire”, free from all external authority.

From the upheavals of Henry’s reign and the subtle compromises of his daughter Elizabeth’s emerged the Church of England – an institutional and theological third way between the Catholicism of Rome, on the one hand, and the Protestantism of John Calvin’s Geneva and Martin Luther’s Germany on the other. The Church of England has spoken to and for the English people ever since. Sometimes it has spoken feebly and complacently, as in the 18th century. At other times it has been outspoken and brave, as in the Second World War, when William Temple was the archbishop of Canterbury, and during the 1980s, when a Church of England commission excoriated the Thatcher era’s “crude exaltation” of “individual self-interest”. Despite (or perhaps because of) the subtle compromises embodied in it, the Anglican Church has been prone to schism. “High Church” Anglicans have stressed its Catholic inheritance; followers of the “low” Church have insisted on its Protestantism. Two charismatic High Anglican priests – John Henry Newman and Henry Edward Manning – converted to Catholicism and ended as cardinals.

Yet these schisms did not affect the laity or diminish the Church’s role in English life. From the end of the English civil wars in 1660 to the late 19th century, England was ruled by the Anglican landed class, the most relaxed and confident governing class in Europe. A bien-pensant, easygoing and undogmatic latitudinarianism shaped relations between church and state. Doctrinal precision was tiresome, even a little vulgar. Wherever possible, differences were fudged: the very Thirty-Nine Articles of the Anglican Church are a fudge. There were exceptions. Gladstone’s restless, sometimes tormented religiosity and baffling combination of high ideals with low cunning could hardly have been less easygoing. And as the 19th century wore on, Protestant dissenters, Catholics and even Jews and unbelievers were slowly incorporated into the political nation. Joseph Chamberlain, who did more to make the political weather than any other leader in the late 19th and early 20th centuries, and contrived to split both the Liberal and the Conservative parties, was a Unitarian, contemptuous of fudge.

However, the style and mood of English governance were still quintessentially Anglican. Fudge prevailed. Trollope’s political novels are a hymn to fudging. Disraeli, ethnically Jewish, though baptised into the Church of England, was a fudger to his fingertips. In his low-cunning moods, even Gladstone was not above fudging. After the Act of Union between England and Scotland in 1707 the monarchy itself rested on a mountain of fudge: the monarch was an Anglican in England, but a Presbyterian in Scotland. The English and Scottish parliaments were merged into a British parliament, but because England was far more populous and far richer than Scotland, it was the English parliament writ large, and embodied English constitutional doctrine. Equally, the Scots became junior partners in a new British empire, ultimately controlled by the Anglican elite. It won the race for empire against France, but the stiff-necked, pernickety legalism of successive London governments drove its colonies on the seaboard of what is now the United States into revolt and eventual independence.

The Anglican elite learned their lesson. Thereafter, imperial governance was English governance writ large. From an early stage the colonies of settlement, later known as the “white dominions”, were, in effect, self-governing. At first sight, India, “the brightest jewel in the British crown”, was an exception. It was acquired by force and maintained, in the last resort, by force. The Great Rebellion of 1857, once known as the Indian Mutiny, was brutally suppressed. In the Amritsar Massacre of 1919, Brigadier General Dyer ordered his troops to fire on an unarmed and peaceful crowd; they went on firing until their ammunition was exhausted. But the most astonishing feature of the British Raj is that a tiny sliver of British soldiers and administrators somehow managed to govern a subcontinent populated by roughly 250 million subjects. Force alone could not have done this. The Raj depended on indirect rule, on adroit accommodation to local pressures. It would not have survived without the collaboration of Indian elites, and the price of collaboration was a willingness to temper the wind of imperial power to the shorn lamb of Indian hopes and fears.

***


The Anglo-British story echoed the Indian story. The political, administrative and financial elites in Westminster, Whitehall and the City of London viewed the kingdom they presided over through an Indian lens. British subjects in the mother country were treated like Indian subjects in the Raj. Force lurked in the background, but most of the time it stayed in the background. The Peterloo Massacre of 1819, in which mounted cavalry charged into a crowd of as many as 80,000 people demonstrating for greater parliamentary representation at St Peter’s Field in Manchester, was a paler precursor of the Amritsar Massacre; the Rhondda township of Tonypandy, where hussars helped crush a “riot” by striking miners in 1910, lived on in the folk memory of the labour movement for decades. Yet these were exceptions, just as Amritsar was an exception.

Co-option, accommodation and collaboration between the governing elites and lesser elites beyond them were the real hallmarks of British governance. The French saying that there is more in common between two deputies, one of whom is a communist, than there is between two communists, one of whom is a deputy, also applied to Britain. In the cosy Westminster village, insurgent tribunes of the people, from the popular radical John Bright to the fulminating socialist Michael Foot, slowly morphed into grand and harmless old men. Outside the village, subjects were inescapably subjects, not citizens, just as their Indian counterparts were. Sovereignty, absolute and inalienable, belonged to the Crown-in-Parliament, not to the people. And the whole edifice was held together by layer upon layer of fudge.

Now the fudge is beginning to dissolve. The Raj disappeared long ago. The fate of steelworkers in South Wales depends on decisions by an Indian multinational whose headquarters are in Mumbai. The empire on which the sun never set is barely a memory. Unlike her great-great-grandmother Queen Victoria, the present Queen is not an empress; she has to make do with leading the Commonwealth. In law, the Crown-in-Parliament remains absolutely sovereign and the peoples of the United Kingdom are still subjects, not citizens. But legal principles and political realities diverge. The Anglo-British state whose capital is London and whose parliament stands on the fringes of the Thames is no longer the sole institution that shapes and reflects the political will of the people over whom it presides. There are now four capital cities, four legislatures, four governments and four political systems in the United Kingdom.

The devolved administrations in the non-English nations of the kingdom control swaths of public policy. The parties that lead them vary enormously in ideology and history. The Scottish National Party, which has governed Scotland for nearly nine years, stands for an independent Scotland. In Wales, Labour has been the strongest party since devolution, but it and Plaid Cymru (the “Party of Wales”) have already formed one coalition and may well form another after the elections to the Welsh Assembly next month. No great changes are likely. Almost certainly Wales will continue to be a social-democratic candle in a naughty world. Since the Belfast Agreement, Northern Ireland has been governed by a power-sharing executive, representing both the republican tradition, embodied in Sinn Fein, and the loyalist tradition, embodied in the Democratic Unionist Party. The sovereign Westminster parliament has the legal right to repeal the devolution statutes, but doing so would amount to a revolution in our uncodified constitution and would destroy the Union.

England is a stranger at the feast. It towers above the others in wealth, in population and in political clout. It has almost 84 per cent of the UK population. Scotland has just under 8.5 per cent, Wales just under 5 per cent and Northern Ireland less than 3 per cent. Yet there is no English parliament or government. In times past, English people have often treated the words “English” and “British” as synonyms, but devolution to Scottish, Welsh and Northern Irish legislatures and administrations has made a nonsense of this lazy conflation.

***

England and the English now face the primordial questions that face all self-conscious political communities: “Who are we?”, “Who do we want to be?” At bottom, these questions are philosophical, in a profound sense moral, not economic or institutional. They have to do with the intangibles of culture and sentiment, not the outward forms that clothe them. In stable and settled political communities they are rarely discussed. They don’t need to be. But the political community that is England is neither stable nor settled. Fuelled in part by resentment of the alleged unfairness of the devolution process and in part by the psychic wound left by the end of the Anglo-British empire, an inchoate, grouchy English nationalism is now a force to be reckoned with. St George’s flags flying on 23 April; the extraordinary rise of Ukip; David Cameron’s panic-stricken attempt to “renegotiate” Britain’s role in the European Union – all tell the same story: the “secret people of England”, as G K Chesterton called them, are secret no longer.

But that is not an answer to my questions. It only shows that they are urgent. At the moment, two answers hold the field. The first – the answer embodied in the Cameron government’s “Project Fear” over the UK’s membership of the EU – is essentially deracinated. For the globetrotting super-rich, the financial services sector, the Bank of England and the managers of the Union state, England consists of London and the more salubrious parts of the south-east. The answer to the English Question is that there is no such question. The notion that the English have to decide who they are and who they want to be is a backward-looking fantasy. Globalisation has overwhelmed the specificities of English culture and experience. The English buy and sell in the global marketplace and they face global threats. Membership of an EU made safe for market fundamentalism offers the best available route to security and prosperity in an ever more globalised world.

The second answer – the answer implicit in Eurosceptic rhetoric – is romantically archaic. At its heart is a vision of England as a sea-girt and providential nation, cut off from the European mainland by a thousand years of history and a unique constitutional arrangement. It harks back to Shakespeare’s hymn to England as a “jewel set in the silver sea”; to Henry Newbolt’s poem “Drake’s Drum”, evoking the memory of gallant English mariners driving the top-heavy galleons of the Spanish Armada up the Channel to their doom; and to Nelson dying gloriously at Trafalgar at the climax of his greatest victory. It fortified Margaret Thatcher during the nail-biting weeks of the Falklands War; it inspired Enoch Powell’s passionate depiction of post-imperial England as the reincarnation of the England of Edward the Confessor: an England whose unity was “effortless and unconstrained” and which accepted the “unlimited supremacy of Crown-in-Parliament so naturally as not to be aware of it”. As Powell saw more clearly than anyone else, this vision rules out EU membership.

No one with progressive instincts can possibly be satisfied with either of these answers. The great question is whether there is a better one. I think there is, but I can’t pretend that it is easy or comfortable. It is republican in spirit – which does not entail getting rid of the monarchy, as the many Continental monarchies show. It embodies a tradition stretching back to England’s brief but inspiring republican experiment during the civil wars of the 17th century, and before that to Renaissance Italy and Republican Rome. Central to it is the notion of “neo-Roman liberty”: of liberty as freedom from domination, from dependence on another’s will. John Milton was its most eloquent English exponent, in prose and verse, but it also inspired Tom Paine’s contempt for hereditary rule and the “foppery” that went with it. In the 20th century its most engaging champion was R H Tawney, the ethical socialist, economic historian and foe of the “religion of inequality”, its “great God Mumbo-Jumbo” and the “servile respect for wealth and social position” it inculcated.

The goal is clear: a republican England in a republican Britain and a republican Britain in a republican Europe. The obstacles are formidable. As the founders of the American republic discovered, republican liberty entails federal union, combining diversity at the base with unity at the centre; and for that there are few takers. But Gramsci was right. Pessimism of the intellect should go hand in hand with optimism of the will. There is all too much pessimism of the intellect on the British left. It is time for some optimism of the will.

David Marquand’s most recent book is “Mammon’s Kingdom: an Essay on Britain, Now” (Allen Lane)

This article first appeared in the 08 April 2016 issue of the New Statesman, The Tories at war