Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued global warming has stopped.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which exhibits a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others, and some cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool, because of the eruption of a large volcano in the Philippines called Mount Pinatubo.

‘Climate’ is defined by averaging out all this variability over a longer-term period. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.
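For readers who want to see the distinction concretely, here is a minimal sketch in Python (using made-up numbers, not any real temperature record) of how a series with a small built-in warming trend can still contain plenty of individual years cooler than the one before, even while its decadal averages climb steadily:

```python
import numpy as np

# Illustrative (synthetic) anomaly series: a modest underlying warming
# trend of 0.02 C/year plus random year-to-year noise, standing in for
# interannual variability (El Nino events, volcanic eruptions, etc.).
rng = np.random.default_rng(0)
years = np.arange(1978, 2008)
anomalies = 0.02 * (years - years[0]) + rng.normal(0.0, 0.12, len(years))

# Year-on-year differences bounce around zero...
diffs = np.diff(anomalies)
print("Years apparently cooler than the last:", int((diffs < 0).sum()), "of", len(diffs))

# ...but the means of successive decades climb steadily.
for start in range(0, len(years), 10):
    decade = years[start:start + 10]
    mean = anomalies[start:start + 10].mean()
    print(f"{decade[0]}-{decade[-1]} mean anomaly: {mean:+.2f} C")
```

Run it and close to half the individual years come out cooler than the one before – even though warming is baked into the series by construction. The decadal means, by contrast, rise every time.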

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These fluctuate slightly due to varying assumptions and methodology, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These, helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article, show 8-year trend lines – what the temperature trend is for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
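The cherry-picking effect is just as easy to reproduce. The following sketch (again synthetic numbers, not the GISTEMP record) fits a least-squares line to every 8-year window of a series constructed with steady warming plus noise; with realistic noise levels, some windows come out with a negative slope even though the full-period trend is firmly upward:

```python
import numpy as np

# Same kind of synthetic series: steady warming plus yearly noise.
rng = np.random.default_rng(42)
years = np.arange(1980, 2008)
anomalies = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, len(years))

# Fit a least-squares line to every 8-year window, in the spirit of the
# RealClimate graph; a negative slope is an apparent "stop" in warming.
for start in range(len(years) - 7):
    w_years = years[start:start + 8]
    w_anoms = anomalies[start:start + 8]
    slope = np.polyfit(w_years, w_anoms, 1)[0]
    print(f"{w_years[0]}-{w_years[-1]}: {slope:+.4f} C/yr")

# The full-period fit recovers the underlying warming rate.
print(f"Full period: {np.polyfit(years, anomalies, 1)[0]:+.4f} C/yr")
```

Pick a window that begins on an unusually warm year and you can declare that warming has ‘stopped’; fit over the whole record and the underlying trend reappears. That is the whole of the trick.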

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse and his fellow contrarians are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both real and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one which is repeated in the Whitehouse article – that because we don’t know everything, therefore we know nothing, and therefore we should do nothing. Using that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift, and as a close follower of the scientific debate on this subject I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of man-made global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and distorted misrepresentation in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.

Mark Lynas is an environmental activist and a climate change specialist. His books on the subject include High Tide: News from a warming world and Six Degrees: Our future on a hotter planet.

How nostalgic literature became an agent in American racism

From To Kill a Mockingbird to Gone With The Wind, literary mythmaking has long veiled the ugly truth of the American South.

Alongside this summer’s debate over the meanings of the Confederate flag, sparked by the Charleston shootings, another story about the history of American racism flared up. Harper Lee’s Go Set a Watchman, an early draft of To Kill a Mockingbird, revealed further discomfiting truths, this time about the background to one of the nation’s most beloved fictional standard-bearers for racial equality.

In Watchman, Atticus Finch is discovered, twenty years after the action of Mockingbird, fighting against desegregation during the civil rights era. The shock readers have felt is akin to finding an early draft of Nineteen Eighty-Four in which Orwell defends surveillance in the name of national security – but stumbling upon a segregationist Atticus Finch should not surprise readers who know their history. In the end, he proves just as problematic a symbol as the Confederate flag, and for exactly the same reasons: both flew in the face of civil rights to defend white prerogative in the South. And both were gradually romanticised through the process of revision, wrapping America in comforting white lies.

Like our euphemistic habit of calling Jim Crow laws “segregation” rather than apartheid, all of this partakes in a time-honoured communal practice of deracination through whitewashing, through popular fictions of innocence such as Mockingbird and popular histories such as folk tales about the innocence of the Confederate flag. America keeps willing its own innocence back into being, even as racist ghosts continue to haunt our nation’s dreams of the pastoral South, a society whose fantasy of gentility was purchased at the cost of brutal chattel slavery and its malignant repercussions.

An essential aspect of the history of American racism is also the history of its deliberate mystification. Many Americans continue to accept an idealised view of the antebellum and Jim Crow South, which emerged as part of a national romance known as the Lost Cause, a legend that turned terrorism into a lullaby. In the aftermath of the South’s crushing defeat in the civil war, southerners began trying to reclaim what they had lost – namely, an old social order of unquestioned racial and economic hierarchies. They told stories idealising the nobility of their cause against the so-called War of Northern Aggression, in which northerners had invaded the peaceful South out of a combination of greed, arrogance, ignorance and spite. Gentle southerners heroically rallied to protect their way of life, with loyal slaves cheering them all the way. This Edenic dream of a lost agrarian paradise, in which virtuous aristocrats and hard-working farmers coexisted peacefully with devoted slaves, pre-dated the war: merging with Jeffersonian ideals of the yeoman farmer, it formed the earliest propagandistic defences of slavery. One of the most powerful broadsides against this spurious fantasy was launched in 1852 by Harriet Beecher Stowe, a north-eastern preacher’s daughter who had lived in border states and seen what really happened to fugitive slaves. The South responded to Uncle Tom’s Cabin with a furious denunciation of Stowe’s knowledge and motives, insisting on the benevolence of their “peculiar institution” – an especially sordid euphemism that recast plantation slavery as an endearing regional quirk.

Once institutional slavery was irrevocably destroyed, nostalgia prevailed, and southerners began producing novels, poems and songs romanticising the lost, halcyon days of the antebellum era. Reaching their apotheosis in Margaret Mitchell’s Gone with the Wind in 1936, and its film version in 1939, they have become known as the “moonlight and magnolia” school of plantation fiction, exemplified by the novels of Thomas Nelson Page, whose books, such as In Ole Virginia (1887) and Red Rock (1898), helped establish the formula: devoted former slaves recount (in dialect) their memories of an idyllic plantation culture wantonly destroyed by militant northern abolitionists and power-crazed federalists. A small band of honourable soldiers fought bravely on the battlefield and lost; the vindictive North installed incompetent or corrupt black people to subjugate the innocent whites; southern scalawags and northern carpetbaggers descended to exploit battle-ravaged towns. Pushed to the limits of forbearance, the Confederate army rose again to defend honour and decency – in the noble form of the Ku Klux Klan.

This thoroughly specious story is familiar to anyone who knows Gone with the Wind, but Mitchell learned it from Page and other novelists, including Mary Johnston and Thomas Dixon, Jr. Dixon wrote a series of books celebrating the rise of the Ku Klux Klan, of which the most notorious remains The Clansman (1905), for the simple reason that ten years later a director named D W Griffith decided to adapt it into a film he called The Birth of a Nation. Released exactly a century ago, it faithfully follows the same narrative blueprint. This was neither an accident nor a coincidence: Griffith was himself the son of a Confederate colonel (memorably known as “Roaring Jake”) and had listened to legends of the Lost Cause at his father’s knee. Reading The Clansman, Griffith said, brought back “all that my father had told me . . . [the film] had all the deep incisive emotionalism of the highest patriotic sentiment . . . I felt driven to tell the story – the truth about the South, touched by its eternal romance which I had learned to know so well.” That “truth” was a story in which the cavalry that rode to the rescue of an imperilled maiden was the KKK, saving her from a fate worse than death at the hands of a white actor in blackface. Griffith had considered hiring black actors for the film, he told an interviewer in 1916, but after “careful weighing of every detail concerned”, none of which he imparted, “the decision was to have no black blood among the principals”, a phrase that says it all by using the so-called one-drop rule to justify racist hiring practices in a film glorifying racism.

The Birth of a Nation was a national phenomenon, becoming the first film ever screened at the White House, in March 1915, for President Woodrow Wilson. Born in Virginia, raised in Georgia and South Carolina, Wilson was the first southern president since before the civil war. He attended Johns Hopkins University with Thomas Dixon, who became a political supporter of his; Wilson also appointed Thomas Nelson Page his US ambassador to Italy. Many members of his administration were equally avowed segregationists; their influence reverberated for generations. Wilson’s history books were quoted verbatim by Griffith in some of The Birth of a Nation’s titles, including: “‘The white men were roused by a mere instinct of self-preservation . . . until at last there had sprung into existence a great Ku Klux Klan, a veritable empire of the South, to protect the Southern country.’ – Woodrow Wilson”.

Wilson does not, however, actually seem to have said the quotation most frequently attributed to him, that The Birth of a Nation was “like writing history with lightning, and my only regret is that it is all so terribly true”. The source for this appears to be Griffith, who told a 1916 movie magazine: “The motion picture can impress upon a people as much of the truth of history in an evening as many months of study will accomplish. As one eminent divine said of [moving] pictures, ‘they teach history by lightning!’” They can teach myths the same way, as Griffith’s film regrettably proved. Northern white audiences cheered it; black audiences wept at the malevolence it celebrated; it prompted riots and racist vigilante mobs in cities across America, and at least one racially motivated murder. Together, Griffith and Dixon worked to elevate a regional legend, designed to save face after a catastrophic loss, into a symbolic myth accepted by much of the nation.


***

The Birth of a Nation helped to invent the feature film. It also helped to reinvent the Ku Klux Klan, which enjoyed a huge resurgence following the film’s release, spreading up through the north-east and Midwest. The Klan had a strong presence on Long Island in the 1920s, for example, which is why F Scott Fitzgerald made Tom Buchanan a believer in “scientific racism” in The Great Gatsby, a story set there in 1922, the same year in which Thomas Nelson Page died while working on a novel about the Klan called The Red Riders.

Four years later, a young writer named Margaret Mitchell began her own tale of the Lost Cause, a story that follows the same pattern, to the extent of describing Scarlett O’Hara on its first page in coded terms as a woman with “magnolia-white skin”. Gone with the Wind became a global sensation when it was published in 1936. Mitchell’s ideas of southern history were deeply influenced by Dixon’s racist ideology, although she was considerably more knowledgeable about the workings of plantation slavery than is often recognised. In fact, the great majority of antebellum southern farmers (some 75 per cent) were what historians class as yeomen (growing primarily for domestic use, selling only a small portion of their crop), not aristocratic owners of vast plantations. Such farmers often did not even own a slave, but rented or borrowed one or two during harvest time. But although the Hollywood version dispensed entirely with Mitchell’s sense of history (and her considerably more interesting take on historical sexism), it retained her racism, despite the stated intentions of its producers. The film of Gone with the Wind contributed greatly to the gradual displacement and dislocation of the historically specific time and place that had given rise to the antebellum plantation myth. Ben Hecht’s famed floating preface to the 1939 big-screen version labours at making the story timeless, a legend dissipating in the mists of history:

There was a land of Cavaliers and Cotton Fields called the Old South . . . Here in this pretty world Gallantry took its last bow . . . Here was the last ever to be seen of Knights and their Ladies Fair, of Master and of Slave . . . Look for it only in books, for it is no more than a dream remembered. A Civilisation gone with the wind . . .

as history recedes into ellipses.

Yet even this feudalist imagery has a specific history of its own, forming another crucial part of the popular culture of the Lost Cause. It was Mark Twain, whose work remains one of our most powerful literary antidotes to the toxin of moonlight and magnolia, who most famously named its source: the novels of Sir Walter Scott. Scott “did measureless harm”, Twain insisted in his 1883 Life on the Mississippi, “more real and lasting harm, perhaps, than any other individual that ever wrote”. Scott’s bestselling romances filled southerners’ heads with enchanted “dreams and phantoms . . . with the sillinesses and emptinesses, sham grandeurs, sham gauds and sham chivalries of a brainless and worthless long-vanished society”. In fact, Twain concluded acidly, “Sir Walter had so large a hand in making Southern character, as it existed before the war, that he is in great measure responsible for the war.” This is only slightly exaggerated: Scott’s novels were as popular in early-19th-century America as were the films of The Birth of a Nation and Gone with the Wind a century later. It is certainly thanks to the popularity of novels such as Ivanhoe and Waverley that the cult of the “clan” was distorted into the Ku Klux Klan, with its bogus orders and knights. Scott’s novels may even have given rise to antebellum America’s use of the medieval word “minstrel” to describe itinerant blackface musicians; the OED offers no logic for the coinage, but its earliest citation is from 1833, when the mania for Scott’s work in the American South was at its highest. The feudal fantasy of “gallantry”, “cavalier knights” and “ladies fair” is a powerful one, not least, doubtless, because it gives America a sense of a much older past than it has, merging our history with a faux-feudalism in which slaves are rewritten as serfs, bound by devotion to the land and the family they serve.

***

But even as Gone with the Wind was in its ascendancy, another national narrative was slowly gaining traction. Black writers including W E B Du Bois, Zora Neale Hurston, Langston Hughes and Richard Wright were vigorously challenging the dominant racist narrative, while white southern writers such as Ellen Glasgow and, especially, William Faulkner also began debunking and complicating this facile tradition. Faulkner took from the Lost Cause legends that he, too, had heard as a child in Mississippi some much darker truths about memory, history, distortion, perspective. Faulkner’s Absalom, Absalom! is probably his greatest meditation on the processes of myth-making and how they intersect with national history – a Homeric epic of America. Absalom came out in 1936, the same year as Gone with the Wind, and in many ways can be read as a corrective to it, shaking off the dreams of moonlight and magnolia to show the Gothic nightmare underneath.

Just twenty years later, a young woman who had been born in the year Margaret Mitchell began to write Gone with the Wind, named Nelle Harper Lee, produced her own version of the story of America’s struggles with its racist past. To Kill a Mockingbird takes place between 1933 and 1935 (just before Gone with the Wind was published), during which time Atticus Finch tells his young daughter, Scout, that the Ku Klux Klan was “a political organisation” that existed “way back about 1920” but “couldn’t find anybody to scare”, relegating the Klan to ancient history, instead of a mere dozen years before Lee’s story.

Nor is it any coincidence that Mrs Dubose, the racist old lady addicted to morphine, forces Jem Finch to read Walter Scott’s Ivanhoe to her as a punishment: there are more encoded traces of Lost Cause history in Mockingbird than many readers have recognised. Mockingbird resists this history on balance, but it also participates in collective myth-making, most significantly in its depiction of lynching as a furtive, isolated practice in the dead of night. It would be pretty to think so. In Lee’s version, a lynch mob is dispersed by the innocent prattle of the child Scout, who unwittingly reminds its members of their basic “decency”. In fact, by the 1920s, lynching in the South was often publicly marketed as a tourist attraction, in a practice historians now refer to as “spectacle lynching”, which took place in the cold light of day, with plenty of advance warning so people could travel from outlying areas for the fun. There were billboards and advertisements letting tourists know when and where the lynching would take place; families brought children and had picnics; postcards were sold and sent (horrific images of which are easy to find online). Burning at the stake was common; victims were often tortured or castrated, or had limbs amputated, or were otherwise mutilated first; pregnant women were burned to death in front of popcorn-crunching crowds. This was so well established that in 1922 the National Association for the Advancement of Colored People took out a full-page advertisement in the New York Times with the headline “The shame of America”, demanding, in underscored outrage: “Do you know that the United States is the Only Land on Earth where human beings are BURNED AT THE STAKE?” It went on to outline a few salient facts: between 1889 and 1922, 3,436 people had been lynched in America. Of these, “only 571, or less than 17 per cent, were even accused of rape”. Eighty-three of the victims were women, which further undermined the myth that rape led to lynching.

Lee’s picture of lynching in To Kill a Mockingbird is not merely sentimental, but an active falsification: it participates in a wider American story seeking to minimise, even exonerate, the reactions of white southerners to African Americans’ claims to equal rights under the law. It functions as a covert apology, suggesting that lynching was anomalous, perpetrated by well-meaning men who could be reminded of common humanity – when, in fact, members of lynch mobs in the 1920s posed for photographs in front of their victims. But Atticus Finch stands up to racism, and Scout persuades Cunningham and the rest of the lynch mob to go home. This is why it matters that an earlier version of Atticus advocated segregation: it shows that the process of revising Mockingbird was a process of idealisation, promising readers that systemic racism could be solved by the compassionate actions of noble individuals.

This is the consolatory promise of individualism, that the nation can be redeemed collectively by isolated instances of benign action. The romance of individualism has always been how America manages injustice and unrest, how it pastes over irreconcilable differences and squares imaginary circles: the heroic figure who temporarily, occasionally, overcomes all structural and social impediments is taken as communal evidence that these obstructions don’t exist, or aren’t very obstructive. It is, of course, a deeply Christian idea: the redemptive individual who expiates collective sin, the original lost cause, sacrificed for the good of all.

The doctrine of individualism is further tangled up in yet another exculpatory logic: that of states’ rights. The remnants of this rationale also surface at the end of Go Set a Watchman: Atticus Finch argues that segregation is really a matter of states’ rights, and Jean Louise, his now adult daughter, though nauseated at his racism, accepts the legitimacy of that argument. What right has the federal government to insist that the people within its borders adhere to its laws, or even to the principles (truth, justice, equality, democracy) they purport to uphold? Some might conclude from reading this passage that, like paradise, causes are invented to be lost. The idea of the Lost Cause redeems a squalid past; it is an act of purely revisionist history, disavowing the notion that slavery or racism had anything to do with the civil war and its vicious, lingering aftermath. Glorifying that history as “gallantry” is not merely dishonest: it is ruinous.

Dismissing all of this as ancient history, or mere fiction, is not the solution, it is part of the problem. As Faulkner famously observed and this brief account shows, the past isn’t dead; it isn’t even past. The word “legend” comes from the Latin legere, to read. Many insist that the stories we consume endlessly are harmless entertainment, but that, too, is part of the mystification, pretending that these legends did not arise precisely to veil an ugly truth with moonlight, to smother with the scent of magnolia the stink of old, decomposing lies.


An American in London, Sarah Churchwell is an author and professor of American literature. Her latest book, “Careless People: Murder, Mayhem and the Invention of the Great Gatsby”, is published by Virago.

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars