Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued global warming has ‘stopped’.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which expresses a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others, and some are cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool, because of the eruption of a large volcano in the Philippines called Mount Pinatubo.

‘Climate’ is defined by averaging out all this variability over a longer-term period. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These differ slightly due to varying assumptions and methodology, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These, helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article, show 8-year trend lines – what the temperature trend is for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
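The cherry-picking point is easy to demonstrate numerically. Below is a minimal sketch in Python using synthetic data – an assumed steady warming trend of 0.02°C per year plus random year-to-year noise, not the real GISTEMP record: fit a straight line to every 8-year window, and some windows slope downwards even though warming is built in by construction.

```python
# Minimal sketch with synthetic data (not the real GISTEMP record):
# an assumed 0.02 °C/year warming trend plus random interannual noise.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1983, 2008)                    # a 25-year span, as in the graph
anomaly = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Fit a straight line to every 8-year window, like the blue trend lines.
for start in range(years.size - 7):
    y, a = years[start:start + 8], anomaly[start:start + 8]
    slope = np.polyfit(y, a, 1)[0]               # slope in °C per year
    print(f"{y[0]}-{y[-1]}: {slope:+.3f} °C/yr")

# The fit over the whole period recovers the trend we built in.
print(f"Full-period trend: {np.polyfit(years, anomaly, 1)[0]:+.3f} °C/yr")
```

Re-run it with different random seeds and the lesson holds: short windows that happen to start in a warm year frequently show ‘cooling’, while the 25-year fit recovers the underlying warming trend almost every time.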

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse, and his fellow contrarians, are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both a reality and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one repeated in the Whitehouse article – that because we don’t know everything, we know nothing, and should therefore do nothing. By that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift. As a close follower of the scientific debate on this subject, I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of man-made global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and misrepresentation in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.

Mark Lynas is an environmental activist and a climate change specialist. His books on the subject include High Tide: News from a Warming World and Six Degrees: Our Future on a Hotter Planet.

Head in the cloud

As we download ever more of our lives on to electronic devices, are we destroying our own internal memory?

I do not remember my husband’s telephone number, or my best friend’s address. I have forgotten my cousin’s birthday, my seven times table, the date my grandfather died. When I write, I keep at least a dozen internet tabs open to look up names and facts I should easily be able to recall. There are so many things I no longer know, simple things that matter to me in practical and personal ways, yet I usually get by just fine. Apart from the few occasions when my phone has run out of battery at a crucial moment, or the day I accidentally plunged it into hot tea, or the evening my handbag was stolen, it hasn’t seemed to matter that I have downloaded most of my working memory on to electronic devices. It feels a small inconvenience, given that I can access information equivalent to tens of billions of books on a gadget that fits into my back pocket.

For thousands of years, human beings have relied on stone tablets, scrolls, books or Post-it notes to remember things that their minds cannot retain, but there is something profoundly different about the way we remember and forget in the internet age. It is not only our memory of facts that is changing. Our episodic memory, the mind’s ability to relive past experiences – the surprising sting of an old humiliation revisited, the thrill and discomfort of a first kiss, those seemingly endless childhood summers – is affected, too. The average Briton now spends almost nine hours a day staring at their phone, computer or television, and when more of our lives are lived on screen, more of our memories will be formed there. We are recording more about ourselves and our experiences than ever before, and though in the past this required deliberate effort, such as sitting down to write a diary, or filing away a letter, or posing for a portrait, today this process can be effortless, even unintentional. Never before have people had access to such comprehensive and accurate personal histories – and so little power to rewrite them.

My internet history faithfully documents my desktop meanderings, even when I resurface from hours of browsing with little memory of where I have been or what I have read. My Gmail account now contains over 35,000 emails received since 2005. It has preserved the banal – long-expired special offers, obsolete arrangements for post-work drinks – alongside the life-changing. Loves and break-ups are chronicled here; jobs, births and weddings are announced; deaths are grieved. My Facebook profile page has developed into a crowdsourced, if assiduously edited, photo album of my social life over the past decade. My phone is a museum of quick-fire text exchanges. With a few clicks, I can retrieve, in mind-numbing detail, information about my previous movements, thoughts and feelings. So could someone else. Even my most private digital memories are not mine alone. They have become data to be restructured, repackaged, aggregated, copied, deleted, monetised or sold by internet firms. Our digital memories extend far beyond our reach.

In the late 1990s the philosophers Andy Clark and David Chalmers coined the term “the extended mind” to describe how, when we use pen and paper, calculators or laptops to help us think or remember, these external objects are incorporated into our cognitive processes. “The technology we use becomes part of our minds, extending our minds and indeed our selves into the world,” Chalmers said in a 2011 TED talk. Our iPhones have not been physically implanted into our brains, he explained, but it’s as if they have been. There’s a big difference between offloading memory on to a notepad and doing it on to a smartphone. One is a passive receptacle, the other is active. A notebook won’t reorganise the information you give it or ping you an alert; its layout and functions won’t change overnight; its contents aren’t part-owned by the stationery firm that made it. The more we extend our minds online, the harder it is becoming to keep control of our digital pasts, or to tell where our memories begin or end. And, while society’s collective memory is expanding at an astonishing rate, our internal, individual ones are shrinking.

***

Our brains are lazy; we are reluctant to remember things when we can in effect delegate the task to someone or something else. You can observe this by listening to couples, who often consult one another’s memories: “What was the name of that nice Chinese restaurant we went to the other day?” Subconsciously, partners distribute responsibility for remembering information according to each other’s strengths. I ask my husband for directions, he consults me on people’s names.

In one study conducted in 1991, psychologists assigned a series of memory exercises to pairs of students, some of whom had been dating for at least three months and some of whom did not know one another. The dating couples remembered more than the non-dating pairs. They also remembered more unique information; when a fact fell into their partner’s area of expertise, they were more likely to forget it.

In a similar way, when we know that a computer can remember something for us, we are less likely to remember it ourselves. For a study published in the journal Science in 2011, people were asked to type some trivia facts into a computer. Those who believed the facts would be saved at the end of the experiment remembered less than those who thought they would be deleted – even when they were explicitly asked to memorise them. In an era when technology is doing ever more remembering, it is unsurprising that we are more inclined to forget.

It is sometimes suggested that in time the worry that the internet is making us forgetful will sound as silly as early fears that books would do the same. But the internet is not an incremental step in the progression of written culture; it is revolutionising the way we consume information. When you pull an encyclopaedia down from a library shelf, it is obvious that you are retrieving a fact you have forgotten, or never knew. Google is so fast and easy to use that we can forget we have consulted it at all: we are at risk of confusing the internet’s memory with our own. A Harvard University project in 2013 found that when people were allowed to use Google to check their answers to trivia questions they rated their own intelligence and memories more highly – even if they were given artificially low test results. More often than not, students believed that Google was confirming a fact they already knew rather than providing them with new information.

This changed when Adrian Ward, who designed the study as part of his PhD research and is now an assistant professor at the University of Texas at Austin, mimicked a slow internet connection so that students were forced to wait 25 seconds to read the answer to a Google query. The delay, he noted, stripped them of the “feeling of knowing” because they became more aware that they were consulting an external source. In the internet age, Ward writes, people “may offload more and more information while losing sight of the distinction between information stored in their minds and information stored online”.

By blurring the distinction between our personal and our digital memories, modern technology could encourage intellectual complacency, making people less curious about new information because they feel they already know it, and less attentive to detail because their computers are remembering it for them. What if the same could be said of our own lives: are we less attentive to our experiences because we know that computers will record them for us?

An experiment by the American psychologist Linda Henkel suggests this could be the case; she has found that when people take photographs at museums they are more likely to forget details of what they have seen. To some extent, we’re all tourists exploring the world from behind a camera, too distracted by our digital memories to inhabit our analogue lives fully. Relying on computers to remember telephone numbers or trivia does not seem to deprive our internal memories of too much – provided you can remember where you’ve stored it, factual information is fairly straightforward to retrieve. Yet a digital memory is a poor substitute for the richness of a personal experience revisited, and our autobiographical memories cannot be “retrieved” by opening the relevant online file.

Our relationship with the past is capricious. Sometimes an old photograph can appear completely unfamiliar, while at other times the faintest hint – the smell of an ex-lover’s perfume on a crowded Tube carriage – can induce overwhelming nostalgia. Remembering is in part a feeling, of recognition, of having been there, of reinhabiting a former self. This feeling is misleading; we often imagine memories offer an authentic insight into our past. They do not.

Memory is closely linked to self-identity, but it is a poor personal record. Remembering is a creative act. It is closely linked to imagining. When people suffer from dementia they are often robbed not only of the past but also of the future; without memory it is hard to construct an idea of future events. We often mistakenly convert our imaginings into memories – scientists call the process “imagination inflation”. This puts biological memories at odds with digital ones. While memories stored online can be retrieved intact, our internal memories are constantly changing and evolving. Each time we relive a memory, we reconfigure it to suit our present needs and world-view. In his book Pieces of Light, an exploration of the new science of memory, the neuroscientist Charles Fernyhough compares the construction of memory to storytelling. To impose meaning on to our chaotic, complex lives we need to know which sections to abridge and which details can be ignored. “We are all natural born storytellers. We are constantly editing and remaking our memory stories as our knowledge and emotions change. They may be fictions, but they are our fictions,” Fernyhough writes.

We do not write these stories alone. The human mind is suggestible. In 2013, scientists at MIT made international headlines when they said they had successfully implanted a false memory into a mouse using a form of light stimulation, but human beings implant false memories into each other all the time, using more low-tech methods. Friends and family members are forever distorting one another’s memories. I remember distinctly being teased for my Dutch accent at school and indignantly telling my mother when I arrived home that, “It’s pronounced one, two, three. Not one, two, tree.” My brother is sure it was him. The anecdote is tightly woven into the story of our pasts, but one of us must be wrong. When we record our personal memories online we open up new possibilities for their verification but we also create different opportunities for their distortion. In subtle ways, internet firms are manipulating our digital memories all the time – and we are often dangerously unaware of it.

***

Facebook occasionally gives me a reminder of Mahmoud Tlissy, the caretaker at my former office in Libya who died quietly of pancreatic cancer in 2011 while the civil war was raging. Every so often he sends me a picture of a multicoloured heart via a free app that outlived him. Mahmoud was a kind man with a sardonic sense of humour, a deep smoker’s laugh and a fondness for recounting his wild days as a student in Prague. I am always pleased to be reminded of him, but I feel uncomfortable because I doubt he would have chosen such a naff way to communicate with me after death. Our digital lives will survive us, sending out e-hearts and populating databases long after we have dropped off the census. When we deposit our life memories online, they start to develop lives of their own.

Those who want to limit the extent to which their lives are recorded digitally are swimming against the tide. Internet firms have a commercial interest in encouraging us not only to offload more personal information online, but also to use digital technology to reflect on our lives. Take Facebook, which was developed as a means of communicating but is becoming a tool for remembering and memorialising, too. The first Facebook users, who were university students in 2004, are mostly in their thirties now. Their graduations, first jobs, first loves, marriages and first children are likely to be recorded on the site; friends who have died young are likely to be mourned on it. The website understands that nostalgia is a powerful marketing tool, and so it has released gimmicky tools, such as automated videos, to help people “look back”.

These new online forms of remembrance are becoming popular. On Instagram and Twitter it is common for users to post sentimental old snaps under the hashtag #tbt, which stands for “Throwback Thursday”. Every day, seven million people check Timehop, an app that says it “helps you see the best moments of your past” by showing you old tweets, photos and online messages. Such tools are presented as a way of enriching our ability to relive the past but they are limiting. We can use them to tell stories about our lives, but the pace and structure of the narrative is defined for us. Remembering is an imaginative act, but internet firms are selling nostalgia by algorithm – and we’re buying it.

At their most ambitious, tech companies are offering the possibility of objective and complete insight into our pasts. In the future, “digital memories” could “[enhance] personal reflection in much the same way as the internet has aided scientific investigations”, the computer scientists Gordon Bell and Jim Gemmell wrote in the magazine Scientific American in 2006. The assumption is that our complex, emotional autobiographical memories can be captured as data to be ordered, quantified and analysed – and that computer programs could make better sense of them than our own, flawed brains. The pair have been collaborating on a Microsoft Research “life-logging” project since 2001, in which Bell logs everything he has said, written, seen and heard into a specially designed database.

Bell understood that the greatest challenge would be finding a way to make digital archives usable. Without a program to help us extract information, digital memories are virtually useless: imagine trying to retrieve a telephone number from a month’s worth of continuous video footage. In our increasingly life-logged futures, we will all depend on powerful computer programs to index, analyse, repackage and retrieve our digital memories for us. The act of remembering will become automated. We will no longer make our “own fictions”.

This might sound like a distant sci-fi fantasy, but we are already much of the way there. Billions of people share their news and views by email or on social media daily, and unwittingly leave digital trails as they browse the web. The use of tracking devices to measure and record sleep, diet, exercise patterns, health and even mood is increasing. In the future, these comprehensive databases could prove very useful. When you go to the doctor, you might be able to provide details of your precise diet, exercise and sleep patterns. When a relationship breaks down you could be left with many gigabytes of digital memory to explore and make sense of. Did you really always put him down? Should you have broken up four years ago? In a few years’ time there could be an app for that.

Our reliance on digital memories is self-perpetuating: the more we depend on computer memories to provide us with detailed personal data, the more inadequate our own minds seem. Yet the fallibility of the human memory isn’t a design flaw; it is one of its best features. Recently, I typed the name of an ex-boyfriend into my Gmail search bar. This wasn’t like opening a box of old letters. For a start, I could access both sides of our email correspondence. Second, I could browse dozens of G-chats, instant messaging conversations so mundane and spontaneous that reading them can feel more like eavesdropping on a former self, or a stranger. The messages surprised me. I had remembered the relationship as short-lived and volatile but lacking any depth of feeling. So why had I sent those long, late-night emails? And what could explain his shorter, no less dramatic replies: “Will u ever speak to me again? You will ignore this I suspect but I love you.” Did he love me? Was I really so hurt? I barely recognise myself as the author of my messages; the feelings seem to belong to someone else.

My digital archives will offer a very different narrative from the half-truths and lies I tell myself, but I am more at home with my fictions. The “me” at the centre of my own memories is constantly evolving, but my digital identity is frozen in time. I feel a different person now; my computer suggests otherwise. Practically, this can pose problems (many of us are in possession of teenage social media posts we hope will never be made public) and psychologically it matters, too. To a greater or lesser extent, we all want to shed our former selves – but digital memories keep us firmly connected to our past. Forgetting and misremembering is a source of freedom: the freedom to reinvent oneself, to move on, to rewrite our stories. It means that old wounds need not hurt for ever, that love can be allowed to fade, that people can change.

With every passing year, we are shackling ourselves more tightly to our digital legacies, and relying more heavily on computer programs to narrate our personal histories for us. It is becoming ever harder to escape the past, or remake the future. Years from now, our digitally enhanced memories could allow us to have near-perfect recall, but who would want to live with their head in the cloud?

Sophie McBain is an NS contributing writer. This article was a runner-up in the 2015 Bodley Head FT Essay Prize


This article first appeared in the 18 February 2016 issue of the New Statesman, A storm is coming