Has global warming really stopped?

Mark Lynas responds to a controversial article on newstatesman.com which argued global warming has stopped.

On 19 December the New Statesman website published an article which, judging by the 633 comments (and counting) received so far, must go down in history as possibly the most controversial ever. Not surprising really – it covered one of the most talked-about issues of our time: climate change. Penned by science writer David Whitehouse, it was guaranteed to get a big response: the article claimed that global warming has ‘stopped’.

As the New Statesman’s environmental correspondent, I have since been deluged with queries asking if this represents a change of heart by the magazine, which has to date published many editorials steadfastly supporting urgent action to reduce carbon emissions. Why bother doing that if global warming has ‘stopped’, and therefore might have little or nothing to do with greenhouse gas emissions, which are clearly rising?

I’ll deal with this editorial question later. First let’s ask whether Whitehouse is wholly or partially correct in his analysis. To quote:

"The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly."

I’ll be blunt. Whitehouse got it wrong – completely wrong. The article is based on a very elementary error: a confusion between year-on-year variability and the long-term average. Although CO2 levels in the atmosphere are increasing each year, no-one ever argued that temperatures would do likewise. Why? Because the planet’s atmosphere is a chaotic system, which expresses a great deal of interannual variability due to the interplay of many complex and interconnected variables. Some years are warmer than others; some are cooler. 1998, for example, was a very warm year because an El Niño event in the Pacific released a lot of heat from the ocean. 2001, by contrast, was somewhat cooler, though still a long way above the long-term average. 1992 was particularly cool, because of the eruption of a large volcano in the Philippines called Mount Pinatubo.

‘Climate’ is defined by averaging out all this variability over the longer term. So you won’t, by definition, see climate change from one year to the next – or even necessarily from one decade to the next. But look at the change in the average over the long term, and the trend is undeniable: the planet is getting hotter.

Look at the graph below, showing global temperatures over the last 25 years. These are NASA figures, using a global-mean temperature dataset known as GISTEMP. (Other datasets are available, for example from the UK Met Office. These differ slightly owing to varying assumptions and methodology, but show nearly identical trends.) Now imagine you were setting out to write Whitehouse’s article at some point in the past. You could plausibly have written that global warming had ‘stopped’ between 1983 and 1985, between 1990 and 1995, and, if you take the anomalously warm 1998 as the base year, between 1998 and 2004. Note, however, the general direction of the red line over this quarter-century period. Average it out and the trend is clear: up.

Note also the blue lines, scattered like matchsticks across the graph. These were helpfully added by the scientists at RealClimate.org (from where this graph is copied), partly in response to the Whitehouse article. They show 8-year trend lines – the temperature trend for every 8-year period covered in the graph.

You’ll notice that some of the lines, particularly in the earlier part of the period, point downwards. These are the periods when global warming ‘stopped’ for a whole 8 years (on average), in the flawed Whitehouse definition – although, as astute readers will have quickly spotted, the crucial thing is what year you start with. Start with a relatively warm year, and the average of the succeeding eight might trend downwards. In scientific parlance, this is called ‘cherry picking’, and explains how Whitehouse can assert that "since [1998] the global temperature has been flat" – although he is even wrong on this point of fact, because as the graph above shows, 2005 was warmer.
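To see how this cherry-picking works in practice, here is a minimal sketch of the 8-year trend-line calculation, using synthetic data rather than the real GISTEMP record (the 0.02°C/yr warming rate, the 0.1°C noise level and the year range are illustrative assumptions only). Even with a steadily rising underlying trend, some short windows come out flat or negative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration, NOT real GISTEMP data: 25 years of a
# steady 0.02 °C/yr warming trend plus interannual noise.
years = np.arange(1982, 2007)
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Fit a least-squares line to every 8-year window, mimicking the
# blue "matchstick" trend lines on the RealClimate graph.
window = 8
for start in range(years.size - window + 1):
    x = years[start:start + window]
    y = temps[start:start + window]
    slope = np.polyfit(x, y, 1)[0]  # °C per year over this window
    print(f"{x[0]}-{x[-1]}: {slope:+.3f} °C/yr")

# The long-term trend over the whole period remains positive.
print("25-year trend:", np.polyfit(years, temps, 1)[0], "°C/yr")
```

Run this with different seeds and you will usually find at least one downward-pointing window – exactly the pattern that lets a cherry-picked start year make warming appear to have ‘stopped’ while the 25-year trend stays firmly up.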

Note also how none of the 8-year trend lines point downwards in the last decade or so. This illustrates clearly how, far from having ‘stopped’, global warming has actually accelerated in more recent times. Hence the announcement by the World Meteorological Organisation on 13 December, as the Bali climate change meeting was underway, that the decade of 1998-2007 was the “warmest on record”. Whitehouse, and his fellow contrarians, are going to have to do a lot better than this if they want to disprove (or even dispute) the accepted theory of greenhouse warming.

The New Statesman’s position on climate change

Every qualified scientific body in the world, from the American Association for the Advancement of Science to the Royal Society, agrees unequivocally that global warming is both a reality and caused by man-made greenhouse gas emissions. But this doesn’t make them right, of course. Science, in the best Popperian definition, is only tentatively correct, until someone comes along who can disprove the prevailing theory. This leads to a frequent source of confusion, one repeated in the Whitehouse article: that because we don’t know everything, we know nothing, and should therefore do nothing. Using that logic we would close down every hospital in the land. Yes, every scientific fact is falsifiable – but that doesn’t make it wrong. On the contrary, the fact that it can be challenged (and hasn’t been successfully) is what makes it right.

Bearing all this in mind, what should a magazine like the New Statesman do in its coverage of the climate change issue? Newspapers and magazines have a difficult job of trying, often with limited time and information, to sort out truth from fiction on a daily basis, and communicating this to the public – quite an awesome responsibility when you think about it. Sometimes even a viewpoint which is highly likely to be wrong gets published anyway, because it sparks a lively debate and is therefore interesting. A publication that kept to a monotonous party line on all of the day’s most controversial issues would be very boring indeed.

However, readers of my column will know that I give contrarians, or sceptics, or deniers (call them what you will) short shrift, and as a close follower of the scientific debate on this subject I can state without doubt that there is no dispute whatsoever within the expert community as to the reality or causes of man-made global warming. But even then, just because all the experts agree doesn’t make them right – it just makes them extremely unlikely to be wrong. That in turn means that if someone begs to disagree, they need to have some very strong grounds for doing so – not misreading a basic graph or advancing silly conspiracy theories about IPCC scientists receiving paycheques from the New World Order, as some of Whitehouse’s respondents do.

So, a mistaken article reached a flawed conclusion. Intentionally or not, readers were misled, and the good name of the New Statesman has been used all over the internet by climate contrarians seeking to support their entrenched positions. This is regrettable. Good journalism should never exclude legitimate voices from a debate of public interest, but it also needs to distinguish between carefully checked fact and distorted misrepresentation in complex and divisive areas like this. The magazine’s editorial policy is unchanged: we want to see aggressive action to reduce carbon emissions, and support global calls for planetary temperatures to be stabilised at under two degrees above pre-industrial levels.

Yes, scientific uncertainties remain in every area of the debate. But consider how high the stakes are here. If the 99% of experts who support the mainstream position are right, then we have to take urgent action to reduce emissions or face some pretty catastrophic consequences. If the 99% are wrong, and the 1% right, we will be making some unnecessary efforts to shift away from fossil fuels, which in any case have lots of other drawbacks and will soon run out. I’d hate to offend anyone here, but that’s what I’d call a no-brainer.

Mark Lynas is an environmental activist and a climate change specialist. His books on the subject include High Tide: News from a Warming World and Six Degrees: Our Future on a Hotter Planet.

The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me from them. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit: monbiot.com/music/

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood