
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The age of outrage

Why are we so quick to take offence? The Private Eye editor on Orwell, Trump and the death of debate in post-truth politics.

Anyone who thinks that “post-truth politics” is anything new needs to be reminded that George Orwell was writing about this phenomenon 70 years before Donald Trump.

Audiences listening to President-Elect Trump’s extraordinary disregard for anything resembling objective truth – and his astonishing ability to proclaim the absolute opposite today of what he said yesterday – will be forcibly reminded of the slogans that George Orwell gave to his political dictators: Black is White, War is Peace, Freedom is Slavery, Ignorance is Strength (the last of which turned out to be true in the US election). But any journalist trying to work out what the speeches actually mean, amidst the mad syntax and all the repetition (“gonna happen, gonna happen”), cannot help but fall back on Orwell’s contention that “political chaos is connected with the decay of language”. And the sight of Trump praising Secretary Clinton for her years of public service in his post-election victory speech while the crowd was still chanting his campaign catchphrase of “Lock her up” was surely a perfect example of Doublethink.

No wonder Trump is an admirer of Vladimir Putin, who is an admirer of the Soviet strongmen whom Orwell satirised so well. These echoes from the past are very strong in America at present but there are plenty of them reverberating through British and European politics as well. Our Foreign Secretary managed to accuse other European leaders of a “whinge-o-rama” when they issued qualified statements of congratulation to the new president-elect, even though he himself had previously accused Trump of being “nuts”. Black is White, Remain is Leave, a Wall is a Fence, two plus two equals five: but Brexit means Brexit.

You may find this reassuring, in that we have been here before and survived – or distressing to think that we are regressing to a grimmer Orwellian age. But one of the worrying developments attached to these “post-truth” political figures is the increasing intolerance in public debate of dissent – or even disagreement – about what objective truth might be.

A great deal has been written recently about the influence of social media in helping people to become trapped in their own echo chambers, talking only to those who reinforce their views and dismissing not only other opinions, but also facts offered by those who disagree with them. When confronted by a dissenting voice, people get offended and then angry. They do not want to argue, they want the debate to be shut down. Trump supporters are furious with anyone who expresses reservations about their candidate. Pro-Brexit supporters are furious with anyone who expresses doubts about the way the process of leaving the European Union is going.

I edit the magazine Private Eye, which I sometimes think Orwell would have dismissed as “a tuppeny boys’ fortnightly”, and after the recent legal challenge to the government about Article 50 being put before parliament, we published the cover reproduced on page 25.

It was a fairly obvious joke, a variant of the “wheels coming off” gag. But it led to a large postbag of complaints, including a letter from a man who said he thought the cover was “repulsive”. He also said he wanted to come around and smash up the office and then shove our smug opinions so far up our arses that we choked our guts out.

There was one from a vicar, too, who told me that it was time to accept the victory of the majority of the people and to stop complaining. Acceptance was a virtue, he said. I wrote back and told him that this argument was a bit much, coming from a church that had begun with a minority of 12. (Or, on Good Friday, a minority of one.)

This has become a trend in those who complain: the magazine should be shouted down or, better still, closed down. In the light of this it was interesting to read again what Orwell said in his diary long before internet trolls had been invented:

 

We are all drowning in filth. When I talk to anyone or read the writings of anyone who has any axe to grind, I feel that intellectual honesty and balanced judgement have simply disappeared from the face of the earth. Everyone’s thought is forensic, everyone is simply putting a “case” with deliberate suppression of his opponent’s point of view, and, what is more, with complete insensitiveness to any sufferings except those of himself and his friends.

 

This was in 1942, when the arguments were about war and peace, life and death, and there were real fascists and Stalinists around rather than, say, people who disagree with you about the possibility of reconciling freedom of movement with access to the single European market.

Orwell also made clear, in an essay called “As I Please” in Tribune in 1944, that what we think of as the new online tendency to call everyone who disagrees with you a fascist is nothing new. He wrote then:

 

It will be seen that, as used, the word “Fascism” is almost entirely meaningless. In conversation, of course, it is used even more wildly than in print. I have heard it applied to farmers, shopkeepers, Social Credit, corporal punishment, fox-hunting, bull-fighting, the 1922 Committee [a Tory group], the 1941 Committee [a left-liberal group], Kipling, Gandhi, Chiang Kai-Shek, homosexuality, Priestley’s broadcasts, Youth Hostels, astrology, women, dogs and I do not know what else.

 

When Orwell writes like this about the level of public debate, one is unsure whether to feel relieved at the sense of déjà vu or worried about the possibility of history repeating itself, not as farce, but as tragedy again.

The mood and tone of public opinion is an important force in the way our society and our media function. Orwell wrote about this in an essay called “Freedom of the Park”, published in Tribune in December 1945. Five people had been arrested outside Hyde Park for selling pacifist and anarchist publications. Orwell was worried that, though they had been allowed to publish and sell these periodicals throughout the entire Second World War, there had been a shift in public opinion that meant that the police felt confident to arrest these people for “obstruction” and no one seemed to mind this curtailment of freedom of speech except him. He wrote:

 

The relative freedom which we enjoy depends on public opinion. The law is no protection. Governments make laws, but whether they are carried out, and how the police behave, depends on the general temper in the country. If large numbers of people are interested in freedom of speech, there will be freedom of speech, even if the law forbids it; if public opinion is sluggish, inconvenient minorities will be persecuted, even if laws exist to protect them.

 

This is certainly true for the press today, whose reputation in the past few years has swung violently between the lows of phone-hacking and the highs of exposing MPs’ expenses. In 2011 I remember at one point a football crowd shouting out the name of Ryan Giggs, who had a so-called superinjunction in place forbidding anyone to mention that he was cheating on his wife and also forbidding anyone to mention the fact that he had taken out a superinjunction. He was named on Twitter 75,000 times. It seemed clear that public opinion had decided that his private life should be made public. The freedom of the press was briefly popular. Later the same year it was revealed that the murdered schoolgirl Milly Dowler’s phone had been hacked by the News of the World, along with those of a number of high-profile celebrities, and the public decided that actually journalists were all scumbags and the government should get Lord Leveson to sort them out. Those who maintained that the problem was that the existing laws (on trespass, contempt, etc) were not enforced because of an unhealthy relationship between the police, the press and the politicians were not given much credence.

In a proposed preface to his 1945 novel, Animal Farm, Orwell wrote: “If liberty means anything at all, it means the right to tell people what they do not want to hear.”

This is the quotation that will accompany the new statue of Orwell that has now been commissioned by the BBC and which will stand as a sort of rebuke to the corporation whenever it fails to live up to it. The BBC show on which I appear regularly, Have I Got News for You, has been described simultaneously in the online comments section as “overprivileged, right-wing Tory boys sneering at the working class” and “lefty, metropolitan liberal elite having a Labour luvvie whinge-fest”. Disturbing numbers of complainants feel that making jokes about the new president-elect should not be allowed, since he has won the election. Humour is not meant to be political, assert the would-be censors – unless it attacks the people who lost the vote: then it is impartial and neutral. This role for comedy would have surprised Orwell, who was keen on jokes. He wrote of Charles Dickens:

 

A joke worth laughing at always has an idea behind it, and usually a subversive idea. Dickens is able to go on being funny because he is in revolt against authority, and authority is always there to be laughed at. There is always room for one more custard pie.

 

I think there is also room for a custard pie or two to be thrown against those who claim to be outsiders, against authority and “the system”, and use this as a way to take power. The American billionaire property developer who is the champion of those dispossessed by global capitalism seems a reasonable target for a joke. Just like his British friend, the ex-public-school boy City trader-turned-critic of the Home Counties elite.

The emblematic quotation on liberty is from a preface that was not published until 1972 in the Times Literary Supplement. A preface about freedom of speech that was censored? It is almost too neatly Orwellian to be true, and in fact no one seems to know exactly why it did not appear. Suffice to say that it is fascinating to read Orwell complaining that a novel which we all now assume to be a masterpiece – accurate about the nature of revolution and dictatorship and perfect for teaching to children in schools – was once considered to be unacceptably, offensively satirical.

The target of the satire was deemed to be our wartime allies the Russians. It is difficult to imagine a time, pre-Putin, pre-Cold War, when they were not seen as the enemy. But of course the Trump presidency may change all that. Oceania may not be at war with Eurasia any more. Or it may always have been at war with Eastasia. It is difficult to guess, but in those days the prevailing opinion was that it was “not done” to be rude about the Russians.

Interestingly there is now a significant faction on the British left, allied with the current leader of the Labour Party, who share this view.

 

The right to tell people what they do not want to hear is still the basis of freedom of expression. If that sounds like I am stating the obvious – I am. But, in my defence, Orwell once wrote in a review of a book by Bertrand Russell published in the Adelphi magazine in January 1939:

 

. . . we have now sunk to a depth at which the restatement of the obvious is the first duty of intelligent men.

 

Orwell himself managed to come round to a position of accepting that an author could write well and truthfully about a subject even if one disapproved of the author’s politics: both Kipling and Swift were allowed to be right even though they were not left enough. So I am hoping that we can allow Orwell to be right about the principles of freedom of expression.

In the unpublished preface to Animal Farm he writes:

 

The issue involved here is quite a simple one: Is every opinion, however unpopular – however foolish, even – entitled to a hearing? Put it in that form and nearly any English intellectual will feel that he ought to say “Yes”. But give it a concrete shape, and ask, “How about an attack on Stalin? Is that entitled to a hearing?”, and the answer more often than not will be “No”. In that case the current orthodoxy happens to be challenged, and so the principle of free speech lapses.

 

One can test oneself by substituting contemporary names for Stalin and seeing how you feel. Putin? Assange? Mandela? Obama? Snowden? Hillary Clinton? Angela Merkel? Prince Harry? Mother Teresa? Camila Batmanghelidjh? The Pope? David Bowie? Martin Luther King? The Queen?

Orwell was always confident that the populist response would be in favour of everyone being allowed their own views. That might be different now. If you were to substitute the name “Trump” or “Farage” and ask the question, you might not get such a liberal response. You might get a version of: “Get over it! Suck it up! You lost the vote! What bit of ‘democracy’ do you not understand?”

Orwell quotes from Voltaire (the attribution is now contested): “I detest what you say; I will defend to the death your right to say it.” Most of us would agree with the sentiment, but there is a worrying trend in universities that is filtering through into the media and the rest of society. Wanting a “safe space” in which you do not have to hear views that might upset you and demanding trigger warnings about works of art that might display attitudes which you find offensive are both part of an attempt to redefine as complex and negotiable what Orwell thought was simple and non-negotiable. And this creates problems.

Cartoon: "Voltaire goes to uni", by Russell and originally published in Private Eye.

We ran a guide in Private Eye as to what a formal debate in future universities might look like.

 

The proposer puts forward a motion to the House.

The opposer agrees with the proposer’s motion.

The proposer wholeheartedly agrees that the opposer was right to support the motion.

The opposer agrees that the proposer couldn’t be more right about agreeing that they were both right to support the motion.

When the debate is opened up to the floor, the audience puts it to the proposer and the opposer that it isn’t really a debate if everyone is just agreeing with each other.

The proposer and the opposer immediately agree to call security and have the audience ejected from the debating hall.

And so it goes on, until the motion is carried unanimously.

 

This was dismissed as “sneering” and, inevitably, “fascist” by a number of student commentators. Yet it was only a restatement of something that Orwell wrote in the unpublished preface:

 

. . . everyone shall have the right to say and to print what he believes to be the truth, provided only that it does not harm the rest of the community in some quite unmistakable way. Both capitalist democracy and the western versions of socialism have till recently taken that principle for granted. Our Government, as I have already pointed out, still makes some show of respecting it.

 

This is not always the case nowadays. It is always worth a comparison with the attitudes of other countries that we do not wish to emulate. The EU’s failure to confront President Erdogan’s closure of newspapers and arrests of journalists in Turkey because it wants his help to solve the refugee crisis is one such obvious example. An old German law to prosecute those making fun of foreign leaders was invoked by Erdogan and backed by Mrs Merkel. This led Private Eye to run a competition for Turkish jokes. My favourites were:

 

“Knock knock!”

“Who’s there?”

“The secret police.”

 

What do you call a satirist in Turkey?

An ambulance.

 

As Orwell wrote in even more dangerous times, again in the proposed preface:

 

. . . the chief danger to freedom of thought and speech at this moment is not the direct interference of the [Ministry of Information] or any official body. If publishers and editors exert themselves to keep certain topics out of print, it is not because they are frightened of prosecution but because they are frightened of public opinion.

 

I return to stating the obvious, because it seems to be less and less obvious to some of the current generation. This is particularly true for those who have recently become politically engaged for the first time. Voters energised by Ukip and the EU referendum debate, or by the emergence of Jeremy Corbyn as leader of the Labour Party, or by the resurgence of Scottish nationalism or by the triumph of Trump, have the zeal of the newly converted. This is all very admirable, and a wake-up call to their opponents – the Tartan Tories and the Remoaners and the Neo-Blairites and the Washington Liberal Elite – but it is not admirable when it is accompanied by an overpowering desire to silence any criticism of their ideas, policies and leading personalities. Perhaps the supporters of the mainstream parties have simply become accustomed to the idea over the decades, but I have found in Private Eye that there is not much fury from the Tory, New Labour or Liberal camps when their leaders or policies are criticised, often in much harsher ways than the newer, populist movements.

 

 

So, when Private Eye suggested that some of the claims that the Scottish National Party was making for the future of an independent Scotland might be exaggerated, there were one or two readers who quoted Orwell’s distinction between patriotism being the love of one’s country and nationalism being the hatred of others – but on the whole it was mostly: “When if ever will you ignorant pricks on the Eye be sharp enough to burst your smug London bubble?”

Those who disagreed with the SNP were beneath contempt if English and traitors if Scottish. This was matched by the sheer fury of the Corbyn loyalists at coverage of his problems with opposition in his own party. When we suggested that there might be something a bit fishy about his video on the lack of seats on the train to Newcastle, responses included: “I had hoped Private Eye was outside the media matrix. Have you handed over control to Rupert Murdoch?”

Their anger was a match for that of the Ukippers when we briefly ran a strip called At Home With the Ukippers and then made a few jokes about their leader Mr Farage: “Leave it out, will you? Just how much of grant/top up/dole payment do you lot get from the EU anyway? Are you even a British publication?”

In 1948, in an essay in the Socialist Leader, Orwell wrote:

 

Threats to freedom of speech, writing and action, though often trivial in isolation, are cumulative in their effect and, unless checked, lead to a general disrespect for the rights of the citizen.

 

In other words, the defence of freedom of speech and expression is not just special pleading by journalists, writers, commentators and satirists, but a more widespread conviction that it protects “the intellectual liberty which without a doubt has been one of the distinguishing marks of Western civilisation”.

In gloomy times, there was one letter to Private Eye that I found offered some cheer – a willingness to accept opposing viewpoints and some confirmation of a belief in the common sense of Orwell’s common man or woman. In response to the cartoon below, our correspondent wrote:

 

Dear sir,

I suffer from a bipolar condition and when I saw your cartoon I was absolutely disgusted. I looked at it a few days later and thought it was hilarious.

 

Ian Hislop is the editor of Private Eye. This is an edited version of his 2016 Orwell Lecture. For more details, visit: theorwellprize.co.uk

This article first appeared in the 01 December 2016 issue of the New Statesman, Age of outrage