
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The people is sublime: the long history of populism, from Robespierre to Trump

If liberal democracy is to survive, the tide of populism will have to be turned back. The question is: how?

A spectre of populism is haunting the world’s liberal democracies. Donald Trump’s victory in the US presidential election, the narrow Leave majority in the EU referendum, Theresa May’s decision to call a snap election – breaking the spirit of the Fixed-Term Parliaments Act passed by the government of which she was a member – and Recep Tayyip Erdogan’s victory in the recent Turkish referendum all testify to the strength of the populist tide that is sweeping through the North Atlantic world. The consequences have been calamitous: a shrunken public realm, a demeaned civic culture, threatened minorities, contempt for the rule of law and an increasingly ugly public mood. If liberal democracy is to survive, the tide will have to be turned back. The question is: how?

The first essential is to understand the nature of the beast. This is more difficult than it sounds. Most democratic politicians seek popularity, but populism and popularity are not the same. Today’s populism is the descendant of a long line of ancestors. The first unmistakably populist movement in history appeared well over two centuries ago during the later stages of the French Revolution. It was led by Robespierre (Thomas Carlyle’s “sea-green incorruptible”) and the Jacobins who promised a reign of “virtue”. They were inspired by the cloudy prose of Jean-Jacques Rousseau, who believed that mere individuals should be subject to the general will of the social whole and – if necessary – “forced to be free”. As the revolution gathered pace and foreign armies mustered on France’s frontiers, the Jacobins launched the first organised, state-led and ideologically legitimised Terror in history. Chillingly, Robespierre declared, “The people is sublime, but individuals are weak.” That is the cry of populists through the ages. Appropriately, the Terror ended with Robespierre lying on a plank, screaming with pain before he was executed by guillotine.

The French Revolution – which began with the storming of the Bastille and ended with Napoleon’s ascent to an ersatz imperial throne – has an epic quality about it missing from later chapters in the populist story. Ironically, the second chapter, which opened half a century later, was the work of Louis Bonaparte, nephew of the great Napoleon. In 1848 came a second revolution and a second Republic; Louis Bonaparte was elected president by a huge majority. He tried and failed to amend the constitution to make it possible for him to have a second term; and then seized power in a coup d’état. Soon afterwards he became emperor as Napoleon III. (“Napoleon le petit”, in Victor Hugo’s savage phrase.) The whole story provoked one of Karl Marx’s best aphorisms: “History repeats itself; the first time as tragedy and the second as farce.”

There have been plenty of tragedies since – and plenty of farces, too. Trump’s victory was a tragedy, but farcical elements are already in evidence. Erdogan’s victory was even more tragic than Trump’s, but farce is conspicuously absent. The Leave victory in the referendum was tragic: arguably, the greatest tragedy in the three-century history of Britain’s union state. As with Trump, farce is already in evidence – the agitated comings and goings that have followed Theresa May’s loss of her Commons majority; the inane debate over the nature of the Brexit that Britain should seek; and the preposterous suggestion that, freed of the “Brussels” incubus, Britain will be able to conclude costless trade deals with the state-capitalist dictatorship of China and the “America First” neo-isolationists in Washington, DC. Unlike the French farce of Napoleon III’s Second Empire, however, the British farce now in progress is more likely to provoke tears than laughter.



Populism is not a doctrine or a governing philosophy, still less an ideology. It is a disposition, perhaps a mood, a set of attitudes and above all a style. The People’s Party, which played a significant part in American politics in the late 19th century, is a case in point. The farmers whose grievances inspired the People’s Party wanted cheaper credit and transport to carry their products to markets in the eastern states. Hence the party’s two main proposals. One was the nationalisation of the railways, to cheapen transport costs; the other was “free silver” – the use of silver as well as gold as currency, supposedly to cheapen credit. Even then, this was not a particularly radical programme. It was designed to reform capitalism, not to replace it, as the largely Marxist social-democratic parties of Europe were seeking to do.

Rhetoric was a different matter. Mary Elizabeth Lease, a prominent member of the People’s Party, declared that America’s was no longer a government of the people by the people and for the people, but “a government of Wall Street, by Wall Street and for Wall Street”. The common people of America, she added, “are slaves and monopoly is the master”.

The Georgian populist Tom Watson once asked if Thomas Jefferson had dreamed that the party he founded would be “prostituted to the vilest purposes of monopoly” or that it would be led by “red-eyed Jewish millionaires”. The People’s Party’s constitutive Omaha Platform accused the two main parties of proposing “to sacrifice our homes, lives and children on the altar of Mammon; to destroy the multitude in order to secure corruption funds from the millionaires”. The party’s aim was “to restore the government of the Republic to the hands of ‘the plain people’ with which class it originated”. Theodore Roosevelt promised to “speak softly and carry a big stick”. The People’s Party spoke noisily and carried a small stick. Jeremy Corbyn would have been at home in it.

Almost without exception, populists promise national regeneration in place of decline, decay and the vacillations and tergiversations of a corrupt establishment and the enervated elites that belong to it. Trump’s call to “make America great again” is an obvious recent case. His attacks on “crooked Hillary”, on the courts that have impeded his proposed ban on Muslim immigrants from capriciously chosen Middle Eastern and African countries, on the “fake news” of journalists seeking to hold his administration to account, and, most of all, his attack on the constitutional checks and balances that have been fundamental to US governance for more than 200 years, are the most alarming examples of populist practice, not just in American history but in the history of most of the North Atlantic world.

There are intriguing parallels between Trump’s regime and Erdogan’s. Indeed, Trump went out of his way to congratulate Erdogan on Turkey’s referendum result in April – which gives him the right to lengthen his term of office to ten years, to strengthen his control over the judiciary and to decide when to impose a state of emergency. Even before the referendum, he had dismissed more than 100,000 public servants, including teachers, prosecutors, judges and army officers; 4,000 were imprisoned. The Kurdish minority was – and is – repressed. True, none of this applies to Trump. But the rhetoric of the thin-skinned, paranoid US president and his equally thin-skinned and paranoid Turkish counterpart comes from the same repertoire. In the Turkish referendum Erdogan declared: “My nation stood upright and undivided.” It might have been Trump clamorously insisting that the crowd at his inauguration was bigger than it was.

***

The best-known modern British populists – Margaret Thatcher, Nigel Farage and David Owen – form a kind of counterpoint. In some ways, all three have harked back to the themes of the 19th-century American populists. Thatcher insisted that she was “a plain, straightforward provincial”, adding that her “Bloomsbury” was Grantham – “Methodism, the grocer’s shop, Rotary and all the serious, sober virtues, cultivated and esteemed in that environment”. Farage declared that the EU referendum was “a victory for ‘the real people’ of Britain” – implying, none too subtly, that the 48 per cent who voted Remain were somehow unreal or, indeed, un-British.

On a holiday job on a building site during the Suez War, Owen experienced a kind of epiphany. Hugh Gaitskell was criticising Anthony Eden, the prime minister, on television and in the House of Commons, but Owen’s workmates were solidly in favour of Eden. That experience, he said, made him suspicious of “the kind of attitude which splits the difference on everything. The rather defeatist, even traitorous attitude reflected in the pre-war Apostles at Cambridge.” (Owen voted for Brexit in 2016.)

Did he really believe that Bertrand Russell, John Maynard Keynes and George Moore were traitorous? Did he not know that they were Apostles? Or was he simply lashing out, Trump-like, at an elite that disdained him – and to which he yearned to belong?

Thatcher’s Grantham, Farage’s real people and David Owen’s workmates came from the same rhetorical stable as the American populists’ Omaha Platform. But the American populists really were plain, in their sense of the word, whereas Thatcher, Farage and Owen could hardly have been less so. Thatcher (at that stage Roberts) left Grantham as soon as she could and never looked back. She went to Somerville College, Oxford, where she was a pupil of the Nobel laureate Dorothy Hodgkin. She married the dashing and wealthy Denis Thatcher and abandoned science to qualify as a barrister before being elected to parliament and eventually becoming prime minister. Farage worked as a metals trader in the City before becoming leader of the UK Independence Party. Owen went to the private Bradfield College before going up to Cambridge to read medicine. Despite his Welsh antecedents, he looks and sounds like a well-brought-up English public school boy. He was elected to parliament in 1966 at the age of 28 and was appointed under-secretary for the navy at 30. He then served briefly as foreign secretary in James Callaghan’s miserable Labour government in the 1970s.

Much the same is true of Marine Le Pen in France. She is a hereditary populist – something that seems self-contradictory. The Front National (FN) she heads was founded by her father, Jean-Marie Le Pen – Holocaust denier, anti-Semite, former street brawler and sometime Poujadist. In the jargon of public relations, she has worked hard to “de-toxify” the FN brand. But the Front is still the Front; it appeals most strongly to the ageing and insecure in the de-industrialised areas of the north-east. Marine Le Pen applauded the Leave victory in Britain’s referendum – she seeks to limit immigration, just as Ukip did in the referendum and as the May government does now.

Above all, the Front National appeals to a mythologised past, symbolised by the figure of Joan of Arc. Joan was a simple, illiterate peasant from an obscure village in north-eastern France, who led the French king’s forces to a decisive victory over the English in the later stages of the Hundred Years War. She was captured by England’s Burgundian allies, and the English burned her at the stake at the age of 19. She was beatified in 1909 and canonised in 1920. For well over a century, she has been a heroine for the Catholic French right, for whom the revolutionary triad of liberté, égalité, fraternité is either vacuous or menacing.

***

The past to which the FN appeals is uniquely French. It is also contentious. A struggle over the ownership of the French past has been a theme of French politics ever since the French Revolution. But other mythologised pasts have figured again and again in populist rhetoric and still do. Mussolini talked of returning to the time of the Roman empire when the Mediterranean was Mare Nostrum. Trump’s “Make America great again” presupposes a past when America was great, and from which present-day Americans have strayed, thanks to Clintonesque crooks and the pedlars of fake news. “Take back control” – the mantra of the Brexiteers in the referendum – presupposes a past in which the British had control; Owen’s bizarre pre-referendum claim that, if Britain left the EU, she would be free to “rediscover the skills of blue water diplomacy” presupposed a time when she practised those skills. Vladimir Putin, another populist of sorts, is patently trying to harness memories of tsarist glory to his chariot wheels. Margaret Thatcher, the “plain, straightforward provincial” woman, sought to revive the “vigorous virtues” of her Grantham childhood and the “Victorian values” that underpinned them.

As well as mythologising the past, populists mythologise the people. Those for whom they claim to speak are undifferentiated, homogeneous and inert. Populists have nothing but contempt for de Tocqueville’s insight that the ever-present threat of majority tyranny can be kept at bay only by a rich array of intermediate institutions, including townships, law courts and a free press, underpinned by the separation of powers.

For populists, the threat of majority tyranny is a phantom, invented by out-of-touch and craven elitists. Law courts that stand in the way of the unmediated popular will are “enemies of the people”, as the Daily Mail put it. There is no need to protect minorities against the tyranny of the majority: minorities are either part of the whole, in which case they don’t need protection, or self-excluded from it, in which case they don’t deserve to be protected.

Apparent differences of interest or value that cut across the body of the people, that divide the collective sovereign against itself, are products of elite manipulation or, in Thatcher’s notorious phrase, of “the enemy within”. For there is a strong paranoid streak in the populist mentality. Against the pure, virtuous people stand corrupt, privileged elites and sinister, conspiratorial subversives. The latter are forever plotting to do down the former.

Like pigs searching for truffles, populists search for subversives. Inevitably, they find what they are looking for. Joe McCarthy was one of the most squalid examples of the populist breed: for years, McCarthyism was a baneful presence in Hollywood, American universities, newspaper offices and the public service, ruining lives, restricting free expression and making it harder for the United States to win the trust of its European allies. The barrage of hatred and contempt that the tabloid press unleashed on opponents of Theresa May’s pursuit of a “hard” Brexit is another example. Her astounding claim that a mysterious entity known as “Brussels” was seeking to interfere in the British general election is a third.

As the Princeton political scientist Jan-Werner Müller argues, all of this strikes at the heart of democratic governance. Democracy depends on open debate, on dialogue between the bearers of different values, in which the protagonists learn from each other and from which they emerge as different people. For the Nobel laureate, philosopher and economist Amartya Sen, democracy is, above all, “public reasoning”; and that is impossible without social spaces in which reasoning can take place. Populism is singular; democracy is plural. The great question for non-populists is how to respond to the populist threat.

Two answers are in contention. The first is Theresa May’s. It amounts to appeasement. May’s purported reason for calling a snap general election was that the politicians were divided, whereas the people were united. It is hard to think of a better – or more frightening – summary of the spirit of populism. The second answer is Emmanuel Macron’s. For the moment, at least, he is astonishingly popular in France. More important, his victory over Le Pen has shown that, given intelligence, courage and generosity of spirit, the noxious populist tide can be resisted and, perhaps, turned back. 

David Marquand’s most recent book is “Mammon’s Kingdom: An Essay on Britain Now” (Allen Lane)
