
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.


This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


A nervous breakdown in the body politic

Are we too complacent in thinking that the toxic brew of paranoia and populism that brought Hitler to power will never be repeated?

The conventional wisdom holds that “all that is necessary for the triumph of evil is that good men do nothing”, in Edmund Burke’s familiar phrase; but this is at best a half-truth. Studying the biography of a moral monster triumphantly unleashed on the political and international stage points us to another perspective, no less important. What is necessary for the triumph of evil is that the ground should have been thoroughly prepared by countless small or not-so-small acts of petty malice, unthinking prejudice and collusion. Burke’s axiom, though it represents a powerful challenge to apathy, risks crediting evil with too much of a life of its own: out there, there are evil agencies, hostile to “us”, and we (good men and women) must mobilise to resist.

No doubt; but mobilising intelligently demands being willing to ask what habits and assumptions, as well as what chances and conditions, have made possible the risk of evil triumphing. And that leads us into deep waters, to a recognition of how what we tolerate or ignore or underestimate opens the way for disaster, the ways in which we are at least half-consciously complicit. If this is not to be the silly we-are-all-guilty response that has rightly been so much mocked, nor an absolution for the direct agents of great horrors, it needs a careful and unsparing scrutiny of the processes by which cultures become corruptible, vulnerable to the agendas of damaged and obsessional individuals.

This can be uncomfortable. It raises the awkward issue of what philosophers have learned to call “moral luck” – the fact that some people with immense potential for evil don’t actualise it, because the circumstances don’t present them with the chance, and that some others who might have spent their lives in blameless normality end up supervising transports to Auschwitz. Or, to take a sharply contemporary example, that one Muslim youth from a disturbed or challenging background becomes a suicide bomber but another from exactly the same background doesn’t. It is as though there were a sort of diabolical mirror image for the biblical Parable of the Sower: some seeds grow and some don’t, depending on the ground they fall on, or what chance external stimulus touches them at critical moments.

If what interests us is simply how to assign individuals rapidly and definitively to the categories of sheep and goats, saved and damned, this is offensively frustrating. But if we recognise that evil is in important respects a shared enterprise, we may be prompted to look harder at those patterns of behaviour and interaction that – in the worst cases – give permission to those who are most capable of extreme destructiveness, and to examine our personal, political and social life in the light of this.

***

It would be possible to argue that the anti-Semitism of a lot of German culture – as of European Christian culture overall – was never (at least in the modern period) genocidal and obsessed with absolute racial purity; limited but real possibilities of integration were taken for granted, converts to Christianity were not disadvantaged merely because of their race, and so on. Yet the truth is that this cultural hinterland offered a foothold to the mania of Adolf Hitler; that it gave him just enough of the permission he needed to identify his society’s problems with this clearly definable “alien” presence. In his new book, Hitler: the Ascent, Volker Ullrich compellingly tells us once again that no one could have been under any illusion about Hitler’s general intentions towards the Jews from his very first appearance as a political figure, even if the detailed planning of genocide (lucidly traced in the late David Cesarani’s recent, encyclopaedic Final Solution) took some time to solidify. Yet so much of the German public heard Hitler’s language as the slightly exaggerated version of a familiar trope and felt able to treat it as at worst an embarrassing overstatement of a common, even a common-sense, view. One of the most disturbing things about this story is the failure of so many (inside and outside Germany) to grasp that Hitler meant what he said; and this failure in turn reinforced the delusion of those who thought they could use and then sideline Hitler.

To say that Hitler “meant what he said”, however, can be misleading. It is one of the repeated and focal themes in Ullrich’s book that Hitler was a brazen, almost compulsive liar – or, perhaps better, a compulsive and inventive actor, devising a huge range of dramatic roles for himself: frustrated artist, creative patron, philosopher-king (there is a fine chapter on the intellectual and artistic circle he assembled frequently at his Berchtesgaden residence), workers’ friend, martyr for his people (he constantly insinuated that he believed himself doomed to a tragic and premature death), military or economic messiah and a good deal else besides. His notorious outbursts of hysterical rage seem to have been skilfully orchestrated as instruments of intimidation (though this did not exactly indicate that he was otherwise predictable). Ullrich devotes a fair measure of attention to the literal staging of National Socialism, the architectural gigantism of Albert Speer which gave the Führer the sophisticated theatre he craved. In all sorts of ways, Hitler’s regime was a profoundly theatrical exercise, from the great public displays at Nuremberg and the replanning of Berlin to the various private fantasies enacted by him and his close associates (Göring above all), and from the emotional roller coaster he created for his circle to the dangerously accelerated rate of military-industrial expansion with which he concealed the void at the centre of the German economy.

Theatre both presupposes and creates a public. In the anxiety and despair of post-Versailles Germany, there was a ready audience for the high drama of Nazism, including its scapegoating of demonic enemies within and without. And in turn, the shrill pitch of Hitler’s quasi-liturgies normalised a whole set of bizarre and fantastic constructions of reality. A N Wilson’s challenging novel Winnie and Wolf, a fantasia on Hitler’s relations with Winifred Wagner, culminates in a scene at the end of the war where refugees and destitute citizens in Bayreuth raid the wardrobe of the opera house and wander the streets dressed in moth-eaten costumes; it is an unforgettable metaphor for one of the effects of Hitlerian theatre. Ullrich leaves his readers contemplating the picture of a vast collective drama centred on a personality that was not – as some biographers have suggested – something of a cipher, but that of a fantasist on a grand scale, endowed with a huge literal and metaphorical budget for staging his work.

All of this prompts questions about how it is that apparently sophisticated political systems succumb to corporate nervous breakdowns. It is anything but an academic question in a contemporary world where theatrical politics, tribal scapegoating and variegated confusions about the rule of law are increasingly in evidence. On this last point, it is still shocking to realise how rapidly post-Versailles Germany came to regard violent public conflict between heavily armed militias as almost routine, and this is an important background to the embittered negotiations later on around the relation between Hitler’s Sturmabteilung and the official organs of state coercion. Ullrich’s insightful account of a de facto civil war in Bavaria in the early 1920s makes it mercilessly plain that any pretensions to a state monopoly of coercion in Germany in this period were empty.

Yet the idea of such a state monopoly is in fact essential to anything that could be called a legitimate democracy. In effect, the polity of the Third Reich “privatised” coercion: again and again in Ullrich’s book, in the struggles for power before 1933, we see Nazi politicians successfully bidding for control of the mechanisms of public order in the German regions, and more or less franchising public order to their own agencies. A classical democratic political philosophy would argue that the state alone has the right to use force because the state is the guarantor of every community’s and every individual’s access to redress for injury or injustice. If state coercion becomes a tool for any one element in the social complex, it loses legitimacy. It is bound up with the rule of law, which is about something more than mere majority consent. One way of reading the rise of Hitler and National Socialism is as the steady and consistent normalising of illegitimate or partisan force, undermining any concept of an independent guarantee of lawfulness in society. It is the deliberate dissolution of the idea of a Rechtsstaat, a law-governed state order that can be recognised by citizens as organised for their common and individual good. Rule by decree, the common pattern of Nazi governmental practice, worked in harness with law enforcement by a force that was essentially a toxic hybrid, combining what was left of an independent police operation with a highly organised party militia system.

So, one of the general imperatives with which Hitler’s story might leave us is the need to keep a clear sense of what the proper work of the state involves. Arguments about the ideal “size” of the state are often spectacularly indifferent to the basic question of what the irreducible functions of state authority are – and so to the question of what cannot be franchised or delegated to non-state actors (it is extraordinary that we have in the UK apparently accepted without much debate the idea that prison security can be sold off to private interests). This is not the same as saying that privatisation in general leads to fascism; the issues around the limits to state direction of an economy are complex. However, a refusal to ask some fundamental questions about the limits of “franchising” corrodes the idea of real democratic legitimacy – the legitimacy that arises from an assurance to every citizen that, whatever their convictions or their purchasing power, the state is there to secure their access to justice. And, connected with this, there are issues about how we legislate: what are the proper processes of scrutiny for legislation, and how is populist and short-view legislation avoided? The Third Reich offers a masterclass in executive tyranny, and we need not only robust and intelligent counter-models, but a clear political theory to make sense of and defend those models.

***

Theatre has always been an aspect of the political. But there are different kinds of theatre. In ancient Athens, the annual Dionysia festival included the performance of tragedies that forced members of the audience to acknowledge the fragility of the political order and encouraged them to meditate on the divine interventions that set a boundary to vendetta and strife. Classical tragedy is, as political theatre, the exact opposite of Hitlerian drama, which repeatedly asserted the solid power of the Reich, the overcoming of weakness and division by the sheer, innate force of popular will as expressed through the Führer.

Contemporary political theatre is not – outside the more nakedly totalitarian states – a matter of Albert Speer-like spectacle and affirmation of a quasi-divine leader; but it is increasingly the product of a populist-oriented market, the parading of celebrities for popular approval, with limited possibilities for deep public discussion of policies advanced, and an assumption that politicians will be, above all, performers. It is not – to warn once again against cliché and exaggeration – that celebrity culture in politics is a short route to fascism. But a political theatre that never deals with the fragility of the context in which law and civility operate, that never admits the internal flaws and conflicts of a society, and never allows some corporate opening-up to the possibilities of reconciliation and reparation, is one that exploits, rather than resolves, our anxieties. And, as such, it makes us politically weaker, more confused and fragmented.

The extraordinary mixture of farce and menace in Donald Trump’s campaign is a potent distillation of all this: a political theatre, divorced from realism, patience and human solidarity, bringing to the surface the buried poisons of a whole system and threatening its entire viability and rationality. But it is an extreme version of the way in which modern technology-and-image-driven communication intensifies the risks that beset the ideals of legitimate democracy.

And – think of Trump once again – one of the most seductively available tricks of such a theatre is the rhetoric of what could be called triumphant victimhood: we are menaced by such and such a group (Jews, migrants, Muslims, Freemasons, international business, Zionism, Marxism . . .), which has exerted its vast but covert influence to destroy us; but our native strength has brought us through and, given clear leadership, will soon, once and for all, guarantee our safety from these nightmare aliens.

***

This is a rhetoric that depends on ideas of collective guilt or collective malignity: plots ascribed to the agency of some dangerous minority are brandished in order to tarnish the name of entire communities. The dark legacy of much popular Christian language about collective Jewish guilt for the death of Jesus could be translated without much difficulty into talk about the responsibility of Jews for the violence and poverty afflicting Germans in the 1920s. (Shadows of the same myths still affect the way in which – as recent reports suggest – sinister, vague talk about Zionism and assumptions of a collective Jewish guilt for the actions of various Israeli politicians can become part of a climate that condones anti-Semitic bullying, or text messages saying “Hitler had a point”, on university campuses.)

Granted that there is no shortage of other candidates for demonic otherness in Europe and the United States (witness Trump’s language about Muslims and Mexicans), the specific and abiding lesson of Nazi anti-Semitism is the twofold recognition of the ease with which actually disadvantaged communities can be cast in the role of all-powerful subverters, and the way in which the path to violent exclusion of one kind or another can be prepared by cultures of casual bigotry and collective anxiety or self-pity, dramatised by high-temperature styles of media communication.

Marie Luise Knott’s recent short book Unlearning With Hannah Arendt (2014) revisits the controversy over Arendt’s notorious characterisation of the mindset of Nazism as “the banality of evil”, and brilliantly shows how her point is to do with the erosion in Hitlerian Germany of the capacity to think, to understand one’s agency as answerable to more than public pressure and fashion, to hold to notions of honour and dignity independent of status, convention or influence – but also, ultimately, the erosion of a sense of the ridiculous. The victory of public cliché and stereotype is, in Arendt’s terms, a protection against reality, “against the claim on our thinking attention that all events and facts make by virtue of their existence”, as she memorably wrote in The Life of the Mind. Hitler was committed to the destruction of anything that challenged the simple self-identity and self-justification of the race and the nation; hence, as Ullrich shows in an acutely argued chapter of Hitler: a Biography, the Führer’s venom against the churches, despite their (generally) embarrassingly lukewarm resistance to the horrors of the Reich. The problem was that the churches’ rationale entailed just that accountability to more than power and political self-identity that Nazi philosophy treated as absolute. They had grounds for thinking Nazism not only evil, but absurd. Perhaps, then, one of the more unexpected questions we are left with by a study of political nightmare such as Ullrich’s excellent book is how we find the resources for identifying the absurd as well as for clarifying the grounds of law and honour.

The threats now faced by “developed” democracy are not those of the 1920s and 1930s; whatever rough beasts are on their way are unlikely to have the exact features of Hitler’s distinctive blend of criminality and melodrama. But this does not mean that we shouldn’t be looking as hard as we can at the lessons to be learned from the collapse of political legality, the collective panics and myths, the acceptance of delusional and violent public theatre that characterised Hitler’s Germany. For evil to triumph, what is necessary is for societies to stop thinking, to stop developing an eye for the absurd as well as the corrupt in language and action, public or private.

Hitler: a Biography – Volume I: Ascent by Volker Ullrich is published by the Bodley Head

Rowan Williams is an Anglican prelate, theologian and poet, who was Archbishop of Canterbury from 2002 to 2012. He writes on books for the New Statesman

This article first appeared in the 28 April 2016 issue of the New Statesman, The new fascism