
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.


Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The new puritans: What Theresa May and Jeremy Corbyn have in common

In different ways, Jeremy Corbyn and Theresa May are “puritans”. Each has a strict view of what public life should be – and their manners are a rebuke to the low hucksterism that has disfigured our politics.

A puritan revival is under way. It explains the success of Jeremy Corbyn and, in a subtler way, the rise of Theresa May. It also underpins the hatred of figures such as Tony Blair and Boris Johnson, and the disgust one feels as one gazes at a Mediterranean view, spoiled by the superyachts of plutocrats who wish to proclaim their unbounded wealth and utter lack of taste.

Take Corbyn first. The puritan distrust of theatre is plainly what inhibits him from even attempting, most of the time, to make anything in the way of a witty, let alone flamboyant, retort to the Prime Minister. Corbyn’s supporters admire this, for they, too, are puritans. As the shadow chancellor, John McDonnell, recently said, they think he is more upright and honest because he disdains the politics of display. In their eyes, to act a part is to be untruthful and, therefore, sinful: a point confirmed by the pleasure it might give.

Theatre can, of course, be done in many different ways, and whenever one style has prevailed for too long it creates a hunger for something new. Kitchen-sink drama is, at its best, a delightful change from the well-made play or, in Labour’s case, the well-made spin at which Peter Mandelson and Alastair Campbell excelled. Every politician’s mannerisms become wearisome in the end: Stanley Baldwin’s pious bromides, Harold Macmillan’s Edwardian affectations, Harold Wilson’s cheeky chappie act.

But to imagine one can get by, in politics, without putting on a performance of some kind is madness. How else is the audience’s interest to be engaged? I mean the wider audience, which for most of the time ignores politics. When Claud Cockburn arrived in Washington as a young man to work for the Times, he was advised by Willmott Lewis, the celebrated correspondent for whom he would be standing in: “I think it well to remember that, when writing for the newspapers, we are writing for an elderly lady in Hastings who has two cats of which she is passionately fond. Unless our stuff can successfully compete for her interest with those cats, it is no good.”

The same is true of political leaders. But whenever one makes the elementary point to a Corbyn supporter that the Labour leader is not only bad at engaging the interest of people who are more interested in their cats, but does not even conceive that it would be a good idea to do so, the supporter takes this as a compliment to Corbyn. He, at least, is pure enough not to engage in the low hucksterism that has disfigured our politics. He may be wilfully understated, Pooterish and dull, but he can congratulate himself on being unspotted by Blair’s worldliness, greed and pro-Americanism.

The Labour Party has not yet split, but is already divided by a gulf of incomprehension. On one side stand the puritans, whose self-righteousness is fortified by criticism, which to them is proof of their virtue. On the other side stand the careerists, who think it pointless to be in politics unless you are at some stage going to win power, but who cannot tell us the point of doing so. Nobody since Tony Crosland has managed to give a persuasive account of the future of socialism (his book was published in 1956), but Corbyn at least enables his followers to believe that puritanism, understood as a return to the original verities of their faith, has a future, even though the policies needed to achieve this remain elusive.

The new spirit of puritanism can be found in the Conservative Party, too. A ruthless purge of the plutocrats has taken place. By holding the EU referendum, David Cameron, an Old Etonian descended from a long line of stockbrokers, took a gamble that did not pay off. He knew he had to go, and Theresa May has since sacked most of his coterie. One of the few to make the transition from the old regime to the new is Gavin Williamson, who served for three years as Cameron’s parliamentary private secretary. He joined May’s campaign as soon as Cameron resigned as prime minister, became her parliamentary campaign manager a day later, and so impressed her with his ability to marshal Tory MPs that she appointed him Chief Whip in July.

Williamson was educated at state schools in Scarborough, read social sciences at the University of Bradford, worked in the pottery industry in Stoke-on-Trent, fought Blackpool North and Fleetwood in 2005, was elected for South Staffordshire in 2010, and in his maiden speech to parliament asserted that manufacturers “often have a lot more common sense than bankers”. Under May’s leadership, this sort of proudly provincial background is more in favour than it was under Cameron.

May’s closest adviser, Nick Timothy, is from Birmingham. Both of his parents left school at the age of 14, but he went to King Edward VI in Aston, the grammar school for boys, which he describes as a “transformational” experience with “extraordinarily brilliant teachers”, after which he became the first member of his family to go to university, studying politics at Sheffield. Many people are puzzled that the Prime Minister has taken the risk of deciding to create new grammar schools, and wonder why she has done this. A large part of the answer is surely that she and Timothy think it is the right thing to do. They are true believers who feel themselves called on to show courage in defence of what they know to be right.

Unlike Cameron and George Osborne, they are confident that they are in touch with people of modest means, who cannot dream of paying school fees. It does not occur to them that, with their own fond memories of grammar schools, they may be out of touch with state education as it has evolved over the past 20 years. Towards the end of May’s own schooldays, her school, Holton Girls’ Grammar School in Oxfordshire, was turned into the comprehensive Wheatley Park School, and the transition was not, at first, a success.

Timothy drafted May’s first statement as Prime Minister, in which she said: “If you’re from an ordinary working-class family, life is much harder than many people in Westminster realise . . . The government I lead will be driven not by the interests of the privileged few, but by yours.”

This rhetoric does not exactly make May a puritan. She is an Anglican, which is an altogether more complicated thing. Her father trained for the priesthood in the Community of the Resurrection at Mirfield, in West Yorkshire, which promulgates an austere and deeply felt Anglo-Catholicism, with roots in Christian socialism. The Prime Minister’s dress sense cannot be described as austere, but her attitudes usually are. At Mirfield, a monastic foundation, one gets up awfully early, in order to attend the first services before breakfast.


Boris Johnson is the least puritanical figure in British politics. He nevertheless helps to illustrate the rise of puritanism: respectable people often say how entertaining he is and even start laughing as they relate his exploits, but then remember how serious they themselves are and add that his amusement value is, naturally, a disqualification for high office. Johnson is a star performer in the theatre of politics, capable (as he showed during the 2012 London Olympics) of eclipsing his rivals, and this summer he helped swing the referendum result for Brexit. A senior figure in the Leave campaign said that when Johnson attacked President Barack Obama for coming to Britain and telling us how to vote, the polls moved in Leave’s favour, even though (or perhaps in part because) the attack was condemned by high-minded commentators.

Johnson was given the job of Foreign Secretary in order to help reunite the Conservatives, because he might be good at it and also because he had the wit, as soon as Michael Gove deserted his campaign, to recognise that May was going to win the leadership election. But the losing side in the referendum had immediately blamed Johnson for its defeat. It accused him of not only populism, but opportunism: telling lies, stirring up racism and wrecking the economy in order to seize power for himself. For the first time in his life, Johnson’s enemies didn’t just scorn him, they hated him.

Long ago, when he went to Brussels as a correspondent, his rivals accused him of embroidering his news stories for the Daily Telegraph in a way that was not strictly true. This was intensely annoying for them, especially when they were hauled out of bed to follow up reports that turned out to be inaccurate. They were not prepared to accept the defence that Johnson had made these imaginative embellishments in order to dramatise a deeper truth – namely, that Jacques Delors, the then president of the European Commission, was grabbing power at the expense of the nation states.

Puritans cannot accept that it is permissible, or even praiseworthy, to draw a caricature in order to show what a person is really like. They possess a painful literal-mindedness. Their aim is to purify religion by stripping away the corruption of later centuries and getting back to the simple, honest faith of the first believers.

In the United States, a country founded by puritans, each president arrives promising to return the republic to a state of pristine perfection by cleansing Washington of crooked lobbyists. The new president’s mission is to protect the people from the politicians. After a while, it becomes apparent that the president is, after all, a politician, too, and the process starts all over again.

In Britain, the desire to purify the system recurs at similarly frequent intervals. Before the 1970 general election, the then Conservative leader, Edward Heath (the subject of A Singular Life, an absorbing new study by Michael McManus), promised to sweep away the “trivialities and gimmicks” that had characterised Harold Wilson’s six years as Labour prime minister. Douglas Hurd, who was working for Heath, said this declaration, made in the foreword to that year’s Tory manifesto, was entirely sincere:


There runs through it a note of genuine puritan protest, which is familiar in British history, sometimes in one party, sometimes in the other. It is the note struck by Pym against the court of Charles I, by Pitt against the Fox-North coalition, by Gladstone against Disraeli, by the Conservatives in 1922 against Lloyd George. It is the outraged assertion of a strict view of what public life is about, after a period in which its rules have been perverted and its atmosphere corrupted.


To many people’s surprise, though not his own, Heath won the 1970 election. Yet his puritanism was insufficient to guide him through the difficulties that followed, and in 1974 he was out of office again. His astounding bad manners to colleagues, which the following February helped bring about his downfall from the Conservative leadership (won by Margaret Thatcher), sprang in part from his puritanical refusal to accept that courtly behaviour, with its connotations of idleness and insincerity, could ever be worth bothering about.

Andrew Gimson is the author of “Boris: the Adventures of Boris Johnson”, out now in an updated edition (Simon & Schuster)

This article first appeared in the 29 September 2016 issue of the New Statesman, May’s new Tories