
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.


Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


Why Jeremy Corbyn is a new leader for the New Times

In an inspired election campaign, he confounded his detractors and showed that he was – more than any other leader – in tune with the times.

There have been two great political turning points in postwar Britain. The first was in 1945 with the election of the Attlee government. Driven by a popular wave of determination that peacetime Britain would look very different from the mass unemployment of the 1930s, and built on the foundations of the solidaristic spirit of the war, the Labour government ushered in full employment, the welfare state (including the NHS) and nationalisation of the basic industries, notably coal and the railways. It was a reforming government the like of which Britain had not previously experienced in the first half of the 20th century. The popular support enjoyed by the reforms was such that the ensuing social-democratic consensus was to last until the end of the 1970s, with Tory as well as Labour governments broadly operating within its framework.

During the 1970s, however, opposition to the social-democratic consensus grew steadily, led by the rise of the radical right, which culminated in 1979 in the election of Margaret Thatcher’s first government. In the process, the Thatcherites redefined the political debate, broadening it beyond the rather institutionalised and truncated forms that it had previously taken: they conducted a highly populist campaign that was for individualism and against collectivism; for the market and against the state; for liberty and against trade unionism; for law and order and against crime.

These ideas were dismissed by the left as just an extreme version of the same old Toryism, entirely failing to recognise their novelty and therefore the kind of threat they posed. The 1979 election, followed by Ronald Reagan’s US victory in 1980, began the neoliberal era, which remained hegemonic in Britain, and more widely in the West, for three decades. Tory and Labour governments alike operated within the terms and by the logic of neoliberalism. The only thing new about New Labour was its acquiescence in neoliberalism; even in this sense, it was not new but derivative of Thatcherism.

The financial crisis of 2007-2008 marked the beginning of the end of neoliberalism. Unlike the social-democratic consensus, which was undermined by the ideological challenge posed by Thatcherism, neoliberalism was brought to its knees not by any ideological alternative – such was the hegemonic sway of neoliberalism – but by the biggest financial crisis since 1931. This was the consequence of the fragility of a financial sector left to its own devices as a result of sweeping deregulation, and the corrupt and extreme practices that this encouraged.

The origin of the crisis lay not in the Labour government – complicit though it was in the neoliberal indulgence of the financial sector – but in the deregulation of the banking sector on both sides of the Atlantic in the 1980s. Neoliberalism limped on in the period after 2007-2008 but as real wages stagnated, recovery proved a mirage, and, with the behaviour of the bankers exposed, a deep disillusionment spread across society. During 2015-16, a populist wave of opposition to the establishment engulfed much of Europe and the United States.

Except at the extremes – Greece perhaps being the most notable example – the left was not a beneficiary: on the contrary it, too, was punished by the people in the same manner as the parties of the mainstream right were. The reason was straightforward enough. The left was tarnished with the same brush as the right: almost everywhere social-democratic parties, albeit to varying degrees, had pursued neoliberal policies. Bill Clinton and Tony Blair became – and presented themselves as – leaders of neoliberalism and as enthusiastic advocates of a strategy of hyper-globalisation, which resulted in growing inequality. In this fundamental respect these parties were more or less indistinguishable from the right.


The first signs of open revolt against New Labour – the representatives and evangelists of neoliberal ideas in the Labour Party – came in the aftermath of the 2015 election and the entirely unpredicted and overwhelming victory of Jeremy Corbyn in the leadership election. Something was happening. Yet much of the left, along with the media, summarily dismissed it as a revival of far-left entryism: these were, for the most part, no more than a bunch of Trots. There is a powerful, often overwhelming, tendency to see new phenomena in terms of the past. The new and unfamiliar is much more difficult to understand than the old and familiar: it requires serious intellectual effort and an open and inquiring mind. The left is not alone in this syndrome. The right condemned the 2017 Labour Party manifesto as a replica of Labour’s 1983 manifesto. They couldn’t have been more wrong.

That Corbyn had been a veteran of the far left for so long lent credence to the idea that he was merely a retread of a failed past: there was nothing new about him. In a brilliant election campaign, Corbyn not only gave the lie to this but also demonstrated that he, far more than any of the other party leaders, was in tune with the times, the candidate of modernity.

Crises, great turning points, new conjunctures, new forms of consciousness are by definition incubators of the new. That is one of the great sources of their fascination. We can now see the line of linkage between the thousands of young people who gave Corbyn his overwhelming victory in the leadership election in 2015 and the millions of young people who were enthused by his general election campaign in 2017. It is no accident that it was the young rather than the middle-aged or the seniors who were in the vanguard: the young are the bearers and products of the new, they are the lightning conductors of change. Their elders, by contrast, are steeped in old ways of thinking and doing, having lived through and internalised the values and norms of neoliberalism for more than 30 years.

Yet there is another, rather more important aspect to how we identify the new, namely the way we see politics and how politics is conceived. Electoral politics is a highly institutionalised and tribal activity. There have been, as I argued earlier, two great turning points in postwar politics: the social-democratic era ushered in by the 1945 Labour government and the neoliberal era launched by the Tory government in 1979.

The average Tory MP or activist, no doubt, would interpret history primarily in terms of Tory and Labour governments; Labour MPs and activists would do similarly. But this is a superficial reading of politics, one based on party labels that ignores the deeper forces which shape different eras, generate crises and result in new paradigms.

Alas, most political journalists and columnists are afflicted with the same inability to distinguish the wood (an understanding of the deeper historical forces at work) from the trees (the day-to-day manoeuvring of parties and politicians). In normal times, this may not be so important, because life continues for the most part as before, but at moments of great paradigmatic change it is absolutely critical.

If the political journalists, and indeed the PLP, had understood the deeper forces and profound changes now at work, they would never have failed en masse to rise above the banal and predictable in their assessment of Corbyn. Something deep, indeed, is happening. A historical era – namely, that of neoliberalism – is in its death throes. All the old assumptions can no longer be assumed. We are in new territory: we haven’t been here before. The smart suits long preferred by New Labour wannabes are no longer a symbol of success and ambition but of alienation from, and rejection of, those who have been left behind; who, from being ignored and dismissed, are in the process of moving to the centre of the political stage.

Corbyn, you may recall, was instantly rejected and ridiculed for his sartorial style, and yet we can now see that, with a little smartening, it conveys an authenticity and affinity with the times that made his style of dress more or less immune from criticism during the general election campaign. Yet fashion is only a way to illustrate a much deeper point.

The end of neoliberalism, once so hegemonic, so commanding, is turning Britain on its head. That is why – extraordinary when you think about it – all the attempts by the right to dismiss Corbyn as a far-left extremist failed miserably, even proved counterproductive, because that was not how people saw him, not how they heard him. He was speaking a language and voicing concerns that a broad cross-section of the public could understand and identify with.


The reason a large majority of the PLP was opposed to Corbyn, desperate to be rid of him, was because they were still living in the neoliberal era, still slaves to its ideology, still in thrall to its logic. They knew no other way of thinking or political being. They accused Corbyn of being out of time when in fact it was most of the PLP – not to mention the likes of Mandelson and Blair – who were still imprisoned in an earlier historical era. The end of neoliberalism marks the death of New Labour. In contrast, Corbyn is aligned with the world as it is rather than as it was. What a wonderful irony.

Corbyn’s success in the general election requires us to revisit some of the assumptions that have underpinned much political commentary over the past several years. The turmoil in Labour ranks and the ridiculing of Corbyn persuaded many, including on the left, that Labour stood on the edge of the abyss and that the Tories would continue to dominate for long into the future. With Corbyn having seized the political initiative, the Tories are now cast in a new light. With Labour in the process of burying its New Labour legacy and addressing a very new conjuncture, the end of neoliberalism poses a much more serious challenge to the Tories than it does to Labour.

The Cameron/Osborne leadership was still very much in a neoliberal frame of mind, not least in its emphasis on austerity. It would appear that, in the light of the new popular mood, the government will now be forced to abandon austerity. Theresa May, on taking office, talked about a return to One Nation Toryism and the need to help the worst-off, but that has never moved beyond rhetoric: now she is dead in the water.

Meanwhile, the Tories are in fast retreat over Brexit. They held a referendum over the EU for narrowly partisan reasons which, from a national point of view, was entirely unnecessary. As a result of the Brexit vote, the Cameron leadership was forced to resign and the Brexiteers took de facto command. But now, after the election, the Tories are in headlong retreat from anything like a “hard Brexit”. In short, they have utterly lost control of the political agenda and are being driven by events. Above all, they are frightened of another election from which Corbyn is likely to emerge as leader with a political agenda that will owe nothing to neoliberalism.

Apart from Corbyn’s extraordinary emergence as a leader who understands – and is entirely comfortable with – the imperatives of the new conjuncture and the need for a new political paradigm, the key to Labour’s transformed position in the eyes of the public was its 2017 manifesto, arguably its best and most important since 1945. You may recall that for three decades the dominant themes were marketisation, privatisation, trickle-down economics, the wastefulness and inefficiencies of the state, the incontrovertible case for hyper-globalisation, and bankers and financiers as the New Gods.

Labour’s manifesto offered a very different vision: a fairer society, bearing down on inequality, a more redistributive tax system, the centrality of the social, proper funding of public services, nationalisation of the railways and water industry, and people as the priority rather than business and the City. The title captured the spirit – For the Many Not the Few. Or, to put it another way, After Neoliberalism. The vision is not yet a full answer to the question of what comes after neoliberalism, but it represents the beginnings of one.

Ever since the late 1970s, Labour has been on the defensive, struggling to deal with a world where the right has been hegemonic. We can now begin to glimpse a different possibility, one in which the left can begin to take ownership – at least in some degree – of a new, post-neoliberal political settlement. But we should not underestimate the enormous problems that lie in wait. The relative economic prospects for the country are far worse than they have been at any time since 1945. As we saw in the Brexit vote, the forces of conservatism, nativism, racism and imperial nostalgia remain hugely powerful. Not only has the country rejected continued membership of the European Union, but, along with the rest of the West, it is far from reconciled with the new world that is in the process of being created before our very eyes, in which the developing world will be paramount and in which China will be the global leader.

Nonetheless, to be able to entertain a sense of optimism about our own country is a novel experience after 30 years of being out in the cold. No wonder so many are feeling energised again.

This article first appeared in the 15 June 2017 issue of the New Statesman, Corbyn: revenge of the rebel

Martin Jacques is the former editor of Marxism Today. 