
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.


Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


Why Isis seeks a battle with Western nations - and why it can't be ignored

Islamic State believes it must eventually confront and then defeat the West. To get there, it seeks to polarise Muslim and non-Muslim communities alike.

It was precisely the type of attack that had long been feared: a co-ordinated and brutal act of urban warfare that brought Paris to a standstill for more than three hours on an otherwise typical Friday night. Six of the nine attackers had spent time fighting for Islamic State in Syria. Indeed, it was the third act of international terrorism perpetrated by IS in a fortnight, a campaign that started with the bombing of a Russian Metrojet flight over Sinai in Egypt, followed by a double suicide bombing in Beirut that killed 41 people – the deadliest attack in the Lebanese capital since the civil war there ended in 1990.

There are several significant operational observations to be made about what transpired in Paris. The attackers wore suicide belts in which the active ingredient was TATP, a highly unstable explosive based on acetone and hydrogen peroxide. TATP was also used in July 2005 when the London transport network was attacked. Known as the “mother of Satan” because of its volatility, it is usually manufactured at home and it is prone to accidental detonation – or, indeed, sometimes fails to detonate at all.

When two weeks after the July 2005 attacks four bombers attempted to replicate the carnage, their bombs failed to explode precisely because they had not been manufactured properly. The same was true for Richard Reid, the “Shoe Bomber”, and Umar Farouk Abdulmutallab, the “Underwear Bomber”, who smuggled TATP explosives on to American aircraft in 2001 and 2009, respectively.

Perhaps the most worrying aspect of the Paris attacks is that every device proved to be viable – a reality born of the permissive environment in Syria and Iraq. A new generation of terrorists is now able to learn and rehearse the skills required to build devices that detonate successfully. The skills come with experience, and the newly ungoverned spaces of the Levant provide an ideal training ground.

Yet, for all the viability of the TATP devices used in Paris, the greatest loss of life came from assault rifles. This demonstrates how relatively unsophisticated tactics can still achieve mass casualties for terrorists determined to kill as many people as possible. The threat is particularly acute in mainland Europe, where automatic weapons move easily across the Continent, typically originating from criminal gangs in eastern Europe. Smuggling them into Britain is harder because the Channel limits the number of potential entry points.

The added protection resulting from Britain being an island is often overlooked. Just as guns are able to move more freely across the Continent, so, too, can people. This was brought into sharp relief when Imran Khawaja, a British man from west London who joined Islamic State in January 2014, attempted to re-enter the UK.

Khawaja had been particularly cunning. He hoped to slip back into Britain by evading the authorities after faking his own death in Syria, a plan his compatriots facilitated by eulogising and glorifying him. He then made his way across Europe by land, passing through several European countries before being arrested on arrival at Dover. None of this is to suggest that Britain does not face a very serious threat from Islamic State terrorism (it does), but the risks here are diminished compared to the threat facing countries in mainland Europe.


Trying to understand the strategic rationale behind Islamic State’s attacks outside Syria and Iraq is daunting. A degree of conjecture is required, although information gleaned from its communiqués, statements, and behaviour can go some way towards informing a judgement.

It may seem obvious to observe that IS sees itself primarily as a state, yet this is worth restating, because other jihadist groups have made claims to statehood while continuing to act as terrorists or insurgents, tacitly recognising the nonsense of their own position. Not so Islamic State. It truly believes it has achieved the Sunni ideal of a caliphate and it acts accordingly.

This was the thinking that led the group to break from al-Qaeda, rebuffing Ayman al-Zawahiri’s position as the group’s emir. From Islamic State’s perspective, countries are not subservient to individuals. The significance of this self-belief became apparent last summer when the US began dropping aid parcels to stranded Yazidis who were otherwise starving and dying from exposure in the Sinjar Mountains of Iraq. The US also committed itself to protecting Erbil in northern Iraq by bombing IS fighters who were moving on the city, not least because US diplomats were based there and President Obama could not afford a repeat of the 2012 Benghazi debacle in Libya.

Islamic State responded by beheading its first Western hostage, the American journalist James Foley. Although the video of this was billed as a “Message to America”, it was directed specifically at Obama rather than the American people. In a speech evidently written for him, Foley told viewers that the US government was to blame for his execution because of its “complacency and criminality”.

When Mohammed Emwazi – “Jihadi John” – appeared in Isis videos as executioner-in-chief, he went some way towards explaining those accusations. “You are no longer fighting an insurgency. We are an Islamic army and a state,” he said. “Any attempt, by you, Obama, to deny the Muslims their rights of living safely under the Islamic caliphate will result in the bloodshed of your people.” To that extent, Islamic State has pursued a campaign of retribution over the past 12 months against those it regards as belligerent enemies: the United States, Britain, France, Russia and its regional arch-rival Hezbollah, the Lebanese-based and Iranian-backed Shia militia.

There is an unspoken corollary to this approach, too: that Islamic State wants to make the cost of acting against it so unbearably high that its opponents are intimidated into acquiescence. For all its nihilistic sadism, IS is a rational actor. The group controls a large landmass, enjoys autonomy and makes claims to a revived caliphate. That is a project it wants to continue expanding and consolidating by being left alone to overrun the Middle East, a process that involves massacring minorities, including the Shias, Christians, Yazidis and Kurds. If the West intervenes in this it must be prepared to face the prospect of mass-casualty terrorism at home.

Some will invariably argue that this is precisely what we should do. Leave them to it: Islamic State may be distasteful, but the cost of acting against it is too high. Besides, we cannot police the world, and what concern is it of ours if Arab societies implode in this way?

This view overlooks a broader (and inevitable) strategic imperative that can never be divorced from Islamic State. The group’s millenarianism and commitment to eschatological beliefs are such that it wants to be left alone – for now.

IS ultimately believes it must confront and then defeat the West in a comprehensive battle between haqq and batil: truth and falsehood. That became clear enough when Abdul-Rahman Kassig (originally Peter Kassig) became the fifth Western hostage to be executed by IS in November last year. The video of his killing was different from those that preceded it and started with the execution of 21 soldiers from the Syrian Arab Army who were fighting on behalf of President Bashar al-Assad.

A short speech by Mohammed Emwazi – again, directed at Obama – noted that the execution was taking place in Dabiq, a town in north-western Syria. The significance of this is not to be underestimated. Dabiq is noted as being the venue of a final showdown between the armies of Islam and those of “Rome”, a reference to the superpower of the day.

“To Obama, the dog of Rome, today we’re slaughtering the soldiers of Bashar and tomorrow we’ll be slaughtering your soldiers,” Emwazi said. “We will break this final and last crusade . . . and here we are burying the first of your crusader army [Kassig] in Dabiq.”

Kassig was branded a “crusader” because he had served in the US armed forces.

That final encounter is not necessarily reliant on Western intervention. Emwazi explained that Islamic State would also use Dabiq as a springboard to “slaughter your people on your streets”. Thus, for Islamic State, a confrontation with the West is inevitable. It would rather be left to consolidate its position for now, but there is no eventuality in which we could expect to escape its sabre-rattling indefinitely.

The religious significance attached to sites such as Dabiq plays a huge role in motivating the fighters of IS. While the world looks on with horrified bewilderment at its rampages, the power of its eschatological reasoning provides some insight.

Writing shortly after Russia entered the conflict, a relatively well-known Dutch fighter called Yilmaz (also known as Chechclear) invoked the importance of end-times prophecies. “Read the many hadith [sayings of the Prophet Muhammad] regarding Bilad al Sham [Greater Syria/the Levant] and the battles that are going to be fought on these grounds,” he said. “Is it not slowly unfolding before our eyes?”

Herein lies the power of Islamic State’s reasoning – its fighters, and the movement as a whole, draw huge succour from the religious importance of the sites around which they are fighting. It serves to convince them of the righteousness of their cause and the nobility of their endeavours.

Faced with a campaign of Western aerial bombardment (albeit one that is limited and unambitious), Islamic State has decided to bait its enemies into fighting it on the ground. To that end, towards the end of the Kassig execution video, Emwazi advises Obama that Islamic State is “eagerly waiting for the rest of your armies [sic] to arrive”.


One final point should be noted about the possible strategic aims of the Paris attacks of 13 November. Islamic State has been dispirited by the mass migration of Syrian refugees into Europe. It has appealed to them instead to migrate eastwards, towards the caliphate, rather than into disbelieving Western nations.

In an attempt to dissuade refugees from heading to Europe, IS released a series of videos featuring Western foreign fighters – including some from France – who told viewers how much they despised their home countries. Their message was one of persecution, of Muslims under siege, and of a hostile, unwelcoming Western world.

By way of contrast, they attempted to display the benefits of living in the so-called caliphate, with stilted images of the good life that would make even North Korean officials blush: schoolchildren in class, doctors in hospitals, market stalls filled with fresh produce.

Smuggling fighters into France who had posed as refugees is likely to have been a deliberate and calculating move, designed to exploit fears among some about the potential security risk posed by accepting Syrian refugees. Islamic State likens refugees seeking a future in Europe to the fracturing of Islam into various encampments following the death of the Prophet Muhammad in 632AD. Most of these sects arose from divisions over who should succeed the Prophet in leadership of the Muslim community, but some went into open apostasy.

Viewing events in this way, Islamic State argues that any Muslim not backing its project is guilty of heresy. For refugees to be running from it in such large numbers is particularly humiliating: the group even ran an advert that juxtaposed an image of a camouflaged military jacket alongside that of a life vest. A caption read, “How would you rather meet Allah?”

An article published this year in Islamic State’s English-language magazine Dabiq made this very point. It noted that: “Now, with the presence of the Islamic State, the opportunity to perform hijrah [migration] from darul-kufr [the land of disbelief] to darul-Islam [the land of Islam] and wage jihad against the Crusaders . . . is available to every Muslim as well as the chance to live under the shade of the Shariah alone.”

Islamic State recognises that it cannot kill all of the refugees, but by exploiting European fears about their arrival and presence, it can at least make their lives more difficult and force them to rethink their choice. All of this falls into a strategy where IS wants to eradicate what it calls the “grayzone” of coexistence. Its aim is to divide the world along binary lines – Muslim and non-Muslim; Islam and non-Islam; black and white – with absolutely no room for any shades of grey.

“The Muslims in the West will quickly find themselves between one of two choices, they either apostatise and adopt the kufri [infidel] religion propagated by Bush, Obama, Blair, Cameron, Sarkozy and Hollande in the name of Islam so as to live amongst the kuffar [disbelievers] without hardship, or they [migrate] to the Islamic State,” says an editorial in Dabiq magazine. “The option to stand on the sidelines as a mere observer is being lost.”


Atrocities such as the Paris attacks are designed to put a strain on the “grayzone”, thereby polarising Muslim and non-Muslim communities alike. Indeed, this is precisely what Islamic State said it hoped to achieve after the Malian-French radical Amedy Coulibaly declared, in a video released two days after his death, that he had participated in the Charlie Hebdo attacks on IS’s behalf. “The time had come for another event – magnified by the presence of the Caliphate on the global stage – to further bring division to the world and destroy the grayzone everywhere,” Dabiq said.

Beyond the tendency of all totalitarian movements to move towards absolutism in their quest for dominance, Islamic State also believes that by polarising and dividing the world it will hasten the return of the messiah. Once again, eschatology reveals itself as an important motivating principle.

This is both a blessing and a curse for Islamic State. Certainly, it is what underwrites its remarkable self-assurance and certainty and at the same time fuels its barbarism. Yet it may also prove to be its unravelling. IS has now attacked Russian and French civilians within a fortnight, killing hundreds. The wider world is finally realising that Islamic State is a threat it cannot afford to ignore.

Shiraz Maher is a contributing writer for the New Statesman and a senior research fellow at the International Centre for the Study of Radicalisation at King’s College London

This article first appeared in the 19 November 2015 issue of the New Statesman, The age of terror