
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The dustman and the doctor: fairness and the student fees debate

The idea that education – all education – should be free is intoxicating and liberating. But there's a problem.

The most toxic political imagery of the student fees debate dates from 2010. First, there was Nick Clegg brandishing a sheet of paper bearing his election pledge that the Liberal Democrats would vote against “any increase” in tuition fees. Then, a few months later, there was the sight of protesters scrawling graffiti and urinating on the statue of Winston Churchill in Parliament Square. Churchill was rapidly restored, but Clegg – who, I am told, did not believe in the pledge when he signed it but could not resist the prospect of those student voters in university towns – never properly recovered.

The issue of how to fund English universities had been febrile for years – long before the 2008 financial crisis, the ballooning of the Budget deficit that followed and the 2010 Lib Dem vote for the vertiginous increase in English tuition fees. (University funding is a devolved matter, with the Scots going their own way.)

In 2004, Tony Blair, enfeebled by the absence of weapons of mass destruction in Iraq, had almost been knocked off his prime ministerial perch when he, too, trebled fees, albeit to a mere £3,000, to be paid back after graduation. Gordon Brown’s allies, smelling post-Iraq weakness, hovered over the Labour leader before allowing him – by a sliver – to survive.

The Conservatives have historically been less troubled by the matter. Students largely have not voted in high enough numbers – certainly not for them – to impinge on their chances of electoral success. Meanwhile, the centre left has had lumps kicked out of it while wrestling with the problem of how best to fund higher education. Jeremy Corbyn’s 2017 manifesto significantly changed Labour’s position, promising to abolish fees altogether; he would also, he told the NME, “deal with” student debt. That half-pledge has now become a vague “ambition” because of its estimated £100bn price tag.

As a piece of campaigning, it worked. By contrast, Ed Miliband got nowhere in 2015 with his promise to reduce fees by a third to £6,000. It was too little, too late to mobilise student voters or their concerned parents, but more than enough for George Osborne, an unrepentant Vince Cable and a nervous higher education sector (sotto voce) to raise questions about Labour’s fiscal rectitude and/or the financial security of universities.

The Institute for Fiscal Studies (IFS), in its disinterested and peskily rigorous way, joined in – and with a more subtle point, suggesting that cutting fees would benefit higher-earning graduates the most. Those who earned less over their lifetime would, in any event, not have to pay all of the money back.

Until Corbyn’s swashbuckling manifesto simplified matters, or oversimplified them, the left had been tied in knots on the fairness point from the moment that tuition fees were introduced, relatively quietly, in the peak-Blair year of 1998.

The idea that education – all education – should be free is intoxicating and liberating. It is intoxicating because one’s Enlightenment reflexes are happily triggered: the pursuit of knowledge is wonderful; knowledge leads to individual self-fulfilment and should be made available to the largest possible number. We all benefit from a better-educated population, not least by the spread of liberal values. Utilitarians rejoice – the country becomes economically more prosperous, though the evidence for this is irritatingly murky.

It is liberating because it is a beautifully simple proposition, and thus the complexity of nasty trade-offs – between those who go to university and those who don’t, between generations, between different sorts of universities, between disciplines and courses, between funding higher education and funding a zillion other priorities – is washed away by the dazzling premise. Free.

Alas, there is a problem. Once upon a time, a British university education was for the very few. The state, in the form of the general taxpayer, footed the bill. Now, around 40 per cent of 18- to 19-year-olds are at university and nobody in front-line politics is keen on hauling down the number, notwithstanding the occasional hyperventilating headline about useless degrees in golf course management or surfing studies.

The Liberal Democrats’ ill-fated 2010 manifesto had a little-noticed passage that called for scrapping the participation target of 50 per cent – alongside the now ritual aspiration to improve vocational training and apprenticeship schemes, a promise that is yet another reminder of a long-established and debilitating British weakness that nobody seems to know how to reverse. But mass higher education is here to stay – and it’s a good thing, too.

We could have chosen (and could still choose) both to fund increasing numbers of people going to university and to pay for all of their tuition, but that would not have been a self-funding investment – at least, not for a very long time. Other European countries with decent universities have indeed managed without asking graduates to contribute anywhere near as much as ours. The Swedes pay nothing for tuition. Dutch students pay a quarter of their English counterparts. The Germans have proportionately fewer students in tertiary education (though their vocational education is widely known to be heaps better), but their students are at university for longer and they pay very little for the pleasure. You get the picture.

It would require a lot of extra taxation if we were to go down that route – and there are many other competing demands beyond deficit reduction. Yet the issue is not only framed by tax priorities. We can’t easily afford to have the state picking up the tab because – an ugly fact – we are less well off than most northern European countries that charge less. Yes, we are the fifth-largest economy in the world – how could any of us, since the Brexit vote, not know that? – but we are far from being the fifth most economically prosperous country in the EU, once you allow for the intrusion of vulgar reality in the form of GDP per head. On that measure, we sit somewhere in the middle of the pack.

So who pays? Asking students to pay something is not in itself an outrage. The massive social and economic privileges that my generation accrued from our gloriously free university education may now be spread more widely but that has not eliminated the personal advantages that, on average, follow a degree. Graduates are more likely to get jobs, more likely to get better jobs and more likely to keep their jobs in a recession. The Department for Education puts the graduate premium on average at £250,000 before tax over a lifetime for women and £170,000 for men. These figures may be overstated and might not be sustained, but it is overwhelmingly likely that most graduates will still benefit materially from their degrees.

From the starting point in 2004 – long before the deficit soared – Blair and his then education secretary, Charles Clarke, decided that graduates should pay more once they began to earn sufficient money. I remember Blair at the time doing a BBC Newsnight special with an angry audience, packed with students telling him that he was wrecking their lives and had insufficient respect for their contribution to the greater good. A very articulate trainee doctor told Blair that she faced a mountain of debt (those were the days – that would now be several mountains). Blair responded with a range of left-wing arguments – at least, if you are of a redistributive frame of mind. Here are some highlights of the exchange:

Blair: I think it is unfair to ask general taxpayers – 80 per cent of whom have not been to university – when you have got an adult who perhaps wants to get an additional skill and they have to pay for it if they don’t go to university, to say to those people: we are not giving you education for free. And to say to under-fives, where we are desperately short of investment, to say to primary schools, where again we need more money, that we are going to give an even bigger subsidy to university students. Believe me, if I could say to you, “You can have it all for free,” I would love to.

The student, more than matching the prime minister’s passion, was spectacularly unimpressed.

Student: It really infuriates me that you say, “Why should the dustman fund the doctor?” When he has [a] heart attack, he will be pleased that I went to university and graduated as a doctor. Therefore he should contribute towards the cost of my degree.

Blair: But surely there should be a fair balance. He is contributing to the cost of your degree. Five-sixths of the cost of any degree, even after our proposals come in, will be contributed by the general taxpayer.

Not bad for a prime minister who was not often associated with causes dear to the dustmen part of his Labour flock – nor associated with redistribution in general. Of course, the figure of five-sixths paid for by the state is now, since the introduction of £9,000 tuition fees, a great deal smaller. The trainee doctor of 2017 is expected, over the course of their lifetime, to fork out much more. The average student debt is getting on for £50,000.

The current numbers are the result of decisions taken by Vince Cable of the Liberal Democrats and David Willetts of the Conservative Party. Unlike Blair, these two men were on the left of their parties, with a firm belief in the importance of education and its positive impact on social mobility. The hike in fees led to protests and occupations but also to universities getting much of the extra money that they needed, even if they were markedly reluctant to say so, doubtless for fear of stirring up their students.

There has been no drop in the participation rate of students from poorer family backgrounds. Quite the reverse – despite Jeremy Corbyn’s personal refusal to believe the evidence. But the repayment of fees means that, in effect, recent graduates pay income tax at a rate of 29 per cent once they earn more than £21,000. (The Department for Education cheerily call this “a contribution”, as if it were voluntary.)

The repayment point could have risen with inflation to ease the load but it hasn’t. That allows the Treasury to recoup more money. Why hasn’t the £21,000 limit been raised? The reason is that, under the current IFS estimates, three-quarters of graduates will not pay back all of their debt within 30 years, at which point it is forgiven – and raising the threshold would push that proportion, and the write-off, higher still. Worse, interest rates on this fee debt are 3 per cent above inflation – and thus nearly 6 per cent above the base rate. That is not quite at Wonga levels but it is patently demoralising and much too steep.
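Purely by way of illustration – a minimal sketch, not the Department for Education’s own formula – the arithmetic behind those figures can be laid out as follows. The £21,000 threshold and the inflation-plus-3-per-cent interest come from the paragraph above; the assumption (mine, not the article’s) is that the 29 per cent figure is composed of the 20 per cent basic rate of income tax plus a 9 per cent loan repayment charged on earnings above the threshold.

```python
# Illustrative sketch of the repayment arithmetic described above.
# Assumptions (not from the article): 20% basic-rate income tax plus an
# assumed 9% loan repayment on income above the threshold make up the
# 29% marginal rate; interest accrues at inflation + 3 percentage points.

BASIC_INCOME_TAX = 0.20       # basic rate of income tax
LOAN_REPAYMENT_RATE = 0.09    # assumed repayment rate above the threshold
REPAYMENT_THRESHOLD = 21_000  # repayment point cited in the article (GBP)


def annual_loan_repayment(income: float) -> float:
    """Loan repayment due in one year on income above the threshold."""
    return LOAN_REPAYMENT_RATE * max(0.0, income - REPAYMENT_THRESHOLD)


def effective_marginal_rate(income: float) -> float:
    """Marginal rate on the next pound earned: income tax plus repayment."""
    repayment = LOAN_REPAYMENT_RATE if income > REPAYMENT_THRESHOLD else 0.0
    return BASIC_INCOME_TAX + repayment


def debt_after_one_year(debt: float, inflation: float) -> float:
    """Outstanding debt after a year of interest at inflation + 3%."""
    return debt * (1 + inflation + 0.03)


if __name__ == "__main__":
    income = 25_000
    print(f"Repayment on £{income:,}: £{annual_loan_repayment(income):,.0f}")
    print(f"Marginal rate above threshold: {effective_marginal_rate(income):.0%}")
    print(f"£50,000 debt after a year at 2.8% inflation: "
          f"£{debt_after_one_year(50_000, 0.028):,.0f}")
```

On these assumed numbers a graduate earning £25,000 hands over £360 a year in repayments, faces a 29p-in-the-pound marginal rate, and watches a £50,000 debt grow by nearly £3,000 in a year – which is the demoralising combination the paragraph above describes.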

That is far from the end of the matter. Until last September, poorer students received a maintenance grant of up to £3,400 to help with their living costs. For better-off students, the state’s supposition has always been that their parents should and would contribute financially to ensure that their offspring could lead a reasonable life while at university. No government has chosen to make this very explicit: there are only so many enemies you want at any one time on any one issue.

But as the number of students rose, so did the number entitled to the grant, and as part of the strategy to reduce the country’s Budget deficit, those grants were turned into loans, too.

The Labour Party, before Jeremy Corbyn became leader, opposed the change when it was announced but not with much elan. From my Oxford eyrie, I was astonished at how little excitement this generated. Perhaps everyone was exhausted by the failed protests five years earlier.

There is mitigation. It is worth remembering that nobody pays anything for their tuition up front (part of the Blair package, too) and some universities, including mine, have good and reliable schemes to help those from poorer backgrounds and hardship funds for those whose circumstances – normally their parents’ circumstances – change while they are studying.

But I know from direct experience that many students worry a great deal about the debt that awaits them. And if graduates were feather-bedded before 1998 (and that includes me), it is hard not to sympathise now. The debt is too much for too many.

Blair defined the problem correctly – the question of who pays is about striking a fair balance – even if Corbyn seems uninterested in the pain involved in thinking it through and has opted for the easiest answer. But what should that balance be? A graduate tax for those of us who went to university when it was both a much scarcer resource and cheap would offend people who want as little retrospection as possible in the tax system. However, it would do something to deal with generational injustice, a subject on which Corbyn’s credentials are sullied by his fondness for the “triple lock” on pensions.

Labour’s policy of telling English students that they will pay nothing for their tuition is nowhere near as left-wing as it sounds, but it was far too successful a piece of retail politics for anyone in his team to consider going back to the drawing board. So now it is the Tories, facing an energised student vote, who have to engage with the issues for the first time since the tumult of 2010. The least they can do – and they should do it fast – is cut the interest rate. They won’t want to do any of it but, as the man said, the times they are a-changing.

Mark Damazer is master of St Peter’s College, Oxford
