
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls, and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The new Brexit economics

George Osborne’s austerity plan – now abandoned by the Tories – was the most costly macroeconomic policy mistake since the 1930s.

George Osborne is no longer chancellor, sacked by the post-Brexit Prime Minister, Theresa May. Philip Hammond, the new Chancellor, has yet to announce detailed plans but he has indicated that the real economy rather than the deficit is his priority. The senior Conservatives Sajid Javid and Stephen Crabb have advocated substantial increases in public-sector infrastructure investment, noting how cheap it is for the government to borrow. The argument that Osborne and the Conservatives had been making since 2010 – that the priority for macroeconomic policy had to be to reduce the government’s budget deficit – seems to have been brushed aside.

Is there a good economic reason why Brexit in particular should require abandoning austerity economics? I would argue that the Tory obsession with the budget deficit has had very little to do with economics for the past four or five years. Instead, it has been a political ruse with two intentions: to help win elections and to reduce the size of the state. That Britain’s macroeconomic policy was dictated by politics rather than economics was a precursor for the Brexit vote. However, austerity had already begun to reach its political sell-by date, and Brexit marks its end.

To understand why austerity today is opposed by nearly all economists, and to grasp the partial nature of any Conservative rethink, it is important to know why it began and how it evolved. By 2010 the biggest recession since the Second World War had led to rapid increases in government budget deficits around the world. It is inevitable that deficits (the difference between government spending and tax receipts) increase in a recession, because taxes fall as incomes fall, but government spending rises further because benefit payments increase with rising unemployment. We experienced record deficits in 2010 simply because the recession was unusually severe.

In 2009 governments had raised spending and cut taxes in an effort to moderate the recession. This was done because the macroeconomic stabilisation tool of choice, nominal short-term interest rates, had become impotent once these rates hit their lower bound near zero. Keynes described the same situation in the 1930s as a liquidity trap, but most economists today use a more straightforward description: the problem of the zero lower bound (ZLB). Cutting rates below this lower bound might not stimulate demand because people could avoid them by holding cash. The textbook response to the problem is to use fiscal policy to stimulate the economy, which involves raising spending and cutting taxes. Most studies suggest that the recession would have been even worse without this expansionary fiscal policy in 2009.

Fiscal stimulus changed to fiscal contraction, more popularly known as austerity, in most of the major economies in 2010, but the reasons for this change varied from country to country. George Osborne used three different arguments to justify substantial spending cuts and tax increases before and after the coalition government was formed. The first was that unconventional monetary policy (quantitative easing, or QE) could replace the role of lower interest rates in stimulating the economy. As QE was completely untested, this was wishful thinking: the Bank of England was bound to act cautiously, because it had no idea what impact QE would have. The second was that a fiscal policy contraction would in fact expand the economy because it would inspire consumer and business confidence. This idea, disputed by most economists at the time, has now lost all credibility.

***

The third reason for trying to cut the deficit was that the financial markets would not buy government debt without it. At first, this rationale seemed to be confirmed by events as the eurozone crisis developed, and so it became the main justification for the policy. However, by 2012 it was becoming clear to many economists that the debt crisis in Ireland, Portugal and Spain was peculiar to the eurozone, and in particular to the failure of the European Central Bank (ECB) to act as a lender of last resort, buying government debt when the market failed to.

In September 2012 the ECB changed its policy and the eurozone crisis beyond Greece came to an end. This was the main reason why renewed problems in Greece last year did not lead to any contagion in the markets. Yet it is not something that the ECB will admit, because it places responsibility for the crisis at its door.

By 2012 two other things had also become clear to economists. First, governments outside the eurozone were having no problems selling their debt, as interest rates on this reached record lows. There was an obvious reason why this should be so: with central banks buying large quantities of government debt as a result of QE, there was absolutely no chance that governments would default. Nor have I ever seen any evidence that there was any likelihood of a UK debt funding crisis in 2010, beyond the irrelevant warnings of those “close to the markets”. Second, the austerity policy had done considerable harm. In macroeconomic terms the recovery from recession had been derailed. With the help of analysis from the Office for Budget Responsibility, I calculated that the GDP lost as a result of austerity implied an average cost for each UK household of at least £4,000.

Following these events, the number of academic economists who supported austerity became very small (they had always been a minority). How much of the UK deficit was cyclical or structural was irrelevant: at the ZLB, fiscal policy should stimulate, and the deficit should be dealt with once the recession was over.

Yet you would not know this from the public debate. Osborne continued to insist that deficit reduction be a priority, and his belief seemed to have become hard-wired into nearly all media discussion. So perverse was this for standard macroeconomics that I christened it “mediamacro”: the reduction of macroeconomics to the logic of household finance. Even parts of the Labour Party seemed to be succumbing to a mediamacro view, until the fiscal credibility rule introduced in March by the shadow chancellor, John McDonnell. (This included an explicit knockout from the deficit target if interest rates hit the ZLB, allowing fiscal policy to focus on recovering from recession.)

It is obvious why a focus on the deficit was politically attractive for Osborne. After 2010 the coalition government adopted the mantra that the deficit had been caused by the previous Labour government’s profligacy, even though it was almost entirely a consequence of the recession. The Tories were “clearing up the mess Labour left”, and so austerity could be blamed on their predecessors. Labour foolishly decided not to challenge this myth, and so it became what could be termed a “politicised truth”. It allowed the media to say that Osborne was more competent at running the economy than his predecessors. Much of the public, hearing only mediamacro, agreed.

An obsession with cutting the deficit was attractive to the Tories, as it helped them to appear competent. It also enabled them to achieve their ideological goal of shrinking the state. I have described this elsewhere as “deficit deceit”: using manufactured fear about the deficit to achieve otherwise unpopular reductions in public spending.

The UK recovery from the 2008/2009 recession was the weakest on record. Although employment showed strong growth from 2013, this may have owed much to an unprecedented decline in real wages and stagnant productivity growth. By the main metrics by which economists judge the success of an economy, the period of the coalition government looked very poor. Many economists tried to point this out during the 2015 election but they were largely ignored. When a survey of macroeconomists showed that most thought austerity had been harmful, the broadcast media found letters from business leaders supporting the Conservative position more newsworthy.

***

In my view, mediamacro and its focus on the deficit played an important role in winning the Conservatives the 2015 general election. I believe Osborne thought so, too, and so he decided to try to repeat his success. Although the level of government debt was close to being stabilised, he decided to embark on a further period of fiscal consolidation so that he could achieve a budget surplus.

Osborne’s austerity plans after 2015 were different from what happened in 2010 for a number of reasons. First, while 2010 austerity also occurred in the US and the eurozone, 2015 austerity was largely a UK affair. Second, by 2015 the Bank of England had decided that interest rates could go lower than their current level if need be. We are therefore no longer at the ZLB and, in theory, the impact of fiscal consolidation on demand could be offset by reducing interest rates, as long as no adverse shocks hit the economy. The argument against fiscal consolidation was rather that it increased the vulnerability of the economy if a negative shock occurred. As we have seen, Brexit is just this kind of shock.

In this respect, abandoning Osborne’s surplus target makes sense. However, there were many other strong arguments against going for surplus. The strongest of these was the case for additional public-sector investment at a time when interest rates were extremely low. Osborne loved appearing in the media wearing a hard hat and talked the talk on investment, but in reality his fiscal plans involved a steadily decreasing share of public investment in GDP. Labour’s fiscal rules, like those of the coalition government, have targeted the deficit excluding public investment, precisely so that investment could increase when the circumstances were right. In 2015 the circumstances were as right as they can be. The Organisation for Economic Co-operation and Development, the International Monetary Fund and pretty well every economist agreed.

Brexit only reinforces this argument. Yet Brexit will also almost certainly worsen the deficit. This is why the recent acceptance by the Tories that public-sector investment should rise is significant. They may have decided that they have got all they could hope to achieve from deficit deceit, and that now is the time to focus on the real needs of the economy, given the short- and medium-term drag on growth caused by Brexit.

It is also worth noting that although the Conservatives have, in effect, disowned Osborne’s 2015 austerity, they still insist their 2010 policy was correct. This partial change of heart is little comfort to those of us who have been arguing against austerity for the past six years. In 2015 the Conservatives persuaded voters that electing Ed Miliband as prime minister and Ed Balls as chancellor was taking a big risk with the economy. What it would have meant, in fact, is that we would already be getting the public investment the Conservatives are now calling for, and we would have avoided both the uncertainty before the EU referendum and Brexit itself.

Many economists before the 2015 election said the same thing, but they made no impact on mediamacro. The number of economists who supported Osborne’s new fiscal charter was vanishingly small but it seemed to matter not one bit. This suggests that if a leading political party wants to ignore mainstream economics and academic economists in favour of simplistic ideas, it can get away with doing so.

As I wrote in March, the failure of debate made me very concerned about the outcome of the EU referendum. Economists were as united as they ever are that Brexit would involve significant economic costs, and the scale of these costs is probably greater than the average loss due to austerity, simply because they are repeated year after year. Yet our warnings were easily deflected with the slogan “Project Fear”, borrowed from the SNP’s nickname for the No campaign in the 2014 Scottish referendum.

It remains unclear whether economists’ warnings were ignored because they were never heard fully or because they were not trusted, but in either case economics as a profession needs to think seriously about what it can do to make itself more relevant. We do not want economics in the UK to change from being called the dismal science to becoming the “I told you so” science.

Some things will not change following the Brexit vote. Mediamacro will go on obsessing about the deficit, and the Conservatives will go on wanting to cut many parts of government expenditure so that they can cut taxes. But the signs are that deficit deceit, creating an imperative that budget deficits must be cut as a pretext for reducing the size of the state, has come to an end in the UK. It will go down in history as probably the most costly macroeconomic policy mistake since the 1930s, causing a great deal of misery to many people’s lives.

Simon Wren-Lewis is a professor of economic policy at the Blavatnik School of Government, University of Oxford. He blogs at: mainlymacro.blogspot.com

This article first appeared in the 21 July 2016 issue of the New Statesman, The English Revolt