Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special

The darkening skies of the summer game

Cricket was once the English national sport – but, for many people today, it has become invisible.

In 1975 Roy Harper wrote an elegiac song called “When an Old Cricketer Leaves the Crease”. With its wistful recollection of “those fabled men” from the game’s golden age and its images of “a dusty pitch and two pound six of willow wood in the sun”, deepened by the melancholy cornets of the Grimethorpe Colliery Band, it evoked ancestral memories of distant summers.

Yet, with its nod towards “Geoff” (Boycott) and “John” (Snow), two dominant figures of the here and now, it wasn’t merely nostalgic. The song threw a hoop around a century of English cricket, whether seen or imagined, and pulled off the rare trick of sounding both old and new.

If you were seeking a pivotal year in postwar cricket, 1975 would do nicely. Colin Cowdrey, later Baron Cowdrey of Tonbridge, an amateur in spirit, played the last of his 114 Test matches in a career that had begun 21 years earlier. Graham Gooch, every inch a pro, won the first of his 118 Test caps, spread over the next two decades. Cowdrey, it might be said, with a bit of licence, was Guy Crouchback to Gooch’s Hooper.

In February that year, Sir Neville Cardus, whose romantic, not always factual writing in the old Manchester Guardian had shaped the way cricket-lovers thought about the game, died at the age of 86. Four months later, Clive Lloyd, then the captain of West Indies, scored a century against Australia’s fearsome fast bowlers of a brilliance that Cardus would have recognised, as his team won the first and most enjoyable World Cup.

Something else happened that year. David Steele, a bespectacled, 33-year-old batsman (who looked ten years older), was plucked from the obscurity of Northamptonshire’s middle order to take on the mighty Australians at Lord’s. He made 50 dogged runs and added three more half-centuries, although the tourists won the series. Come December, this resolutely unfashionable plodder from the Potteries was voted Sports Personality of the Year by BBC viewers. Such was cricket’s power to capture the national mood, even in defeat.

Last year, when England actually beat the Australians, Joe Root of Yorkshire contributed two glowing centuries. No plodder, he. The cherubic Sheffielder was a member of the team that swiftly went on to win another series in South Africa. But when the BBC presented voters with a list of candidates for the award that Steele had won without any prompting, Root’s name was absent. Cricket simply didn’t figure.

It was an appalling slight on a cricketer who is already established in the annals of English batsmanship. Others also stand tall. The current team is led by Alastair Cook of Essex, who has made more runs in Test cricket than any other Englishman, while James Anderson, the Lancashire fast bowler, holds the English record for Test wickets. These are men of high talent and character, whose names will resonate through our game’s history. Yet, for many people, cricket has become invisible.

When England play Pakistan at Lord’s on Thursday, in the first match of a new series, the ground will be full. In the Coronation Garden behind the Victorian pavilion, there will be talk of “Kipper” Cowdrey, good old Goochie and maybe even the valiant Steele. Beyond the Grace Gate, named after the most celebrated of those fabled men whom Harper sang about, there will be indifference. The summer game, squeezed out of view this year by football’s European Championship, as well as the rituals of Wimbledon and the Open, is drifting towards insignificance.

How often do you now see children playing it in parks, or families improvising games on the beach? As for street cricket, with stumps chalked on walls, it has not been spotted in years. Public schools, which have wonderful playing fields and teachers who are prepared to devote to cricket the long hours that it demands, continue to do the game proud. The England team is full of public school boys, led by Cook, who attended Bedford. In state schools, alas, cricket is merely a rumour that many teachers don’t want their pupils to hear in case it gives them ideas.

At a recreational level, too, the story is changing. In “The Whitsun Weddings”, Philip Larkin described seeing from a train carriage the Odeon, a cooling tower and “someone running up to bowl”. Yet fewer people play the game these days – between 2013 and 2014, for instance, there was a 7 per cent fall in the number of players aged between 14 and 65 across England and Wales. As a result, there are fewer cricketers of Test standard. It can’t be ignored that, increasingly, England have to promote players from the swelling ranks of those born overseas. This month, for instance, England replaced Nick Compton (born in Durban, South Africa) with Gary Ballance (born in Harare, Zimbabwe). Both men went to Harrow.

As football becomes ever more newsworthy, even at the height of summer, cricket is banished to the margins of newspapers, including those that, until a few summers ago, served the game so loyally. Once there were dozens of broadsheet reporters, well known and much loved: Alan Gibson of the Times, who was forever changing trains at Didcot; David Foot, who wrote lyrical capsule essays for the Guardian; and Dicky Rutnagur of the Telegraph, who – uniquely – saw both Garry Sobers and Ravi Shastri hit six sixes in an over.

Now, unless there is hard news, or some celebrity dust to sprinkle, sports desks are not interested in cricket. One experienced reporter, who left his post at the paper where Cardus invented sportswriting, says, “I was fed up with having to answer the same question every morning: ‘What’s the Pietersen story today?’ That’s what it had come down to.”

The greatest loss by far has been the absence of Test cricket on terrestrial television. Since Channel 4 took over coverage from the BBC in 1999 and then passed the baton on to Sky after the Ashes series of 2005, a generation of young people has grown up without attachment to a game that their parents and grandparents took for granted. In Michael Atherton and Nasser Hussain, two former captains of England, Sky has outstanding performers, but their talents are not as widely known as they should be. The game may be millions of pounds richer for Sky’s bounty but cricket has suffered an immeasurable loss.

Meanwhile, on the wireless, where John Arlott and Christopher Martin-Jenkins made their reputations as supreme broadcasters, the BBC’s Test Match Special is mired in tittering mediocrity. It still has its moments – when Jonathan Agnew is in the box, or when Boycott is not talking about himself – but the show, hogged by adolescent show-offs, has lost its dignity.

Arlott, begging Rimbaud’s pardon, held the key to this savage parade, because he represented so long and so faithfully the spirit of English cricket. A Hampshire countryman who trod the beat as a Southampton copper before becoming a poetry producer at the BBC, he gave voice to all those “cricketers of the heart”, as he liked to call them, in honour of those people who followed the game. Summer in England meant, among other things, Arlott’s voice describing cricketers on the green.

Together with Cardus, an observer of a very different kind, he reinforced the idea of cricket as an essential feature of the English imagination. Neither created this mythology, which goes back to shepherds loafing on the Weald of Kent and emerged full-fledged in the glory of W G Grace and Ranjitsinhji. Yet these remarkable men certainly confirmed it in the eyes and ears of their readers and listeners.

Cardus, a distinguished music critic, belonged to the spirit world. Arlott, who had a shelf of first editions by Thomas Hardy (“the greatest of English novelists”), was a man of the soil. Neither was remotely interested in psychology but both knew quite a lot about human character. As Arlott reminded us, “A cricketer is showing you his character all the time.”

***

Cricket, they understood, was the most English of sports because it yoked together the rural and urban, north and south, young and old, men and women. The blacksmith, for an afternoon, stood on the same ground as the squire. L P Hartley caught something of this in The Go-Between and Harold Pinter, a great cricket lover, took delight in making the cricket match in that book a crucial part of his screenplay for Joseph Losey’s 1971 film adaptation, starring Alan Bates and Julie Christie.

By tradition, England teams have relied on cavaliers from the south and west for their runs: Frank Woolley, Wally Hammond, Denis Compton, Peter May, Tom Graveney, Ted Dexter. The north has usually supplied the fast bowlers: Harold Larwood of Nottinghamshire, Fred Trueman of Yorkshire, Brian Statham of Lancashire and another Lancastrian, Frank Tyson, who played for Northamptonshire. It is a cultural distinction that has no parallel in any other sport played in this country.

In terms of geography and temperament, cricket has always been the national game. Football may be more popular, but cricket tells us so much more about what kind of people we are. From Grace the bearded Victorian, through Wilfred Rhodes the Yorkshire all-rounder and Douglas Jardine, the Old Wykehamist who created the “bodyline” strategy to defeat Don Bradman and Australia, to Trueman, Boycott, Ian Botham, Andrew Flintoff and now the imperturbable Cook, cricketers have revealed England to us.

Perhaps, given the sport’s capacity for renewal, we shouldn’t be too disheartened. There was a lot of boring cricket half a century ago before the one-day game, in the form of the Gillette Cup, arrived in 1963. The problem is, Twenty20, the bastard grandchild of the old Gillette, now holds the old-fashioned game at gunpoint. It titillates the easily bored, so it is “good” television, and has made millionaires of the leading players. It also makes many long-time cricket watchers wonder whether they understand the game any longer.

With Twenty20 has come a different sort of spectator, one who is new to cricket. These people are not cricket lovers in the old sense but “fans” who demonstrate tribal loyalties. As a consequence, the culture of a game that has never tolerated tribalism has been subverted by rowdy and sometimes intimidating behaviour.

Outside Lord’s, which retains a sense of fair play, it is clear that many people who attend Test matches know little about the men they are watching. The author Colin Shindler attended the Edgbaston Test in Birmingham against Australia last summer and observed that the spectators around him in the Eric Hollies stand “had no idea which counties the England players belonged to. All they wanted to do was drink, shout and draw attention to themselves. They couldn’t sit still even for an over.”

The Kulturkampf is complete and we are living in the ruins. The game’s rulers may not miss the old-fashioned spectators as they leave, never to return, because they want to connect with younger spectators, whatever the price – but cricket will. Who will pass on its lore, as Cardus, Arlott and CMJ did?

Last month it was reported that Yorkshire, the proudest tree in the forest of English cricket and county champions in the past two seasons, were preparing to sell their museum to help trim debts of almost £22m. This, from the club that gave us Rhodes and George Hirst; Herbert Sutcliffe and Leonard Hutton; Maurice Leyland and Hedley Verity; Trueman and Boycott; Brian Close and Raymond Illingworth; Michael Vaughan and young Root. Fabled men, indeed.

The English summer, wrote Cardus, ever the romantic, is inconceivable without cricket. He was right, but the skies are darkening and the air is full of those melancholy cornets.

This article first appeared in the 14 July 2016 issue of the New Statesman, The Brexit PM