
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.


This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


Nostalgia without memory

We had a terrific time in the Sixties – but at what cost to the millennial generation?

There is a flurry of Sixties-worship at present, with an exhibition at the Victoria and Albert Museum in London and a cinema documentary about the Beatles’ touring years directed by Ron Howard. Next month, two more books on the subject will join the pile to which I have admittedly contributed more than my share. Steve Turner’s Beatles ’66: the Revolutionary Year reconstructs the band’s exploits in that eventful season (also recently chronicled in Jon Savage’s weighty 1966: the Year the Decade Exploded). And Paul Howard’s I Read the News Today, Oh Boy tells the story of Tara Browne, the gilded young Guinness heir whose death at the wheel of a Lotus Elan inspired John Lennon’s greatest song, “A Day in the Life”.

Truly, this is the decade that never dies. At frequent intervals since the mid-Eighties, glossy magazines have announced that “the Sixties are back”, with fashion spreads of Paisley fabrics, Mary Quant-ish bobs, shorter-than-ever miniskirts and elastic-sided Chelsea boots. Sixties pop music eternally dominates radio playlists, while the Rolling Stones, the decade’s most notorious band, though now withered old-age pensioners, are still widely reckoned the coolest, most dangerous dudes on the planet.

For that, we largely have to thank the “Sixties children”, who lived through the most magical time for youth there ever was, survived its surfeits of alcohol, sex and mind-shredding drugs, and now seek to perpetuate their glorious heyday even unto senility. But the greatest celebrants of the era are often people who never experienced it first-hand yet still yearn for it in a syndrome that psychologists call “nostalgia without memory”. Tony Blair’s “Cool Britannia” shtick in the Nineties, for instance, pastiched the Swinging London of three decades earlier, right down to the Union Jack carrier bags. In folk memory the Sixties are a rosy blur of psychedelic colour, free love and Beatles music, their complexity and manifold horrors either unrealised or disregarded.

***

The mythic decade, as opposed to the real one, was no straight ten-year stretch. It didn’t get into gear until 1962 with the satire boom that produced BBC TV’s That Was The Week That Was, David Frost’s first starring vehicle, and Private Eye magazine, and didn’t absorb pop music until the Beatles’ historic first visit to America in 1964. Its closing year, marked by a series of vast open-air festivals – Woodstock, Bob Dylan on the Isle of Wight, the Stones’ free concert in Hyde Park – felt almost like a decade on its own. When 1970 dawned, so much resembling a grey morning-after, many Sixties children simply refused to believe the party was over and clung to their caftans and joss sticks far into the harsh new eras of glam rock and punk.

Its prime time is generally agreed to have been 1965, when London gave vent to a concerted burst of youthful creativity in music, art, fashion, photography, cinema and graphics, and a shabby, sleepy metropolis, bombed to ruins not long previously, received the unlikely sobriquet of “swinging”. At this stage, the swinging was confined to a small circle of musicians, models, actors and photographers, congregating in the same few, unpublicised bistros and clubs: the most emblematic pop single, among so many, was Dobie Gray’s “The ‘In’ Crowd”.

It is seen above all as an era of burgeoning freedom and tolerance when Britannia seemed to be loosening her Victorian stays one by one. The contraceptive pill became widely available, ending centuries of shotgun marriages and perilous backstreet abortions, and theatre censorship by an archaic royal flunky called the Lord Chamberlain came to an end. Male homosexuality was decriminalised, though not yet destigmatised, and the first feminist voices spoke out. The word “fuck” appeared in the Times (during the farcically unsuccessful obscenity prosecution of Penguin, publisher of D H Lawrence’s Lady Chatterley’s Lover) and was heard on BBC Television, albeit only in quotation marks, from the National Theatre’s literary manager, Kenneth Tynan.

Yet alongside the pop-cultural harlequinade, Britain faced many of the same problems as we do today – some, indeed, significantly worse. Industrial strife was so common that the rest of Europe came to know strikes as “the English disease”. Harold Wilson’s Labour government, continuously in power after 1964, imposed a strict wages freeze, then known as a “pay pause”, and failed so utterly to solve its own financial deficit that in 1967 Wilson was forced to devalue the pound by 14 per cent. The World Cup-winning 1966, that supposed annus mirabilis, also brought two events whose horrors still resonate: the Moors murders trial and the Aberfan disaster, in which a south Wales primary school was engulfed by a giant slag heap, killing 116 children and 28 adults.

Meanwhile, the outside world was taking its first steps backwards into hell. America’s inspirational young president John F Kennedy was assassinated, as, in horrifically quick succession, were his brother Robert and the great civil rights leader Dr Martin Luther King. The United States was shamed at home by racism and police violence (not much change there, then) and abroad by its war in Vietnam, which nightly filled British TV screens with images of bombed civilian enclaves and maimed children (little change there, either – except today barrel bombs replace napalm). A democratic movement in communist-controlled Czechoslovakia was crushed; there was incalculable murder and terror in China’s Cultural Revolution, genocide in Indonesia and Biafra, apartheid in South Africa and endemic famine in India. June 1967 brought not only Sgt Pepper’s Lonely Hearts Club Band and the “Summer of Love” but the Arab-Israeli Six Day War, whose cumulative effects remain seemingly impossible to resolve.

Throughout the Sixties, Britain, along with the rest of western Europe, faced the threat of nuclear war with Soviet Russia and more-than-possible total obliteration. And yet, paradoxically, this was a time of enviable domestic peace and stability. There was full employment, with almost nobody ever getting sacked except at the very top. Inflation was marginal; the NHS and other public services functioned without any hint of crisis; the nationalised railways, shorn of unprofitable branch lines by Lord Beeching, were dirty but dependable; the postal service, even after the introduction of an avowedly “second-class” tier, remained the envy of the world.

Pending that terminal flash in the sky, people felt safe. The only communicable disease left to be feared was smallpox. Terrorism was something that happened only in the distant Middle East: it could not conceivably take root among Britain’s hard-working and law-abiding Indian and Pakistani immigrants, despite the unfettered racism constantly hurled at them. One walked on to aircraft or into official buildings or the BBC without security checks. The first shadow of Northern Ireland’s Troubles, which were to bloody the Seventies and be described by American commentators as “Britain’s Vietnam”, did not appear until 1968.

Two world wars in the space of 30 years had trained ordinary Britons to feel guilty about any conspicuous consumption. In the Sixties, the advertising industry set about remedying this. The new Sunday newspaper colour supplements bulged with adverts for Scandinavian furniture, stereo systems and white Kosset carpets, and bombarded their readers with recipes for exotic dishes such as chicken Kiev and beef stroganoff, using quantities of butter and cream that once would have seemed downright immoral. When Rowntree launched a new wafer-thin chocolate mint, the company made a last-minute name change from Minty Thins to After Eight, suggesting elegant high-society dinner parties to a demographic only recently weaned from high teas. So older generations, too, could join an “in” crowd and share the feeling of life becoming measurably better every day.

The attention paid to youth was an extraordinary volte-face from that ancient British maxim “Children should be seen and not heard”. Young people now not only wielded huge economic power through pop music and fashion, but kicked aside class distinctions and social barriers. Following the Beatles template, almost all of the decade’s brightest new celebrities were in their twenties and from humble backgrounds: the photographer David Bailey, the model Twiggy, the painter David Hockney, the comedian Jimmy Tarbuck, the film stars David Hemmings, Rita Tushingham, Tom Courtenay and Terence Stamp. A northern or a cockney accent was almost a prerequisite of success. In Britain in the past, the working class had always tried to talk “up”; now the upper and middle classes strove to talk “down”. It still goes on.

Without any form of social media other than underground newspapers and flyers, Sixties youth culture managed to be remarkably united. It assumed that every figure of authority – indeed, anyone over 30 – was a pitiable lunatic. Unlike its counterparts in America and across Europe, it raised up no demagogues: its figureheads were lead singers in bands and radio disc jockeys whose dimness in no way reduced their potency. The hippies, who arrived post-1966, are now viewed as hopelessly naive and deluded, with their mantra of “Love and Peace”. Yet their pop festivals, love-ins and “happenings” were occasions that brought hundreds of thousands together without the slightest violence. There were moments when even their fiercest detractors wondered if they might really be a force for changing the world for the better.

***

The V&A exhibition “You Say You Want a Revolution?” focuses on the decade’s final phase, when Britain’s initially playful underground hardened into a many-headed protest movement containing every kind of extreme-leftish ideology; churning out insurrectionary literature amid the comforts of the consumer society; holding marches, demos and sit-ins of increasing militancy despite having nothing to protest about nearer than the Vietnam War (in which the Wilson government played no part whatsoever). It was always more serious in other European countries and the US, where former hippies made an easy transition to urban guerrillas and to Charles Manson’s serial-killing “Family”.

Simultaneously, the British police declared war on leading musicians whose songs seemed to encourage their fans to take drugs, whether the pot known to jazz players for generations or the new, man-made, “mind-expanding” lysergic acid diethylamide (LSD), which leaked from the very pores of the Beatles’ Sgt Pepper album. The fear was legitimate – in fact, nowhere near proportionate to the long-term problem in the making – but the reaction was hysterical scapegoating. In early 1967, with the collusion of MI5 and possibly the CIA, 18 police officers raided the Rolling Stone Keith Richards’s cottage in Sussex and Richards and Mick Jagger were charged with drug possession. After a grotesque show trial – yet another strike against that supposed Summer of Love – both Stones received prison terms for offences that normally would have rated a small fine or merely probation.

The recent death of Richard Neville, the founder of Oz magazine, awoke further memories of that moment when the Sixties’ indulgence of youth was suddenly turned off. The 1971 trial of Neville and his two co-editors for conspiracy to corrupt youthful morals (specifically by depicting Rupert Bear with an erection) was just as self-defeatingly comical as the Lady Chatterley prosecution almost a decade earlier.

For millennials who grew up around the year 2000, the Sixties are an object not so much of nostalgia without memory as envy without memory. My 25-year-old daughter often remarks what a terrific time my generation had and what a messed-up world we created for hers. I can’t argue with that.

Philip Norman’s “Paul McCartney: the Biography” is published by Weidenfeld & Nicolson. He tweets at: @PNormanWriter

This article first appeared in the 15 September 2016 issue of the New Statesman, The fall of the golden generation