
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.


Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.


This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


Rupert Goold: “A director always has to be more of a listener”

The artistic director of the Almeida Theatre on working with Patrick Stewart, the inaccessibility of the arts, and directing his wife in Medea.

Eight years ago Rupert Goold’s Macbeth made his name. The critics were unanimous in their praise, with one calling it the “Macbeth of a lifetime”. Goold’s first Olivier Award soon followed (Enron won him a second in 2009, King Charles III nearly won him a third last year). It was a family triumph; Lady Macbeth was played by Goold’s wife, Kate Fleetwood.

Now the pair has finally reunited and Fleetwood is his undisputed lead. She is playing Medea in the Almeida’s latest and final play of its Greek season. Directing your wife is one thing. Directing her in a play about a woman who murders her children because her husband abandons her is another. And it’s been harder than Goold expected.

“You live with someone every day, and they don’t age because the change is so incremental, and then you do something together and you realise how much you’ve changed. It’s like playing tennis with someone after eight years: you’re completely different players.”

As it is, Goold thinks the director-actor relationship is inevitably fraught. “There is an essential slave-master, sadomasochistic, relationship,” he says. “The incredibly complicated thing about being an actor is you’re constantly being told what to do. And one of the most damaging things about being a director – and why most of them are complete arseholes – is because they get off on telling people what to do.”

Goold doesn’t. He’s as amicable in person as the pictures – bountiful hair, loose jacket, wide grin – suggest. And when we meet in the Almeida’s crowded rehearsal rooms, tucked away on Upper Street, 100 yards from the theatre, he’s surprisingly serene given his play is about to open.

He once said that directing a play is like running towards a wall and hoping it becomes a door just before the curtain goes up. Has the door appeared? “It’s always a funny moment [at the end of rehearsal]. Sometimes you do a show and it’s a bit dead and the costumes and set transform it. Then sometimes it’s perfect and the design kills it.”

We meet shortly before last Thursday’s press night, and he can’t tell how good it is. But it “certainly feels quite private. The idea that loads of people are going to come and watch it now feels a bit weird. You bring a lot of your sense of relationships and parenting into it.”

Goold has always argued that the classics wither without intervention. So in this revival of Euripides’ 2,446-year-old play, Medea is a writer and her husband, Jason (of Argonauts fame), is an actor. “But it’s not really about that… it’s more about divorce, about what it means to separate.”

“It’s about the impact of a long-term relationship when it collapses. I don’t know whether there is a rich tradition of drama like that, and yet for most people, those kind of separations are far more profound and complicated and have greater ramifications than first love; and we have millions of plays about first love!”

Every generation discovers its own time in the Greek plays. Goold thinks he and playwright Rachel Cusk were shaped by the aftermath of the 1970s in interpreting Medea: “That’s the period when the idea of the family began to get tainted.” And when critics praised Oresteia, the Almeida’s first Greek play and a surprise West End transfer, they compared it to The Sopranos.

Yet there is something eternal about these plays. Goold says it’s the way they “stare at these problems that are totally perennial, like death,” and then offer answers that aren’t easy. Medea kills the kids and a mother rips her son to shreds in the Bakkhai (the Almeida’s predecessor to Medea). Where’s the moral compass in that?

Except there is a twist in Goold’s Medea, and it’s not one every critic has taken kindly to. It was enough to stop the Telegraph’s Dominic Cavendish, otherwise lavish in his praise, from calling it “a Medea for our times”. Nevertheless, the reviews have been kind, as they often are for Goold; although The Times’ Ann Treneman was vitriolic in her dislike (“Everyone is ghastly. The men are beyond irritating. The women even worse.”).

In theory, Goold welcomes the criticism. “I’d rather our audience hated something and talked about it than was passively pleased,” he tells me ahead of reviews.

Controversial and bracing theatre is what Goold wants to keep directing and producing; as the Almeida’s artistic director he is in charge of more than just his own shows. But how does he do it? I put a question to him: if I had to direct Medea instead of him, what advice would he give me?

He pauses. “You’ve got to love words,” he begins. “There’s no point doing it unless you have a real delight in language. And you have to have vision. But probably the most important thing is, you’ve got to know how to manage a room.”

“It’s people management. So often I have assistants, or directors I produce, and I think ‘God, they’re just not listening to what that person is trying to say, what they’re trying to give.’ They’re either shutting them down or forcing them into a box.”

“Most people in a creative process have to focus on what they want to say, but a director always has to be more of a listener. People do it different ways. Some people spin one plate incredibly fast and vibrantly in the middle of the room, and hope all the others get sucked in. It’s about thriving off of one person – the director, the lead performer, whomever.”

“I’m more about the lowest common denominator: the person you’re most aware of is the least engaged. You have to keep lifting them up, then you get more creativity coming in.”

It’s not always simple. When actors and directors disagree, the director can only demand so much, especially if the actor is far more famous than them. When Goold directed Macbeth, Patrick Stewart was his lead. Stewart was a movie star and twice his age.

“Patrick’s take on Macbeth… I didn’t think it should be played that way. I’d played him as a student and I had an idea of what he was.”

“But then you think, ‘Ok, you’re never going to be what I want you to be, but actually let me get rid of that, and just focus on what’s good about what you want to be, and get rid of some of the crap.’”

Goold doesn’t think he’s ever really struggled to win an actor’s respect (“touch wood”). The key thing, he says, is that “they just feel you’re trying to make legible their intention”.

And then you must work around your lead. In Macbeth, Stewart was “a big deep river of energy… when normally you get two people frenetically going ‘Uhgh! Is this a dagger I see before me! Uhgh!’ and there’s lots of hysteria.”

“So we threw all sorts of other shit at the production to compensate, to provide all the adrenalin which Patrick was taking away to provide clarity and humanity.”

Many people want to be theatre directors, and yet so few are successful. The writers, actors and playwrights who sell shows can be counted on a few hands. Depressingly, Goold thinks it’s becoming harder to break in. It’s difficult to be discovered. “God, I don’t know, what I worry – wonder – most is: ‘Are there just loads of great directors who don’t make it?’”

The assisting route is just not a good way to find great new directors. “The kind of people who make good assistants don’t make good directors, it’s almost diametrically opposite.” As for regional directors, newspaper budgets have collapsed, so they can no longer rely on a visit from a handful of national critics, as Goold did when he was based in Salisbury and Northampton. And audiences for touring shows have, by some measures, halved in the past twenty years.

Theatre has also evolved. When Goold was coming through, “There were not a lot of directors who felt they were outside the library, so for me to whack on some techno was radical! Now it’d be more commonplace.” New directors have to find new ways to capture our attention – or at least the critics’.

But the critics have changed too. A nod from a critic can still be vital in the right circles, but the days when critics “made” directors are long over. “I remember Nick de Jongh saying, ‘Oh Rupert Goold, I made him.’ Because he’d put Macbeth on the front page of the Standard. I owed my career to him, and in some ways I did! But it’s an absurd idea, that would not happen now.”

“It’s all changed so much in literally the past three years. There was a time, for better or worse, when you had a big group of establishment critics: de Jongh, Michael Billington, Michael Coveney, Charlie Spencer – they were mostly men – Susannah Clapp. And if they all liked your show, you were a hit.” (“They could be horrible,” he adds.)

“Now I get more of a sense of a show by being on Twitter than reading the reviews.” It’s “probably a good thing”, Goold thinks, and it certainly beats New York, where a single review – the New York Times' – makes or breaks plays. But it’s another problem for aspiring directors, who can no longer be so easily plucked from the crowd.

It’s no longer a problem Goold needs to overcome. His star could wane, but he seems likely to be among the leading voices in British theatre for a while yet.

Harry Lambert is a staff writer and editor of May2015, the New Statesman's election website.


What Jeremy Corbyn can learn from Orwell

Corbyn’s ideas may echo George Orwell’s – but they’d need Orwell’s Britain to work. It’s time Corbyn accepted the British as they are today.

All Labour Party leaderships since 1900 have offered themselves as “new”, but Tony Blair’s succession in 1994 triggered a break with the past so ruthless that the Labour leadership virtually declared war on the party. Now it is party members’ turn and they, for now at any rate, think that real Labour is Jeremy.

To Keir Hardie, real Labour had been a trade union lobby expounding Fellowship. To the Webbs, real Labour was “common ownership” by the best means available. Sidney’s Clause Four (adopted 1918) left open what that might be. In the 1920s, the Christian Socialist R H Tawney stitched Equality into the banner, but during the Depression young intellectuals such as Evan Durbin and Hugh Gaitskell designated Planning as Labour’s modern mission. After the Second World War, Clement Attlee followed the miners (and the London Passenger Transport Board) into Nationalisation. Harold Wilson tried to inject Science and Technology into the mix but everything after that was an attempt to move Labour away from state-regulated markets and in the direction of market-regulated states.

What made the recent leadership contest so alarming was how broken was the intellectual tradition. None of the candidates made anything of a long history of thinking about the relationship between socialism and what the people want. Yvette Cooper wanted to go over the numbers; only they were the wrong numbers. Andy Burnham twisted and turned. Liz Kendall based her bid on two words: “Have me.” Only Jeremy Corbyn seemed to have any kind of Labour narrative to tell and, of course, ever the rebel, he was not responsible for any of it. His conference address in Brighton was little more than the notes of a street-corner campaigner to a small crowd.

Given the paucity of thinking, and this being an English party for now, it is only a matter of time before George Orwell is brought in to see how Jeremy measures up. In fact, it’s happened already. Rafael Behr in the Guardian and Nick Cohen in the Spectator both see him as the kind of hard-left intellectual Orwell dreaded, while Charles Cooke in the National Review and Jason Cowley in the New Statesman joined unlikely fashion forces to take a side-look at Jeremy’s dreadful dress sense – to Orwell, a sure sign of a socialist. Cooke thought he looked like a “burned-out geography teacher at a third-rate comprehensive”. Cowley thought he looked like a red-brick university sociology lecturer circa 1978. Fair enough. He does. But there is more. Being a middle-class teetotal vegetarian bicycling socialistic feministic atheistic metropolitan anti-racist republican nice guy, with allotment and “squashily pacifist” leanings to match, clearly puts him in the land of the cranks as described by Orwell in The Road to Wigan Pier (1937) – one of “that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers who come flocking towards the smell of ‘progress’ like bluebottles to a dead cat”. And though Corbyn, as “a fully fledged, fully bearded, unabashed socialist” (Huffington Post), might make all true Orwellians twitch, he really made their day when he refused to sing the National Anthem. Orwell cited precisely that (see “The Lion and the Unicorn”, 1941) as an example of the distance between left-wing intellectuals and the people. It seemed that, by standing there, mouth shut, Comrade Corbyn didn’t just cut his wrists, he lay down full length in the coffin and pulled the lid shut.


Trouble is, this line of attack not only misrepresents the Labour leader, it misrepresents Orwell. For the great man was not as unflinchingly straight and true as some people think. It is impossible, for instance, to think of Orwell singing “God Save the King”, because he, too, was one of that “dreary tribe” of London lefties, and even when he joined Labour he remained ever the rebel. As for Corbyn, for a start, he is not badly dressed. He just doesn’t look like Chuka or Tristram. He may look like a threadbare schoolteacher, but Orwell was one twice over. Orwell was never a vegetarian or a teetotaller, but, like Corbyn, neither was he interested in fancy food (or drink), he kept an allotment, drove a motorbike, bicycled, cared about the poor, cared about the environment, loathed the empire, came close to pacifism at one point, and opposed war with Germany well past the time when it was reasonable to do so.

In Orwell’s thinking about socialism, for too long his main reference point was the London Marxist left. Not only did he make speeches in favour of revolutions, he took part in one with a gun in his hand. Orwell was far more interested, as Corbyn has been far more interested, in speaking truth to power than in holding office. His loyalty was to the movement, or at least the idea of the movement, not to MPs or the front bench, which he rarely mentioned. There is nothing in Corbyn’s position that would have shocked Orwell and, should they have met, there’d have been much to talk about: belief in public ownership and non-economic values, confidence in the state’s ability to make life better, progressive taxation, national health, state education, social care, anti-socially useless banking, anti-colonialism and a whole lot of other anti-isms besides. It’s hard to be sure what Orwell’s position would have been on Trident and immigration. Not Corbyn’s, I suspect. He was not as alert to feminism as he might have been but equally, few men try to write novels from a woman’s point of view and all Orwellians recognise that Julia is the dark hero of Nineteen Eighty-Four. In truth they are both austere types, not in it for themselves and not on anyone else’s expense account either. Corbyn won the leadership because this shone through from the very beginning. He came across as unaffected and straightforward – much as Orwell tried to be in his writing.

Except, as powerfully expressed in these pages by John Gray, Corbyn’s politics were made for another world. What sort of world would he need? First off, he’d need a regulated labour market: regulated by the state in partnership with a labour movement sensitive to what people wanted and experienced in trying to provide it. He would also need capital controls, a manufacturing base capable of building the new investment with Keynesian payback, an efficient and motivated Inland Revenue, a widespread public-service ethos that sees the country as an asset, not a market, and an overwhelming democratic mandate to get things done. In other words, Corbyn needs Orwell’s Britain – not this one – and at the very least, if he can’t have that, he needs the freedom to act that the European Commission forbids.

There’s another problem. Orwell did not trust left-wing intellectuals and spent half his life trying to work out their motivations as a class who spoke for the people, went in search of the people, and praised the people, but did not know them or believe in them. True, Corbyn says he wants to be open and inclusive, but we know he can’t possibly mean it when he says it will be the party, not him or the PLP, that will decide policy, just as we knew it couldn’t possibly be true when he said he’d turn PMQs into the People’s Question Time. Jeremy hasn’t changed his mind in forty years, appears to have great difficulty (unlike Tony Benn) in fusing socialism to national identity or experience (Hardie, Ben Okri and Maya Angelou were bolted on to his Brighton speech) and seems to think that not being happy with what you are given somehow captures the historic essence of socialism (rather than its opposite).

Granted, not thinking outside the circle is an inherent fault of the sectarian left, but some of our most prominent left-wing journalists have it, too. Working-class support for nationalisation? Good. Right answer! Working-class opposition to benefit scroungers and further mass immigration? Bad. Wrong answer! Would you like to try again? In his essay “In Defence of Comrade Zilliacus” (1947) Orwell reckoned that left-wing intellectuals saw only what they wanted to see. For all their talk of representing the people, they hated the masses. “What they are frightened of is the prevailing opinion within their own group . . . there is always an orthodoxy, a parrot-cry . . .”

The game is hard and he may go down in a welter of knives, yet Corbyn still has time. He may go on making the same speech – on the benefits of apple pie to apple growers – but at some point he will have to drop the wish-list and get on the side of the British people as they are, and live with that, and build into it. Only the nation state can even begin to do the things he wants to do. The quicker he gets that, the quicker we can see if the latest incarnation of new Labour has a future.

Robert Colls is the author of “George Orwell: English Rebel” (Oxford University Press)

This article first appeared in the 08 October 2015 issue of the New Statesman, Putin vs Isis