
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.

This article was updated on 18 September 2012.

This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The lost magic of England

The great conservative journalist Peregrine Worsthorne reflects on a long life at the heart of the establishment.

In a recent editorial meeting, our subscriptions manager happened to mention that Peregrine Worsthorne was still a New Statesman subscriber. A former editor of the Sunday Telegraph and, during a long Fleet Street career, a self-styled “romantic reactionary” scourge of liberals and liberalism, Worsthorne used to be something of a pantomime villain for the left, a role he delighted in. He had close friends among the “Peterhouse right”, the group of High Tory intellectuals who gathered around Maurice Cowling at the small, conspiratorial Cambridge college. He was a frequent contributor to Encounter (which turned out to be funded by the CIA) and an ardent cold warrior. His social conservatism and lofty affectations offended lefty Islingtonian sensibilities. On several occasions he was the Guardian’s reviewer of choice for its annual collection of journalism, The Bedside Guardian, and he invariably delivered the required scornful appraisal while praising its witty television critic, Nancy Banks-Smith. There is no suggestion, he wrote in 1981, that the “Guardian ever sees itself as part of the problem; itself as having some responsibility for the evils its writers described so well”.

His prose style was Oxbridge high table, more Walter Pater than George Orwell. It was essential not to take Worsthorne too seriously, because he delighted in mischief-making and wilful provocation – one of his targets for remorseless ridicule was Andrew Neil, when Neil edited the abrasively Thatcherite Sunday Times. Neil ended up suing Worsthorne, who was famous for his silk shirts and Garrick Club lunches, for libel; he was awarded damages of £1, the then cover price of the Sunday Times.

“I wrote that in the old days editors of distinguished Sunday papers could be found dining at All Souls, and something must have changed when they’re caught with their trousers down in a nightclub,” Worsthorne told me when we met recently. “I had no idea he was going to sue. I was teasing. I occasionally run into him and we smile at each other, so it’s all forgotten and forgiven.”

After his retirement in 1989, Worsthorne, although he remained a resolute defender of aristocracy, seemed to mellow, and even mischievously suggested that the Guardian had replaced the Times as the newspaper of record. In the 1990s he began writing occasionally for the New Statesman – the then literary editor, Peter Wilby, commissioned book reviews from him, as I did after I succeeded Wilby. Like most journalists of his generation, Worsthorne was a joy to work with; he wrote to length, delivered his copy on time and was never precious about being edited. (Bill Deedes and Tony Howard were the same.) He might have had the mannerisms of an old-style toff but he was also a tradesman, who understood that journalism was a craft.

Shortly before Christmas, I rang Worsthorne at the home in Buckinghamshire he shares with his second wife, Lucinda Lambton, the charming architectural writer. I asked how he was. “I’m like a squeezed lemon: all used up,” he said. Lucy described him as being “frail but not ill”. I told him that I would visit, so one recent morning I did. Home is a Grade II-listed old rectory in the village of Hedgerley. It is grand but dishevelled and eccentrically furnished. A sign on the main gates warns you to “Beware of the Dog”. But the dog turns out to be blind and moves around the house uneasily, poignantly bumping into objects and walls. At lunch, a small replica mosque in the dining room issues repeated mechanised calls to prayer. “Why does it keep doing that?” Perry asks. “Isn’t it fun,” Lucy says. She then turns to me: “Have some more duck pâté.”

As a student, I used to read Worsthorne’s columns and essays with pleasure. I did not share his positions and prejudices but I admired the style in which he articulated them. “The job of journalism is not to be scholarly,” he wrote in 1989. “The most that can be achieved by an individual newspaper or journalist is the articulation of an intelligent, well-thought-out, coherent set of prejudices – ie, a moral position.”

His Sunday Telegraph, which he edited from 1986 to 1989, was like no other newspaper. The recondite and reactionary comment pages (the focus of his energies) were unapologetically High Tory, contrary to the prevailing Thatcherite orthodoxies of the time, but were mostly well written and historically literate. Bruce Anderson was one of the columnists. “You never knew what you were going to get when you opened the paper,” he told me. “Perry was a dandy, a popinjay, and of course he didn’t lack self-esteem. He had a nostalgia for Young England. In all the time I wrote for him, however, I never took his approval for granted. I always felt a tightening of the stomach muscles when I showed him something.”

***

Worsthorne is 92 now and, though his memory is failing, he remains a lucid and engaging conversationalist. Moving slowly, in short, shuffling steps, he has a long beard and retains a certain dandyish glamour. His silver hair is swept back from a high, smooth forehead. He remains a stubborn defender of the aristocracy – “Superiority is a dread word, but we are in very short supply of superiority because no one likes the word” – but the old hauteur has gone, replaced by humility and a kind of wonder and bafflement that he has endured so long and seen so much: a journalistic Lear, but one who is not raging against the dying of the light.

On arrival, I am shown through to the drawing room, where Perry sits quietly near an open fire, a copy of that morning’s Times before him. He moves to a corner armchair and passes me a copy of his book Democracy Needs Aristocracy (2005). “It’s all in there,” he says. “I’ve always thought the English aristocracy so marvellous compared to other ruling classes. It seemed to me that we had got a ruling class of such extraordinary historical excellence, which is rooted in England almost since the Norman Conquest.

“Just read the 18th-century speeches – the great period – they’re all Whig or Tory, but all come from that [the aristocracy]. If they didn’t come directly from the aristocracy, they turned themselves very quickly into people who talk in its language. Poetic. If you read Burke, who’s the best in my view, it’s difficult not to be tempted to think what he says has a lot of truth in it . . .”

His voice fades. He has lost his way and asks what we were talking about. “Oh, yes,” he says. “It survived when others – the French and Russians and so on – were having revolutions. It was absolutely crazy to set about destroying that. There was something magical . . . the parliamentary speeches made by Burke and so on – this is a miracle! No other country has it apart from America in the early days. And I thought to get rid of it, to undermine it, was a mistake.”

I ask how exactly the aristocracy was undermined. Even today, because of the concentration of the ownership of so much land among so few and because of the enduring influence of the old families, the great schools and Oxbridge, Britain remains a peculiar hybrid: part populist hyper-democracy and part quasi-feudal state. The Tory benches are no longer filled by aristocrats but the old class structures remain.

“Equality was the order of the day after the war,” Worsthorne replies. “And in a way it did a lot of good, equalising people’s chances in the world. But it didn’t really get anywhere; the ruling class went happily on. But slowly, and I think unnecessarily dangerously, it was destroyed – and now there are no superior people around [in politics]. The Cecil family – Lord Salisbury, he was chucked out of politics. The Cecil family is being told they are not wanted. The institutions are falling apart . . .

“But there were people who had natural authority, like Denis Healey. I’m not saying it’s only aristocrats – a lot of Labour people had it. But now we haven’t got any Denis Healeys.”

Born in 1923, the younger son of Alexander Koch de Gooreynd, a Belgian banker, Worsthorne (the family anglicised its name) was educated at Stowe and was an undergraduate at both Cambridge (Peterhouse, where he studied under the historian Herbert Butterfield, the author of The Whig Interpretation of History) and Oxford (Magdalen College). “I have always felt slightly underprivileged and de-classed by having gone to Stowe, unlike my father who went to Eton,” Worsthorne wrote in 1985.

Yet his memories of Stowe remain pellucid. There he fell under the influence of the belle-lettrist John Davenport, who later became a close friend of Dylan Thomas. “He was a marvellous man, a famous intellectual of the 1930s, an ex-boxer, too. But in the war he came to Stowe and he was preparing me for a scholarship to Cambridge. He told me to read three books, and find something to alleviate the boredom of an examiner, some little thing you’ll pick up. And I duly did and got the scholarship.”

Can you remember which three books he recommended?

“Tawney. Something by Connolly, um . . . that’s the terrible thing about getting old, extremely old – you forget. And by the time you die you can’t remember your brother’s name. It’s a terrible shock. I used to think old age could be a joy because you’d have more time to read. But if you push your luck and get too far, and last too long, you start finding reading really quite difficult. The connections go, I suppose.”

Was the Connolly book Enemies of Promise (1938)?

“Yes, that’s right. It was. And the other one was . . . Hang on, the writer of the book . . . What’s the country invaded by Russia, next to Russia?”

Finland, I say. Edmund Wilson’s To the Finland Station (1940)?

“Yes. Wilson. How did you get that?”

We both laugh.

***

Worsthorne is saddened but not surprised that so many Scots voted for independence and his preference is for Britain to remain a member of the European Union. “What’s happening is part of the hopelessness of English politics. It’s horrible. I can’t think why the Scots would want to be on their own but it might happen. The youth will vote [for independence]. This is part of my central theme: the Scots no longer think it’s worthwhile belonging to England. The magic of England has gone – and it’s the perversity of the Tory party to want to get us out of the European Union when of course we’re much more than ever unlikely to be able to look after ourselves as an independent state because of the quality of our political system.

“The people who want to get us out are obviously of an undesirable kind. That the future should depend on [Nigel] Farage is part of the sickness. I mean the real horror is for him to have any influence at all. And when you think of the great days of the Labour Party, the giants who strode the stage – famous, lasting historical figures, some of them: Healey, Attlee, who was probably the greatest, [Ernest] Bevin. I’m well aware that Labour in the good days produced people who were superior.”

He digresses to reflect on his wartime experience as a soldier – he served in Phantom, the special reconnaissance unit, alongside Michael Oakeshott, the philosopher of English conservatism who became a close friend, and the actor David Niven, our “prize colleague”.

“I remember Harold Macmillan saying to me, after the Second World War, the British people needed their belt enlarged; they’d done their job and they deserved a reward. And that’s what he set about doing. And he wasn’t a right-wing, unsympathetic man at all. But he didn’t – and this is what is good about conservatism – he didn’t turn it into an ‘ism’. It was a sympathetic feel, an instinctive feel, and of course people in the trenches felt it, too: solidarity with the rest of England and not just their own brotherhood. Of course he didn’t get on with Margaret Thatcher at all.”

Worsthorne admired Thatcher and believed that the “Conservatives required a dictator woman” to shake things up, though he was not a Thatcherite and denounced what he called her “bourgeois triumphalism”. He expresses regret at how the miners were treated during the bitter strike of 1984-85. “I quarrelled with her about the miners’ strike, and the people she got around her to conduct it were a pretty ropey lot.

“I liked her as a person. I was with her that last night when she wasn’t prime minister any more, but she was still in Downing Street and had everything cut off. The pressman [Bernard Ingham] got several of us to try to take her mind off her miseries that night. There’s a photograph of me standing at the top of the stairs.”

In the summer of 1989, Peregrine Worsthorne was sacked as the editor of the Sunday Telegraph by Andrew Knight, a former journalist-turned-management enforcer, over breakfast at Claridge’s. He wrote about the experience in an elegant diary for the Spectator: “I remember well the exact moment when this thunderbolt, coming out of a blue sky, hit me. It was when the waiter had just served two perfectly poached eggs on buttered toast . . . In my mind I knew that the information just imparted was a paralysingly painful blow: pretty well a professional death sentence.”

He no longer reads the Telegraph.

“Politically they don’t have much to say of interest. But I can’t put the finger on exactly what it is I don’t like about it. Boredom, I think!”

You must read Charles Moore?

“He is my favourite. Interesting fellow. He converted to Catholicism and started riding to hounds in the same week.”

He has no regrets about pursuing a long career in journalism rather than, say, as a full-time writer or academic, like his friends Cowling and Oakeshott. “I was incredibly lucky to do journalism. What people don’t realise – and perhaps you don’t agree – but it’s really a very easy life, compared to many others. And you have good company in other journalists and so on. I was an apprentice on the Times, after working [as a sub-editor] on the Glasgow Herald.”

How does he spend the days?

“Living, I suppose. It takes an hour to get dressed because all the muscles go. Then I read the Times and get bored with it halfway through. Then there’s a meal to eat. The answer is, the days go. I used to go for walks but I can’t do that now. But Lucy’s getting me all kinds of instruments to facilitate people with no muscles, to help you walk. I’m very sceptical about it working, but then again, better than the alternative.”

He does not read as much as he would wish. He takes the Statesman, the Spectator and the Times but no longer the Guardian. He is reading Niall Ferguson’s biography of Kissinger, The Maisky Diaries by Ivan Maisky, Stalin’s ambassador to London from 1932 to 1943, and Living on Paper, a selection of letters by Iris Murdoch, whom he knew. “I get these massive books, thinking of a rainy day, but once I pick them up they are too heavy, physically, so they’re stacked up, begging to be read.”

He watches television – the news (we speak about Isis and the Syrian tragedy), the Marr show on Sunday mornings, and he has been enjoying War and Peace on BBC1. “Andrew Marr gave my book a very good review. He’s come back. He’s survived [a stroke] through a degree of hard willpower to get back to that job, almost as soon as he came out of surgery. But I don’t know him; he was a Guardian man.” (In fact, Marr is more closely associated with the Independent.)

Of the celebrated Peterhouse historians, both Herbert Butterfield (who was a Methodist) and Maurice Cowling were devout Christians. For High Tories, who believe in and accept natural inequalities and the organic theory of society, Christianity was a binding force that held together all social classes, as some believe was the order in late-Victorian England.

“I was a very hardened Catholic,” Worsthorne says, when I mention Cowling’s book Religion and Public Doctrine in Modern England. “My mother was divorced [her second marriage was to Montagu Norman, then the governor of the Bank of England] and she didn’t want my brother and me to be Catholic, so she sent us to Stowe. And I used to annoy her because I read [Hilaire] Belloc. I tried to annoy the history master teaching us Queen Elizabeth I. I said to him: ‘Are you covering up on her behalf: don’t you know she had syphilis?’

“Once I felt very angry about not being made Catholic. But then I went to Cambridge and there was a very Catholic chaplain and he was very snobbish. And in confession I had to tell him I masturbated twice that morning or something, and so it embarrassed me when half an hour later I had to sit next to him at breakfast. I literally gave up going to Mass to get out of this embarrassing situation. But recently I’ve started again. I haven’t actually gone to church but I’ve made my confessions, to a friendly bishop who came to the house.”

So you are a believer?

“Yes. I don’t know which bit I believe. But as Voltaire said: ‘Don’t take a risk.’”

He smiles and lowers his head. We are ready for lunch. 

Jason Cowley is editor of the New Statesman. He has been the editor of Granta, a senior editor at the Observer and a staff writer at the Times.

This article first appeared in the 11 February 2016 issue of the New Statesman, The legacy of Europe's worst battle