
Your brain on pseudoscience: the rise of popular neurobollocks

The “neuroscience” shelves in bookshops are groaning. But are the works of authors such as Malcolm Gladwell and Jonah Lehrer just self-help books dressed up in a lab coat?

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Shades of grey

The human brain, it is said, is the most complex object in the known universe. That a part of it “lights up” on an fMRI scan does not mean the rest is inactive; nor is it obvious what any such lighting-up indicates; nor is it straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Nor do we have the faintest clue about the biggest mystery of all – how does a lump of wet grey matter produce the conscious experience you are having right now, reading this paragraph? How come the brain gives rise to the mind? No one knows.

So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.

Such is the rigid formula of Imagine: How Creativity Works, published in March this year by the American writer Jonah Lehrer. The book is a shatteringly glib mishmash of magazine yarn, bizarrely incompetent literary criticism, inspiring business stories about mops and dolls and zany overinterpretation of research findings in neuroscience and psychology. Lehrer responded to my hostile review of the book by claiming that I thought the science he was writing about was “useless”, but such garbage needs to be denounced precisely in defence of the achievements of science. (In a sense, as Paul Fletcher points out, such books are “anti-science, given that science is supposed to be our protection against believing whatever we find most convenient, comforting or compelling”.) More recently, Lehrer admitted fabricating quotes by Bob Dylan in Imagine, which was hastily withdrawn from sale, and he resigned from his post at the New Yorker. To invent things supposedly said by the most obsessively studied popular artist of our age is a surprising gambit. Perhaps Lehrer misunderstood his own advice about creativity.

Mastering one’s own brain is also the key to survival in a dog-eat-dog corporate world, as promised by the cognitive scientist Art Markman’s Smart Thinking: How to Think Big, Innovate and Outperform Your Rivals. Meanwhile, the field (or cult) of “neurolinguistic programming” (NLP) sells techniques not only of self-overcoming but of domination over others. (According to a recent NLP handbook, you can “create virtually any and all states” in other people by using “embedded commands”.) The employee using such arcane neurowisdom will get promoted over the heads of his colleagues; the executive will discover expert-sanctioned ways to render his underlings more docile and productive, harnessing “creativity” for profit.

Waterstones now even has a display section labelled “Smart Thinking”, stocked with pop brain tracts. The true function of such books, of course, is to free readers from the responsibility of thinking for themselves. This is made eerily explicit in the psychologist Jonathan Haidt’s The Righteous Mind, published last March, which claims to show that “moral knowledge” is best obtained through “intuition” (arising from unconscious brain processing) rather than by explicit reasoning. “Anyone who values truth should stop worshipping reason,” Haidt enthuses, in a perverse manifesto for autolobotomy. I made an Olympian effort to take his advice seriously, and found myself rejecting the reasoning of his entire book.

Modern neuro-self-help pictures the brain as a kind of recalcitrant Windows PC. You know there is obscure stuff going on under the hood, so you tinker delicately with what you can see to try to coax it into working the way you want. In an earlier age, thinkers pictured the brain as a marvellously subtle clockwork mechanism, that being the cutting-edge high technology of the day. Our own brain-as-computer metaphor has been around for decades: there is the “hardware”, made up of different physical parts (the brain), and the “software”, processing routines that use different neuronal “circuits”. Updating things a bit for the kids, the evolutionary psychologist Robert Kurzban, in Why Everyone (Else) Is a Hypocrite, explains that the brain is like an iPhone running a bunch of different apps.

Such metaphors are apt to a degree, as long as you remember to get them the right way round. (Gladwell, in Blink – whose motivational self-help slogan is that “we can control rapid cognition” – burblingly describes the fusiform gyrus as “an incredibly sophisticated piece of brain software”, though the fusiform gyrus is a physical area of the brain, and so analogous to “hardware” not “software”.) But these writers tend to reach for just one functional story about a brain subsystem – the story that fits with their Big Idea – while ignoring other roles the same system might play. This can lead to a comical inconsistency across different books, and even within the oeuvre of a single author.

Is dopamine “the molecule of intuition”, as Jonah Lehrer risibly suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine? (Meanwhile, Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it.) Other recurring stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy. Jonathan Haidt tells the weirdly unexplanatory micro-story that, in one experiment, “The subjects used their mirror neurons, empathised, and felt the other’s pain.” If I tell you to use your mirror neurons, do you know what to do? Alternatively, can you do as Lehrer advises and “listen to” your prefrontal cortex? Self-help can be a tricky business.

Cherry-picking

Distortion of what and how much we know is bound to occur, Paul Fletcher points out, if the literature is cherry-picked.

“Having outlined your theory,” he says, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula . . . You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.”

But the great movie-monster of nearly all the pop brain literature is another region: the amygdala. It is routinely described as the “ancient” or “primitive” brain, scarily atavistic. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. The implicit picture is of our uneasy coexistence with a beast inside the head, which needs to be controlled if we are to be happy, or at least liberal. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.) René Descartes located the soul in the pineal gland; the moral of modern pop neuroscience is that original sin is physical – a bestial, demonic proto-brain lurking at the heart of darkness within our own skulls. It’s an angry ghost in the machine.

Indeed, despite their technical paraphernalia of neurotransmitters and anterior temporal gyruses, modern pop brain books are offering a spiritual topography. Such is the seductive appeal of fMRI brain scans, their splashes of red, yellow and green lighting up what looks like a black intracranial vacuum. In mass culture, the fMRI scan (usually merged from several individuals) has become a secular icon, the converse of a Hubble Space Telescope image. The latter shows us awe-inspiring vistas of distant nebulae, as though painstakingly airbrushed by a sci-fi book-jacket artist; the former peers the other way, into psychedelic inner space. And the pictures, like religious icons, inspire uncritical devotion: a 2008 study, Fletcher notes, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

In The Invisible Gorilla, Christopher Chabris and his collaborator Daniel Simons advise readers to be wary of such “brain porn”, but popular magazines, science websites and books are frenzied consumers and hypers of these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The “This is your brain on” meme, it seems, is indefinitely extensible: Google results offer “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on, ad nauseam. I hereby volunteer to submit to a functional magnetic-resonance imaging scan while reading a stack of pop neuroscience volumes, for an illuminating series of pictures entitled This Is Your Brain on Stupid Books About Your Brain.

None of the foregoing should be taken to imply that fMRI and other brain-investigation techniques are useless: there is beautiful and amazing science in how they work and what well-designed experiments can teach us. “One of my favourites,” Fletcher says, “is the observation that one can take measures of brain activity (either using fMRI or EEG) while someone is learning . . . a list of words, and that activity can actually predict whether particular words will be remembered when the person is tested later (even the next day). This to me demonstrates something important – that observing activity in the brain can tell us something about how somebody is processing stimuli in ways that the person themselves is unable to report. With measures like that, we can begin to see how valuable it is to measure brain activity – it is giving us information that would otherwise be hidden from us.”

In this light, one might humbly venture a preliminary diagnosis of the pop brain hacks’ chronic intellectual error. It is that they misleadingly assume we always know how to interpret such “hidden” information, and that it is always more reliably meaningful than what lies in plain view. The hucksters of neuroscientism are the conspiracy theorists of the human animal, the 9/11 Truthers of the life of the mind.

Steven Poole is the author of the forthcoming book “You Aren’t What You Eat”, which will be published by Union Books in October.


This article first appeared in the 10 September 2012 issue of the New Statesman, Autumn politics special


The good daughter

The truth is I don’t want to be a full-time carer, any more than I wanted to be a full-time mother. And I don’t want to live with my ma any more than she wants to live with me.

In Tate Britain is a painting by the Victorian artist George Elgar Hicks of a woman ministering tenderly to her invalid father. It is called Comfort of Old Age. The work is the final panel of Hicks’s triptych Woman’s Mission. The first part, Guide of Childhood, in which the same figure teaches her little boy to walk, has been lost. But the second panel also hangs at the Tate in London: Companion of Manhood shows our heroine consoling her husband after ghastly news.

Hicks depicted “woman” in her three guises – mother, wife, daughter – and in her ideal state, the selfless provider of guidance, solace and care. Her life has meaning only in so far as it nourishes and facilitates the lives of others, principally men.

Domestic and emotional labour, we call it now. Feminists have long campaigned both for this to be acknowledged as real work and for men to do their share. Women cannot reach their potential at the office, notes Facebook’s Sheryl Sandberg in her book Lean In, until men pull their weight at home. But this has always been the toughest, messiest fight, because it is about domestic harmony, varying standards of personal hygiene, nagging, sulking and love. Besides, there is an enduring sense, little changed since Hicks’s day, that not only are women better at caring duties, but it is their natural lot.

I have spent a long time in the first two panels of the triptych: a partner/wife for 30 years, a mother for 21. (My two sons are grown and pretty much gone.) And I have seen, in the course of my adult life, enormous progress in those two domains. Men no longer assume that wives will dump their careers to follow them on foreign postings, for instance, or that mothers cannot work. According to research by the Office for National Statistics, women still do 40 per cent more household chores than men but, growing up, I never saw a man make dinner, let alone push a pram. Marriages are increasingly equal partnerships and each generation of fathers is more engaged.

Now I have reached the third panel, the trickiest bit of the triptych. My 93-year-old mother is 200 miles away in Doncaster, and since my father died, five years ago, she has been living alone. She is – I must stress – admirable, independent, uncomplaining and tough. A stoic. Someone who doesn’t mourn her every lost faculty but relishes what she can still do. Yet almost everyone she ever knew is dead, and I am her only child: her principal Comfort of Old Age.

For a long time, the landscape was a series of plateaus and small dips. Her little house acquired rails, walking frames, adaptations; she wears an emergency pendant. But until she broke her hip four years ago, she wouldn’t even have a cleaner. (“I don’t want strangers in my house.”) She managed. Just. But since Christmas the terrain has shifted. A persistent infection, two collapses, three ambulance rides, tachycardia (in which your heart beats to the point of explosion), but then, after three weeks, back home. Finally I persuaded her to have carers – nice, kindly, expensive – for an hour five times a week. (She demanded days off.) A slightly lower plateau.

Then, a few weeks ago, a neighbour called to say that my ma’s curtains were still closed at 4pm. She was found dehydrated, hallucinating. (She hadn’t pressed her emergency button; it was a non-carer day.) I hurriedly packed my bag for God knows how long, then scrambled north to sit by her bedside believing, for the third time this year, that I was watching her die.

For three weeks, on and off, I slept alone in my teenage single bed, in the house where I grew up, weeping every time I opened a cupboard to see her cake tins or Easter eggs for her grandsons. That week, I read a news report about how having children makes people live two years longer. Of course! As her daughter, I was her advocate, hassling doctors for information, visiting, reassuring, making sure she was fed, washing her soiled clothes (even long-stay units won’t do laundry), trying to figure out what to do next. God help the childless! Really, who will speak for them?

Finally, having wrestled her into (almost) daily care – she is very stubborn – I returned to London to find a letter. I am a Times columnist and write a weekly notebook slot, occasionally featuring my mother. I am used to harsh reader critiques of my life. But this, I must say, stung. It was from a man who lives in Cheshire (he had supplied his name and address), and he wanted me to know what a terrible person I am. “I have been puzzled when reading your column over the past months how you have been able to leave your mother – whose serious health issues you have used as copy . . . to holiday in Mexico, East Anglia and Norway.” I was “selfish and self-regarding”, and I should be ashamed.

He was not the first. Online posters often chide me for maternal neglect, and otherwise kind letters sometimes conclude: “But I do think your mother should move in with you.” Anyway, my egregious Mexican holiday had been long delayed by her illness and although she was well when I left, I was braced to fly back at any moment. The Norway trip was to visit my son on his 21st birthday. No matter. How dare I have a life.

I was reminded of when my children were young and I was a magazine editor. The guilt-tripping, the moral judgement: the looks from full-time mothers, the pursed lips from older relatives. Why bother having kids if you work full-time? Back then, I was “selfish and self-regarding”, too. My husband, who worked vastly longer hours, was blameless.

So let me warn you that just when you’re free from being judged as a mother, you’ll be judged as a daughter. It is the last chance for reactionary types who resent women’s career success, or just their freedom to live how they choose, to have a dig. Look at this selfish bitch, weekending in East Anglia when she should be a Comfort of Old Age.

When we say someone is a Good Dad, it means he turns up to football matches and parents’ evenings, gives sensible advice, isn’t a derelict alcoholic or a deserter. I know many fathers do much, much more. But that is the bar to Good Dadhood. It is pretty low. To qualify as a Good Mother, however, a woman must basically subsume her entire existence into her children and household and may only work part-time, if at all.

So, what is a Good Daughter? A US report showed in 2014 that daughters were twice as likely as sons to care for their elderly parents. In a survey of 26,000 older Americans, Angelina Grigoryeva, a sociologist at Princeton University, discovered that daughters provide as much care as they can manage, while sons do as little as they can get away with. If they have sisters or even wives, men are likely to leave it to them. I can find no equivalent UK study, but I’d bet the same is true here.

I know many sons who help out with ageing parents: Sunday care-home visits or a spot of DIY. Some do the truly grim stuff, such as washing and toileting a frail father with dementia. And all sons – unless they are estranged, or cruel, or in prison – are Good Sons. Being a Good Daughter is a much tougher gig. However often I go north, sort out bills, buy new ironing boards, listen to my mother’s worries, take her shopping, organise her Christmas presents and stay awake worrying, it won’t be enough. A friend visits her disabled mother every day, despite her family and career, sorts out wheelchairs and carers, runs errands. Her three brothers drop by for ten minutes once a fortnight: so busy, so important! Yet my friend’s care is a given, and her brothers are “marvellous”. A truly Good Daughter would quit her job, have her old mother move in and tend to her alone.

The truth is I don’t want to be a full-time carer, any more than I wanted to be a full-time mother. And I don’t want to live with my ma any more than she wants to live with me. Now that I’ve served out my motherhood years, I want to do other things with my life besides looking after people. Is that a shocking admission? Men wouldn’t give it a second thought.

Yet politicians of left and right are always telling us that the solution to our screwed-up social-care system is the family. To socialists, the “care industry” is further evidence of marketisation and the profit motive taking over the personal sphere. Jeremy Hunt, the Health Secretary, has said that he favours the “Asian model” and the care minister David Mowat said recently that we must care for our parents as unquestioningly as we do our children. In practice, these all amount to the same thing: women, chiefly daughters and daughters-in-law, toiling away unpaid.

After Christmas, while my mother was living with me, frail and recuperating from her infection, I hired a private carer so that I could work. This lovely woman was boundlessly kind, calm, patient, unfazed: I am none of these things. Ask me to fix the car, get sense from a doctor, shout at the council: I’m Action Daughter, at your service. But expect me to sit still in a room making nice for hours and I am crap. In Hicks’s Woman’s Mission, I have failed.

A Times reader chastised me for hiring help: “Well, I’d expect to look after my own mother myself.” And I was reminded once more of early motherhood, when I employed a nanny. Yes, a nanny, not a childminder or a nursery. I know the word makes left-wing men crazy: you cold, rich, privileged cow. That nanny, funnily enough, allowed both my husband and me to work, but it was me who got the rap.

Even hiring a cleaner is “problematic”. A good feminist shouldn’t expect a poorer woman to clear up after her, I hear. To which I reply: my mother was a cleaner for 30 years and her meagre wages paid for my new shoes. When a couple hire a cleaner, it is nearly always to compensate for the shortfall in male domestic labour, yet it is the woman, again, who has somehow failed.

In the third part of the triptych, paid help for elderly parents is even more of a dereliction of female duty. My mother’s next-door neighbour has cared for her invalid father, unaided, for 20 years; a friend has remodelled her house to accommodate her elderly parents. Across Britain are millions of people who care for relatives with little respite. When I say that a private carer now visits my mother, I do so with shame because, most days, this is the only company she receives. A nice lady called Sue helps with her jigsaw puzzle, chats to her, does some light housework and fetches her shopping. But what she is paying for is a surrogate me.

It tears up my heart. Yet it is complicated. What if you live far from your home town: should you be expected to return? My unmarried aunt came back after an interesting single life to live with my grandmother until her death. Her siblings didn’t thank her for this sacrifice. Indeed, without the status of marriage, she was treated with disdain.

Last month, as a Nigerian health assistant helped Ma to the hospital bathroom, I remarked that she lives alone. “Why?” came the horrified response. In her culture, this made no sense. But northern European society has evolved an individualism that often transcends notions of family and duty. This applies to the old and offspring alike.

Largely our elderly do not want, Asian-style, to be infantilised by their children, or bossed around by their daughters-in-law. (The claim that Indian parents are “revered” is undermined by rampant elder abuse.) My ma wants to watch Corrie, eat quiche, not feel she is in the way. “I like to please myself,” is her refrain. Her home of almost 50 years is her carapace: her central fear is of being too ill to stay. Despite the much-discussed return of “multigenerational living”, the most popular British solution is the “granny annex”, where an old person maintains autonomy behind her own front door.

Moreover, members of the baby-boomer generation recoil at living with their parents. We spent our teenage years trying to escape. What if your upbringing featured divorce, personality clashes, arguments, abuse? What if, like me, you left your working-class culture for a completely different life – what if you have little in common? Or your widowed father now expects you to run around after him like a skivvy, just as he did your mum? You can reject your roots for your entire adulthood, then your parents’ frailty yanks you home.

Now those Guide of Childhood years seem simple and golden, although the parallels are striking. From stair gates to stairlifts; from pushchairs to wheelchairs; the incontinence provision; the helplessness. But raising children is largely a cheerful, upward trajectory. Elderly care is an uneven descent, via those dips and plateaus, towards some hidden crevasse. There is no compensatory boasting, showing cute snaps on your phone. You learn not to mention geriatric travails. People look uncomfortable or bored: too grim.

But, just as a child shows you the world anew – look, a spider, a leaf, the sea, Christmas! – through clear, unjaded eyes, older people reveal what truly matters in the end. A reader remarked that it would probably be best if my mother, at 93, died now. I replied that she gets more joy in M&S than some get from a Caribbean cruise. With age, the world distils down to elemental pleasures: seeing a grandchild, a piece of cake, a sunny day, the warmth of a hand. When my father was very close to death and when recently my ma was at her sickest, both still managed to utter the words “I love you”. Just as when a frightened child cries for you in the night, you are utterly irreplaceable, needed.

And it will be your turn soon, when your parents are old. We are living longer, often fading out in medically preserved decrepitude over many years. I can’t understand why both as individuals and as a society we refuse to plan. Well, actually I can. It’s horrible. As my mother always says: “When it happens, it happens.”

Yet there is so much we could do. Come up with a cross-party agreement on how to fund social care through the tax system. Invest money and imagination in ways that old people can remain in their home, rather than slash home help. Develop friendship schemes and clubs, so the lonely aren’t so dependent on faraway children. Enable the old to use the internet: few are online, though no one would benefit from it more. Rip up the care-home model in which the elderly are objects in a chair: let people be their full human selves until the end.

Above all, we must redraw that final panel of the triptych. Don’t judge daughters more harshly than sons. Don’t let men slink away from their fair share. Don’t wield the family as a glib solution. Instead, acknowledge that it is hard, heart-rending work, being a Comfort of Old Age. 

Janice Turner is a columnist for the Times

This article first appeared in the 18 May 2017 issue of the New Statesman, Age of Lies
