More than a number: Benjamin argues that we can't escape the facts of ageing. Photo: Muir Vidler

Marina Benjamin: what it means to be a woman aged 50

As she prepares for her 50th birthday, the author and journalist reflects on what it means to be “middle-aged” – and on a journey she knows never ends well.

Life’s defining moments do not always announce themselves with the fanfare of celebration (big birthdays, weddings) or trauma (puberty, divorce). Sometimes they’re like stealth bombers; they come out of nowhere and blow things up soundlessly. Two years ago I experienced just such a moment in the middle of the night. I woke up wanting to go to the bathroom and swung out of bed to stand up. I took a single step in the right direction, then fell to the floor like a plank.

There was the blunt thud of skull hitting wood and the slap of impact that split open the skin on my brow bone. My husband leapt out of bed to put the light on, alert as if the crash had been an intruder. By this time I’d managed to sit up. Blood was dripping from my eye on to my hand and I could feel the throb of nascent swellings at my ankle, hip and shoulder. I remember thinking: “This is the kind of game-changing fall that happens to old people” – bone-breaking, concussion-inducing – not to women in their late forties. I swallowed a couple of painkillers, cleaned myself up and went back to sleep. The following day my eye-socket was a reddish-purple golf ball, lids glued into a slit, and my whole body ached.

In itself, the fall was banal. A clear-cut case of somnambulism; my mind had been awake enough to formulate a conscious intention but the neural pathways of my motor system still slumbered. That day and the next I stayed home, unwilling to suffer people staring. I nursed some angry bruises; but thereafter I got on with things as if nothing had happened.

Looking back, I recognise that the fall registered far more deeply. It was sloppy and uncontrolled, as if I were a marionette and someone else, someone malicious, the string-puller. I was a mere player, a pawn, a flimsy vessel bobbing on choppy seas. Worse, like some bizarre prefiguring of my future life, my fall seemed to contain within it every other fall I would henceforth suffer.

From that instant on, I’ve never regained an absolute trust that my body will automatically fall into line with my will: from now on it will falter and fail. I can no longer depend on it to function properly. This, it seems to me, is solid indication that my youth has ended and middle age begun.

These days, when we are persistently told that age is all in the mind, that 40 is the new 30, and 50 the new 40; when entire wings of the cosmetics and medical industries are dedicated to rolling back the effects of passing time; when women are giving birth to first children in their late forties and fifties; when we are all, men and women alike, living healthily for longer, working later and shunning the putting out to pasture we once happily greeted as “retirement”; why, when such things are the new norms, would anyone elect themselves to membership of that most undesirable of clubs, the middle-aged? Shouldn’t I just dismiss my fall as an accident? I still run five kilometres three or four times a week. I work and I parent. I switch and click between being a wife, daughter, mother and friend. I am nowhere near the end of my productive life, as a writer or anything else. And yet I know as surely as day is not night that one season of my life has ended and another begun.

You might ask how I know, indeed, how anyone knows when they’ve arrived at middle age. I’ll admit that it remains fuzzy as to whether middle age qualifies as a biologically distinct phase of life (one that comes with its own neurological and biochemical map) or is just a label we give to a period of mental adjustment that helps us accommodate vague feelings of loss. Then again, perhaps it is merely a socio-cultural construction, no more trustworthy than any marketing category: a shorthand way of dividing people up by their attitudes and lifestyle choices?

When the term “middle age” came into general use in the late 19th century, it was principally in a socio-economic setting. Empire and industrialisation had expanded and enriched the middle classes, and women who had finished raising children could enjoy another decade or two of vigour and relevance. Middle age was actually admired: these women were mature, worldly creatures who had, as the modern saying goes, “freedom to” as well as “freedom from”. The negative tarnish came with the mass production of the 1920s and the theories of scientific management that underpinned it, sharpening our association of youth with productivity and middle age with decreasing efficiency.

You could argue that middle age is thoroughly overdetermined, as Simone de Beauvoir seemed to suggest in The Coming of Age. Writing in 1970, when she was 62, de Beauvoir pushed back against a quiescent society that expected people to grow more “serene” as they grew older. With measured eloquence, she wheeled in whole bodies of literature and philosophy to swat down this idea of resigned acceptance. Instead, she argued, we should accommodate old age through a process of continual, consciously engaged modification. “Life is an unstable system in which balance is continually lost and continually recovered,” she wrote.

This chimes with my sense that we shift this way and that – sometimes literally, as with my fall – before correcting for overzealousness or caution. Though de Beauvoir was writing more about old age than middle age, her labelling of bodily decline, economic redundancy and social marginalisation as important parameters in defining how we age fits with the idea that entering middle age is a kind of subjective reckoning. I’m picturing a Venn diagram that captures the intersection of de Beauvoir’s three factors: middle age is that shady area where the circles overlap. It’s a dappled spot, where the light is fading and the chill of winter starts to set in. The specific age at which we enter this penumbra is different for each of us, but the common quality is a profound sense of alteration and a dawning understanding, dim at first, that there is no point of re-entry to the bright terrain of youth.

In the past year, Penelope Lively, Julia Twigg, Lynne Segal, Anne Karpf, Angela Neustatter and a clutch of their American peers all published books on ageing, attempting to pick up where de Beauvoir left off. These women are a generation older than I am. They’ve been through the wars – menopause, middle age – and emerged unscathed. Now they claim to be wiser, happier, bolder, calmer, more flexible, open and, in some cases, more in touch with youth than before. They offer a relentless good cheer, as if it were permissible to write about late life only by becoming your own superheroine. And they appear to have signed up, one and all, to the delusional idea that you are only as old as you feel.

And it really is delusional. My own mother, youthful in mind as she ever was, would guffaw if, in the face of her ongoing problems with mobility, memory loss and regular if episodic bereavement, I attempted to console her by announcing that 80 was the new 70. In the developed world, average life expectancy continues to hover around the late-seventies mark (not far off the Psalmist’s three score and ten), while in developing countries it is much lower. At 82, my mother acknowledges that she’s into borrowed time, like those statistical outliers who live beyond 90 and 100 and skew popular perceptions of this ultimate numbers game. Yes, medicine has increased life expectancy – but not as much, or as broadly, as one might think.

As I gear up to turn 50 this summer, what has lodged in my mind is this: that it is a mathematical near-certainty that, with my next birthday, I will have passed the halfway mark. That from now on growing older will be less about marking the age I’ve arrived at than about counting down what is left. At 50 I will quite literally be over the hill; ahead of me, the incline runs downwards. And it doesn’t end well.

Last autumn, some 18 months after my fall, I had a hysterectomy. To be precise, it was a sub-total hysterectomy with bilateral salpingo-oophorectomy. All that’s left is my cervix and I kept that for sentimental reasons. It took weeks to recover from the surgery, during which time I experienced a full-bodied plunge into instant menopause. Joints were popping and bones aching. It was impossible to sleep. Every hour and a half, like clockwork, I’d wake up drenched in sweat, throw off my covers and run to the bathroom window to salute the moon – at least that’s how I now think of my stripped-down attempts at rapid cooling.

A kindly friend put a book called The Wisdom of Menopause into my hands, and I gratefully scurried away to prise out its time-trawled pearls. Sadly, this bestselling book by Christiane Northrup, MD, turned out to be an embittered tirade against marriage and family – as if our ties were good only for holding us back, rather than up – and, in the worst tradition of US self-help literature, it was lecturing and strident. More edifying was Jane Shilling’s melancholy and poetic memoir of midlife, The Stranger in the Mirror, a book honest enough to acknowledge the effrontery of ageing.

On top of affront, of course, there is grief, bewilderment, alienation, frustration – everything you might associate with being forced to cross a border into a foreign land only to be informed that you can never go back; that your passport has been torn up and your old home ransacked. As a new arrival in this strange nation, I wish to parse the experience before its lessons evaporate or transform. To that end, I have pressed into service oestrogen, my new drug of choice.

Oestrogen is the soft end of age-reversing remedies. It is marketed as “natural” – even though much of it is engineered in the laboratory using horse hormone primers. More slyly, it is billed as a “replacement” therapy, not a “supplement”. It replenishes depleted stores, topping up your parched system with nothing more than you already had. Like a debt repaid: you’re entitled to it.

And yet oestrogen’s effects are little short of miraculous. It strengthens nails and bones, boosts energy, lifts libido, makes your skin glow and your hair shine. After taking it for a month, I felt as though I’d been holidaying in Thailand. After two, as if I’d just passed my MOT. And I’ve been evangelising about oestrogen ever since, shamelessly pushing it on friends overcome by fatigue, hot flushes, mood swings and insomnia – friends who, like me, are aghast that instead of gently drifting into midlife, midlife has rudely flung itself at them, exploding like a bag of flour.

Using oestrogen is, however, hallucinogenic. Like taking morphine during labour, it insinuates a languorous pause into an otherwise relentless process. Oestrogen heightens my sense of being at a threshold that demands I make conscious decisions about how to tackle ageing. It is in my power now (for as long as I take the stuff) to call the shots on how rapidly I’m willing to let go of my youth. But what exactly should I do? And where should I draw the line? Choice, however illusory, has entered the equation – and with choice comes temptation.

I can see, for example, how easy it might be to do just a little something. A tiny nip and tuck here, a harmless injection there; a barely noticeable lift, suction or augmentation. These reveries of self-improvement taunt me periodically, though they are quickly checked whenever I come across monstrous images of, say, Madonna, her face distorted by prosthetics or fillers, and the fine line between surgery and butchery is brought home with the thumping finality of a cleaver hitting the block.

Although I can see through such determined resistance to ageing to the inner weakness it betrays, I don’t believe for a minute that the smugness that comes with self-denial is any better – all those go-grey campaigners getting off on feeling superior to women who faff around with hair dye. It’s such a Pyrrhic victory. Unless they ditch their granny-like pieties for the unruly witchiness championed by the likes of Germaine Greer, I feel they’ve nothing to teach me.

Besides, when I journey down that path of imaginative projection, promising myself I will stop hurling spokes into the spinning dials of my body clock, I find that I’m still far from happy about ageing. I feel unprepared for it. Caught on the hop. Exposed. Most of the time I pretend it isn’t happening, only to be pulled up short by that terrible sense of dissonance occasioned a) by a chance encounter with a mirror, and b) by friends I haven’t seen in a while, when the unchanging, inner me (source of identity, stability, comfort) is forced to confront a visible exterior that’s been subjected to a Dorian Gray-style makeover.

“You look exactly the same,” a friend I’d not seen in a decade told me recently. “Only fuller.” What stung most was that he did look exactly the same. He didn’t even have the greying temples that supposedly confer “dignity” on middle-aged men. Of course men, accustomed in their prime to greater social and economic power than women, often fall very hard in midlife, not least because there are fewer routes to self-reinvention open to them as they age than reveal themselves to women by way of grandmothering, voluntary work, or the Women’s Institute and its modern analogues of baking, knitting, music or gossip circles. (My mother has developed a whole new eightysomething network through playing bridge, which is a 90 per cent female pursuit, as far as I can tell.)

Lonesome or not, men still manage to remain visible as they age, while women are quietly removed from view, especially in high-visibility professions such as the stage and media. Last year the actress Kristin Scott Thomas was widely reported complaining that in midlife she is no longer seen. “Somehow, you just vanish,” she said. You talk and people affect not to hear you. Or they bump into you in the street. Her disclosure struck terror into the heart of every middle-aged woman I know: if someone as blindingly gorgeous and talented as Scott Thomas could disappear, what hope was there for the rest of us?

The serious point about being invisible is the poverty of viable alternatives. You might think, on the plus side, that if you are beneath regard there is no pressure to conform, or even behave. You can thumb your nose at convention and no one will chide you for it. Like the mischievous old woman in Jenny Joseph’s poem, who promises to rattle her stick along the railings and blow her pension on brandy and fancy gloves, you can make up in midlife for the sobriety of your youth.

But although this – what to call it . . . freedom by omission? – holds out the promise of gay abandon, I’m not convinced that the solution to the painfulness of moving forward is a simple flip into reverse gear. Jenny Joseph’s idealisations of a second childhood (who else but children can be so irresponsible?) are ultimately infantilising. Yet the Loose Women nudge-and-wink alternative of turbocharging sexuality on the other side of fertility feels too much like parody.

The trouble with such attempts to reset the clock is that they play directly into societal pressures that keep women perpetually on the back foot. In our post-industrial society, which demands that we keep redundancy at bay by working ever longer hours for a greater number of years, it becomes imperative to prove that you’re still in the game. That you can keep up with younger colleagues, work nights and weekends. That you can innovate and adapt – else those new brooms will sweep you aside faster than you can say Rip Van Winkle.

I’m not sure how, in this brave new world, where economic efficiency is the true driver behind age-appropriate expectations of how to behave, middle-aged women are supposed to find their way. But I do know that falling is out of the question.

Marina Benjamin is the author of “Rocket Dreams” and “Last Days in Babylon” and is a senior editor on Aeon Magazine. She tweets as @marinab52

This article first appeared in the 08 May 2014 issue of the New Statesman, India's worst nightmare?

An artist's version of the Reichstag fire, which Hitler blamed on the communists. CREDIT: DEZAIN UNKIE/ ALAMY

The art of the big lie: the history of fake news

From the Reichstag fire to Stalin’s show trials, the craft of disinformation is nothing new.

We live, we’re told, in a post-truth era. The internet has hyped up postmodern relativism, and created a kind of gullible cynicism – “nothing is true, and who cares anyway?” But the thing that exploits this mindset is what the Russians call dezinformatsiya. Disinformation – strategic deceit – isn’t new, of course. It has played a part in the battle that has raged between mass democracy and its enemies since at least the First World War.

Letting ordinary people pick governments depends on shared trust in information, and this is vulnerable to attack – not just by politicians who want to manipulate democracy, but by those on the extremes who want to destroy it. In 1924, the first Labour government faced an election. With four days to go, the Daily Mail published a secret letter in which the leading Bolshevik Grigory Zinoviev heralded the government’s treaties with the Soviets as a way to help recruit British workers for Leninism. Labour’s vote actually went up, but the Liberal share collapsed, and the Conservatives returned to power.

We still don’t know exactly who forged the “Zinoviev Letter”, even after exhaustive investigations of British and Soviet intelligence archives in the late 1990s by the then chief historian of the Foreign Office, Gill Bennett. She concluded that the most likely culprits were White Russian anti-Bolsheviks, outraged at Labour’s treaties with Moscow, probably abetted by sympathetic individuals in British intelligence. But whatever the precise provenance, the case demonstrates a principle that has been in use ever since: cultivate your lie from a germ of truth. Zinoviev and the Comintern were actively engaged in trying to stir revolution – in Germany, for example. Those who handled the letter on its journey from the forger’s desk to the front pages – MI6 officers, Foreign Office officials, Fleet Street editors – were all too ready to believe it, because it articulated their fear that mass democracy might open the door to Bolshevism.

Another phantom communist insurrection opened the way to a more ferocious use of disinformation against democracy. On the night of 27 February 1933, Germany’s new part-Nazi coalition was not yet secure in power when news started to hum around Berlin that the Reichstag was on fire. A lone left-wing Dutchman, Marinus van der Lubbe, was caught on the site and said he was solely responsible. But Hitler assumed it was a communist plot, and seized the opportunity to do what he wanted to do anyway: destroy them. The suppression of the communists was successful, but the claim it was based on rapidly collapsed. When the Comintern agent Georgi Dimitrov was tried for organising the fire, alongside fellow communists, he mocked the charges against him, which were dismissed for lack of evidence.

Because it involves venturing far from the truth, disinformation can slip from its authors’ control. The Nazis failed to pin blame on the communists – and then the communists pinned blame on the Nazis. Dimitrov’s comrade Willi Münzenberg swiftly organised propaganda suggesting that the fire was too convenient to be Nazi good luck. A “counter-trial” was convened in London; a volume called The Brown Book of the Reichstag Fire and Hitler Terror was rushed into print, mixing real accounts of Nazi persecution of communists – the germ of truth again – with dubious documentary evidence that they had started the fire. Unlike the Nazis’ disinformation, this version stuck, for decades.

Historians such as Richard Evans have argued that both stories about the fire were false, and it really was one man’s doing. But this case demonstrates another disinformation technique still at work today: hide your involvement behind others, as Münzenberg did with the British great and good who campaigned for the Reichstag prisoners. In the Cold War, the real source of disinformation was disguised with the help of front groups, journalistic “agents of influence”, and the trick of planting a fake story in an obscure foreign newspaper, then watching as the news agencies picked it up. (Today, you just wait for retweets.)

In power, the Nazis made much use of a fictitious plot that did, abominably, have traction: The Protocols of the Elders of Zion, a forged text first published in Russia in 1903, claimed to be a record of a secret Jewish conspiracy to take over the world – not least by means of its supposed control of everyone from bankers to revolutionaries. As Richard Evans observes, “If you subject people to a barrage of lies, in the end they’ll begin to think well maybe they’re not all true, but there must be something in it.” In Mein Kampf, Hitler argued that the “big lie” always carries credibility – an approach some see at work not only in the Nazis’ constant promotion of the Protocols but in the pretence that their Kristallnacht pogrom in 1938 was spontaneous. (It is ironic that Hitler coined the “big lie” as part of an attack on the Jews’ supposed talent for falsehood.) Today, the daring of the big lie retains its force: even if no one believes it, it makes smaller untruths less objectionable in comparison. It stuns opponents into silence.

Unlike the Nazis, the Bolshevik leaders were shaped by decades as hunted revolutionaries, dodging the Tsarist secret police, who themselves had had a hand in the confection of the Protocols. They occupied the paranoid world of life underground, governed by deceit and counter-deceit, where any friend could be an informer. By the time they finally won power, disinformation was the Bolsheviks’ natural response to the enemies they saw everywhere. And that instinct endures in Russia even now.

In a competitive field, perhaps the show trial is the Soviet exercise in upending the truth that is most instructive today. These sinister theatricals involved the defendants “confessing” their crimes with great sincerity and detail, even if the charges were ludicrous. By 1936, Stalin felt emboldened to drag his most senior rivals through this process – starting with Grigory Zinoviev.

The show trial is disinformation at its cruellest: coercing someone falsely to condemn themselves to death, in so convincing a way that the world’s press writes it up as truth. One technique involved was perfected by the main prosecutor, Andrey Vyshinsky, who bombarded the defendants with insults such as “scum”, “mad dogs” and “excrement”. Besides intimidating the victim, this helped to distract attention from the absurdity of the charges. Barrages of invective on Twitter are still useful for smearing and silencing enemies.


The show trials were effective partly because they deftly reversed the truth. While conspiring to destroy the defendants, Stalin accused them of conspiring to destroy him. He imposed impossible targets on straining Soviet factories; when accidents followed, the managers were forced to confess to “sabotage”. Like Hitler, Stalin made a point of saying the opposite of what he did. In 1936, the first year of the Great Terror, he had a rather liberal new Soviet constitution published. Many in the West chose to believe it. As with the Nazis’ “big lie”, shameless audacity is a disinformation strategy in itself. It must have been hard to accept that any regime could compel such convincing false confessions, or fake an entire constitution.

No one has quite attempted that scale of deceit in the post-truth era, but reversing the truth remains a potent trick. Just think of how Donald Trump countered the accusation that he was spreading “fake news” by making the term his own – turning the charge on his accusers, and even claiming he’d coined it.

Post-truth describes a new abandonment of the very idea of objective truth. But George Orwell was already concerned that this concept was under attack in 1946, helped along by the complacency of dictatorship-friendly Western intellectuals. “What is new in totalitarianism,” he warned in his essay “The Prevention of Literature”, “is that its doctrines are not only unchallengeable but also unstable. They have to be accepted on pain of damnation, but on the other hand they are always liable to be altered on a moment’s notice.”

A few years later, the political theorist Hannah Arendt argued that Nazis and Stalinists, each immersed in their grand conspiratorial fictions, had already reached this point in the 1930s – and that they had exploited a similar sense of alienation and confusion in ordinary people. As she wrote in her 1951 book, The Origins of Totalitarianism: “In an ever-changing, incomprehensible world the masses had reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and that nothing was true.” There is a reason that sales of Arendt’s masterwork – and Orwell’s Nineteen Eighty-Four – have spiked since November 2016.

During the Cold War, as the CIA got in on the act, disinformation became less dramatic, more surreptitious. But show trials and forced confessions continued. During the Korean War, the Chinese and North Koreans induced a series of captured US airmen to confess to dropping bacteriological weapons on North Korea. One lamented that he could barely face his family after what he’d done. The pilots were brought before an International Scientific Commission, led by the eminent Cambridge scientist Joseph Needham, which investigated the charges. A documentary film, Oppose Bacteriological Warfare, was made, showing the pilots confessing and Needham’s Commission peering at spiders in the snow. But the story was fake.

The germ warfare hoax was a brilliant exercise in turning democracy’s expectations against it. Scientists’ judgements, campaigning documentary, impassioned confession – if you couldn’t believe all that, what could you believe? For the genius of disinformation is that even exposure doesn’t disable it. All it really has to do is sow doubt and confusion. The story was finally shown to be fraudulent in 1998, through documents transcribed from Soviet archives. The transcripts were authenticated by the historian Kathryn Weathersby, an expert on the archives. But as Dr Weathersby laments, “People come back and say ‘Well, yeah, but, you know, they could have done it, it could have happened.’”

There’s an insidious problem here: the same language is used to express blanket cynicism as empirical scepticism. As Arendt argued, gullibility and cynicism can become one. If opponents of democracy can destroy the very idea of shared, trusted information, they can hope to destabilise democracy itself.

But there is a glimmer of hope here too. The fusion of cynicism and gullibility can also afflict the practitioners of disinformation. The most effective lie involves some self-deception. The show trial victims seem to have internalised the accusations against them, at least for a while; so, too, did their tormentors. As the historian Robert Service has written, “Stalin frequently lied to the world when he was simultaneously lying to himself.”

Democracy might be vulnerable because of its reliance on the idea of shared truth – but authoritarianism has a way of undermining itself by getting lost in its own fictions. Disinformation is not only a danger to its targets. 

Phil Tinline’s documentary “Disinformation: A User’s Guide” will be broadcast on BBC Radio 4 at 8pm, 17 March
