
John Pilger on the Dagan Plan and Gaza under fire

Every war Israel has waged since 1948 has had the same objective: expulsion of the native people. 

"When the truth is replaced by silence," the Soviet dissident Yevgeny Yevtushenko said, "the silence is a lie." It may appear that the silence on Gaza is broken. The small cocoons of murdered children, wrapped in green, together with boxes containing their dismembered parents, and the cries of grief and rage of everyone in that death camp by the sea can be witnessed on al-Jazeera and YouTube, even glimpsed on the BBC. But Russia's incorrigible poet was not referring to the ephemera we call news; he was asking why those who knew the why never spoke it, and so denied it. Among the Anglo-American intelligentsia, this is especially striking. It is they who hold the keys to the great storehouses of knowledge: the historiographies and archives that lead us to the why.

They know that the horror now raining on Gaza has little to do with Hamas or, absurdly, "Israel's right to exist". They know the opposite to be true: that Palestine's right to exist was cancelled 61 years ago and that the expulsion and, if necessary, extinction of the indigenous people was planned and executed by the founders of Israel. They know, for example, that the infamous "Plan D" of 1947-48 resulted in the murderous depopulation of 369 Palestinian towns and villages by the Haganah (the Jewish paramilitary that became the Israeli army) and that massacre upon massacre of Palestinian civilians in such places as Deir Yassin, al-Dawayima, Eilaboun, Jish, Ramle and Lydda are referred to in official records as "ethnic cleansing". Arriving at a scene of this carnage, David Ben-Gurion, Israel's first prime minister, was asked by a general, Yigal Allon: "What shall we do with the Arabs?" Ben-Gurion, reported the Israeli historian Benny Morris, "made a dismissive, energetic gesture with his hand and said, 'Expel them'".

The order to expel an entire population "without attention to age" was signed by Yitzhak Rabin, a future prime minister promoted by the world's most efficient propaganda as a peacemaker. The terrible irony of this was addressed only in passing, such as when the Mapam party co-leader Meir Ya'ari noted "how easily" Israel's leaders spoke of how it was "possible and permissible to take women, children and old men and to fill the road with them because such is the imperative of strategy. And this we say . . . who remember who used this means against our people during the [Second World] War . . . I am appalled."

Every subsequent "war" Israel has waged has had the same objective: the expulsion of the native people and the theft of more and more land. The lie of David and Goliath, of perennial victim, reached its apogee in 1967 when the propaganda became a righteous fury that claimed the Arab states had struck first against Israel. Since then, mostly Jewish truth-tellers such as Avi Shlaim, Noam Chomsky, Tanya Reinhart, Neve Gordon, Tom Segev, Uri Avnery, Ilan Pappé and Norman Finkelstein have undermined this and other myths and revealed a state shorn of the humane traditions of Judaism, whose unrelenting militarism is the sum of an expansionist, lawless and racist ideology called Zionism. "It seems," wrote the Israeli historian Pappé on 2 January, "that even the most horrendous crimes, such as the genocide in Gaza, are treated as discrete events, unconnected to anything that happened in the past and not associated with any ideology or system . . . Very much as the apartheid ideology explained the oppressive policies of the South African government, this ideology - in its most consensual and simplistic variety - allowed all the Israeli governments in the past and the present to dehumanise the Palestinians wherever they are and strive to destroy them. The means altered from period to period, from location to location, as did the narrative covering up these atrocities. But there is a clear pattern [of genocide]."

In Gaza, the enforced starvation and denial of humanitarian aid, the piracy of life-giving resources such as fuel and water, the denial of medicines, the systematic destruction of infrastructure and killing and maiming of the civilian population, 50 per cent of whom are children, fall within the international standard of the Genocide Convention. "Is it an irresponsible overstatement," asked Richard Falk, UN special rapporteur for human rights in the occupied Palestinian territories and international law authority at Princeton University, "to associate the treatment of Palestinians with this criminalised Nazi record of collective atrocity? I think not."

In describing a “holocaust-in-the-making”, Falk was alluding to the Nazis’ establishment of Jewish ghettos in Poland. For one month in 1943, the captive Polish Jews, led by Mordechaj Anielewicz, fought off the German army and the SS, but their resistance was finally crushed and the Nazis exacted their final revenge. Falk is also a Jew. Today’s holocaust-in-the-making, which began with Ben-Gurion’s Plan D, is in its final stages. The difference today is that it is a joint US-Israeli project. The F-16 jet fighters, the 250lb “smart” GBU-39 bombs supplied on the eve of the attack on Gaza, having been approved by a Congress dominated by the Democratic Party, plus the annual $2.4bn in warmaking “aid”, give Washington de facto control. It beggars belief that President-elect Obama was not informed. Outspoken about Russia’s war in Georgia and the terrorism in Mumbai, Obama has maintained a silence on Palestine that marks his approval, which is to be expected, given his obsequiousness to the Tel Aviv regime and its lobbyists during the presidential campaign and his appointment of Zionists as his secretary of state and principal Middle East advisers. When Aretha Franklin sings “Think”, her wonderful 1960s anthem to freedom, at Obama’s inauguration on 20 January, I trust someone with the brave heart of Muntader al-Zaidi, the shoe-thrower, will shout: “Gaza!”

The asymmetry of conquest and terror is clear. Plan D is now "Operation Cast Lead", which is the unfinished "Operation Justified Vengeance". This was launched by Prime Minister Ariel Sharon in 2001 when, with George W Bush's approval, he used F-16s against Palestinian towns and villages for the first time.

 

In that same year, the authoritative Jane's Foreign Report disclosed that the Blair government had given Israel the "green light" to attack the West Bank after it was shown Israel's secret designs for a bloodbath. It was typical of new Labour's enduring complicity in Palestine's agony. However, the Israeli plan, reported Jane's, needed the "trigger" of a suicide bombing which would cause "numerous deaths and injuries [because] the 'revenge' factor is crucial". This would "motivate Israeli soldiers to demolish the Palestinians". What alarmed Sharon and the author of the plan, General Shaul Mofaz, then Israeli chief of staff, was a secret agreement between Yasser Arafat and Hamas to ban suicide attacks. On 23 November 2001 Israeli agents assassinated the Hamas leader Mahmoud Abu Hanoud and got their "trigger": the suicide attacks resumed in response to his killing.

Something uncannily similar happened on 4 November last year when Israeli special forces attacked Gaza, killing six people. Once again, they got their propaganda "trigger": a ceasefire sustained by the Hamas government - which had imprisoned its violators - was shattered as a result of the Israeli attacks, and home-made rockets were fired into what used to be called Palestine before its Arab occupants were "cleansed". On 23 December, Hamas offered to renew the ceasefire, but Israel's charade was such that its all-out assault on Gaza had been planned six months earlier, according to the Israeli daily Haaretz.

Behind this sordid game is the "Dagan Plan", named after General Meir Dagan, who served with Sharon during his bloody invasion of Lebanon in 1982. Now head of Mossad, the Israeli intelligence organisation, Dagan is the author of a "solution" that has brought about the imprisonment of Palestinians behind a ghetto wall snaking across the West Bank and in Gaza, now effectively a concentration camp. The establishment of a quisling government in Ramallah, under Mahmoud Abbas, is Dagan's achievement, together with a hasbara (propaganda) campaign, relayed through mostly supine, if intimidated, western media, notably in the US, which say Hamas is a terrorist organisation devoted to Israel's destruction and is to "blame" for the massacres and siege of its own people over two generations, since long before its creation. "We have never had it so good," said the Israeli foreign ministry spokesman Gideon Meir in 2006. "The hasbara effort is a well-oiled machine."

In fact, Hamas's real threat is its example as the Arab world's only democratically elected government, drawing its popularity from its resistance to the Palestinians' oppressor and tormentor. This was demonstrated when Hamas foiled a CIA coup in 2007, an event ordained in the western media as "Hamas's seizure of power". Likewise, Hamas is never described as a government, let alone democratic. Neither is its proposal of a ten-year truce reported as a historic recognition of the "reality" of Israel and support for a two-state solution with just one condition: that the Israelis obey international law and end their illegal occupation beyond the 1967 borders. As every annual vote in the UN General Assembly demonstrates, most states agree. On 4 January, the president of the General Assembly, Miguel d'Escoto, described the Israeli attack on Gaza as a "monstrosity".

When the monstrosity is done and the people of Gaza are even more stricken, the Dagan Plan foresees what Sharon called a "1948-style solution" - the destruction of all Palestinian leadership and authority, followed by mass expulsions into smaller and smaller "cantonments", and perhaps, finally, into Jordan. This demolition of institutional and educational life in Gaza is designed to produce, wrote Karma Nabulsi, a Palestinian exile in Britain, "a Hobbesian vision of an anarchic society: truncated, violent, powerless, destroyed, cowed . . . Look to the Iraq of today: that is what [Sharon] had in store for us, and he has nearly achieved it."

Dr Dahlia Wasfi is an American writer on Iraq and Palestine. She has a Jewish mother and an Iraqi Muslim father. "Holocaust denial is anti-Semitic," she wrote on 31 December. "But I'm not talking about the World War II, Mahmoud Ahmadinejad [the president of Iran] or Ashkenazi Jews. What I'm referring to is the holocaust we are all witnessing and responsible for in Gaza today and in Palestine over the past 60 years . . . Since Arabs are Semites, US-Israeli policy doesn't get more anti-Semitic than this." She quoted Rachel Corrie, the young American who went to Palestine to defend Palestinians and was crushed by an Israeli bulldozer. "I am in the midst of a genocide," wrote Corrie, "which I am also indirectly supporting, and for which my government is largely responsible."

Reading the words of both, I am struck by the use of "responsibility". Breaking the lie of silence is not an esoteric abstraction, but an urgent responsibility that falls to those with the privilege of a platform. With the BBC cowed, so too is much of journalism, merely allowing vigorous debate within unmovable, invisible boundaries, ever fearful of the smear of anti-Semitism. The unreported news, meanwhile, is that the death toll in Gaza is the equivalent of 18,000 dead in Britain. Imagine, if you can.

Then there are the academics, the deans and teachers and researchers. Why are they silent as they watch a university bombed and hear the Association of University Teachers in Gaza plead for help? Are British universities now, as Terry Eagleton believes, no more than “intellectual Tescos, churning out a commodity known as graduates rather than greengroceries”?

Then there are the writers. In the dark year of 1939, the Third American Writers' Congress was held at Carnegie Hall in New York and the likes of Thomas Mann and Albert Einstein sent messages and spoke up to ensure that the lie of silence was broken. By one account, 2,500 jammed the auditorium. Today, this mighty voice of realism and morality is said to be obsolete; the literary review pages affect an ironic hauteur of irrelevance; false symbolism is all. As for the readers, their moral and political imagination is to be pacified, not primed. The anti-Muslim Martin Amis expressed this well in Visiting Mrs Nabokov: "The dominance of the self is not a flaw, it is an evolutionary characteristic; it is just how things are."

If that is how things are, we are diminished as a civilised people. For what happens in Gaza is the defining moment of our time, which either grants war criminals impunity and immunity through our silence, while we contort our own intellect and morality, or it gives us the power to speak out. For the moment I prefer my own memory of Gaza: of the people's courage and resistance and their "luminous humanity", as Karma Nabulsi put it. On my last trip there, I was rewarded with a spectacle of Palestinian flags fluttering in unlikely places. It was dusk and children had done this. No one had told them to do it. They made flagpoles out of sticks tied together, and a few of them climbed on to a wall and held the flag between them, some silently, others crying out. They do this every day when they know foreigners are leaving, in the belief that the world will not forget them.

John Pilger, renowned investigative journalist and documentary film-maker, is one of only two to have twice won British journalism's top award; his documentaries have won academy awards in both the UK and the US. In a New Statesman survey of the 50 heroes of our time, Pilger came fourth behind Aung San Suu Kyi and Nelson Mandela. "John Pilger," wrote Harold Pinter, "unearths, with steely attention to facts, the filthy truth. I salute him."

This article first appeared in the 12 January 2009 issue of the New Statesman, The destruction of Gaza


Head in the cloud

As we download ever more of our lives on to electronic devices, are we destroying our own internal memory?

I do not remember my husband’s telephone number, or my best friend’s address. I have forgotten my cousin’s birthday, my seven times table, the date my grandfather died. When I write, I keep at least a dozen internet tabs open to look up names and facts I should easily be able to recall. There are so many things I no longer know, simple things that matter to me in practical and personal ways, yet I usually get by just fine. Apart from the few occasions when my phone has run out of battery at a crucial moment, or the day I accidentally plunged it into hot tea, or the evening my handbag was stolen, it hasn’t seemed to matter that I have downloaded most of my working memory on to electronic devices. It feels a small inconvenience, given that I can access information equivalent to tens of billions of books on a gadget that fits into my back pocket.

For thousands of years, human beings have relied on stone tablets, scrolls, books or Post-it notes to remember things that their minds cannot retain, but there is something profoundly different about the way we remember and forget in the internet age. It is not only our memory of facts that is changing. Our episodic memory, the mind’s ability to relive past experiences – the surprising sting of an old humiliation revisited, the thrill and discomfort of a first kiss, those seemingly endless childhood summers – is affected, too. The average Briton now spends almost nine hours a day staring at their phone, computer or television, and when more of our lives are lived on screen, more of our memories will be formed there. We are recording more about ourselves and our experiences than ever before, and though in the past this required deliberate effort, such as sitting down to write a diary, or filing away a letter, or posing for a portrait, today this process can be effortless, even unintentional. Never before have people had access to such comprehensive and accurate personal histories – and so little power to rewrite them.

My internet history faithfully documents my desktop meanderings, even when I resurface from hours of browsing with little memory of where I have been or what I have read. My Gmail account now contains over 35,000 emails received since 2005. It has preserved the banal – long-expired special offers, obsolete arrangements for post-work drinks – alongside the life-changing. Loves and break-ups are chronicled here; jobs, births and weddings are announced; deaths are grieved. My Facebook profile page has developed into a crowdsourced, if assiduously edited, photo album of my social life over the past decade. My phone is a museum of quick-fire text exchanges. With a few clicks, I can retrieve, in mind-numbing detail, information about my previous movements, thoughts and feelings. So could someone else. Even my most private digital memories are not mine alone. They have become data to be restructured, repackaged, aggregated, copied, deleted, monetised or sold by internet firms. Our digital memories extend far beyond our reach.

In the late 1990s the philosophers Andy Clark and David Chalmers coined the term “the extended mind” to describe how, when we use pen and paper, calculators, or laptops to help us think or remember, these external objects are incorporated into our cognitive processes. “The technology we use becomes part of our minds, extending our minds and indeed our selves into the world,” Chalmers said in a 2011 TED talk. Our iPhones have not been physically implanted into our brains, he explained, but it’s as if they have been. There’s a big difference between offloading memory on to a notepad and doing it on to a smartphone. One is a passive receptacle, the other is active. A notebook won’t reorganise the information you give it or ping you an alert; its layout and functions won’t change overnight; its contents aren’t part-owned by the stationery firm that made it. The more we extend our minds online, the harder it is becoming to keep control of our digital pasts, or to tell where our memories begin or end. And, while society’s collective memory is expanding at an astonishing rate, our internal, individual ones are shrinking.

***

Our brains are lazy; we are reluctant to remember things when we can in effect delegate the task to someone or something else. You can observe this by listening to couples, who often consult one another’s memories: “What was the name of that nice Chinese restaurant we went to the other day?” Subconsciously, partners distribute responsibility for remembering information according to each other’s strengths. I ask my husband for directions, he consults me on people’s names.

In one study conducted in 1991, psychologists assigned a series of memory exercises to pairs of students, some of whom had been dating for at least three months and some of whom did not know one another. The dating couples remembered more than the non-dating pairs. They also remembered more unique information; when a fact fell into their partner’s area of expertise, they were more likely to forget it.

In a similar way, when we know that a computer can remember something for us we are less likely to remember it ourselves. For a study published in the journal Science in 2011, people were asked to type some trivia facts into a computer. Those who believed the facts would be saved at the end of the experiment remembered less than those who thought they would be deleted – even when they were explicitly asked to memorise them. In an era when technology is doing ever more remembering, it is unsurprising that we are more inclined to forget.

It is sometimes suggested that in time the worry that the internet is making us forgetful will sound as silly as early fears that books would do the same. But the internet is not an incremental step in the progression of written culture: it is revolutionising the way we consume information. When you pull an encyclopaedia down from a library shelf, it is obvious that you are retrieving a fact you have forgotten, or never knew. Google is so fast and easy to use that we can forget we have consulted it at all: we are at risk of confusing the internet’s memory with our own. A Harvard University project in 2013 found that when people were allowed to use Google to check their answers to trivia questions they rated their own intelligence and memories more highly – even if they were given artificially low test results. Students more often believed that Google was confirming a fact they already knew, rather than providing them with new information.

This changed when Adrian Ward, now an assistant professor at the University of Texas at Austin, who designed the study as part of his PhD research, mimicked a slow internet connection so that students were forced to wait 25 seconds to read the answer to a Google query. The delay, he noted, stripped them of the “feeling of knowing” because they became more aware that they were consulting an external source. In the internet age, Ward writes, people “may offload more and more information while losing sight of the distinction between information stored in their minds and information stored online”.

By blurring the distinction between our personal and our digital memories, modern technology could encourage intellectual complacency, making people less curious about new information because they feel they already know it, and less likely to pay attention to detail because our computers are remembering it. What if the same could be said for our own lives: are we less attentive to our experiences because we know that computers will record them for us?

An experiment by the American psychologist Linda Henkel suggests this could be the case; she has found that when people take photographs at museums they are more likely to forget details of what they have seen. To some extent, we’re all tourists exploring the world from behind a camera, too distracted by our digital memories to inhabit our analogue lives fully. Relying on computers to remember telephone numbers or trivia does not seem to deprive our internal memories of too much – provided you can remember where you’ve stored it, factual information is fairly straightforward to retrieve. Yet a digital memory is a poor substitute for the richness of a personal experience revisited, and our autobiographical memories cannot be “retrieved” by opening the relevant online file.

Our relationship with the past is capricious. Sometimes an old photograph can appear completely unfamiliar, while at other times the faintest hint – the smell of an ex-lover’s perfume on a crowded Tube carriage – can induce overwhelming nostalgia. Remembering is in part a feeling, of recognition, of having been there, of reinhabiting a former self. This feeling is misleading; we often imagine memories offer an authentic insight into our past. They do not.

Memory is closely linked to self-identity, but it is a poor personal record. Remembering is a creative act. It is closely linked to imagining. When people suffer from dementia they are often robbed not only of the past but also of the future; without memory it is hard to construct an idea of future events. We often mistakenly convert our imaginings into memories – scientists call the process “imagination inflation”. This puts biological memories at odds with digital ones. While memories stored online can be retrieved intact, our internal memories are constantly changing and evolving. Each time we relive a memory, we reconfigure it to suit our present needs and world-view. In his book Pieces of Light, an exploration of the new science of memory, the neuroscientist Charles Fernyhough compares the construction of memory to storytelling. To impose meaning on to our chaotic, complex lives we need to know which sections to abridge and which details can be ignored. “We are all natural born storytellers. We are constantly editing and remaking our memory stories as our knowledge and emotions change. They may be fictions, but they are our fictions,” Fernyhough writes.

We do not write these stories alone. The human mind is suggestible. In 2013, scientists at MIT made international headlines when they said they had successfully implanted a false memory into a mouse using a form of light stimulation, but human beings implant false memories into each other all the time, using more low-tech methods. Friends and family members are forever distorting one another’s memories. I remember distinctly being teased for my Dutch accent at school and indignantly telling my mother when I arrived home that, “It’s pronounced one, two, three. Not one, two, tree.” My brother is sure it was him. The anecdote is tightly woven into the story of our pasts, but one of us must be wrong. When we record our personal memories online we open up new possibilities for their verification but we also create different opportunities for their distortion. In subtle ways, internet firms are manipulating our digital memories all the time – and we are often dangerously unaware of it.

***

Facebook occasionally gives me a reminder of Mahmoud Tlissy, the caretaker at my former office in Libya who died quietly of pancreatic cancer in 2011 while the civil war was raging. Every so often he sends me a picture of a multicoloured heart via a free app that outlived him. Mahmoud was a kind man with a sardonic sense of humour, a deep smoker’s laugh and a fondness for recounting his wild days as a student in Prague. I am always pleased to be reminded of him, but I feel uncomfortable because I doubt he would have chosen such a naff way to communicate with me after death. Our digital lives will survive us, sending out e-hearts and populating databases long after we have dropped off the census. When we deposit our life memories online, they start to develop lives of their own.

Those who want to limit the extent to which their lives are recorded digitally are swimming against the tide. Internet firms have a commercial interest in encouraging us not only to offload more personal information online, but also to use digital technology to reflect on our lives. Take Facebook, which was developed as a means of communicating but is becoming a tool for remembering and memorialising, too. The first Facebook users, who were university students in 2004, are mostly in their thirties now. Their graduations, first jobs, first loves, marriages and first children are likely to be recorded on the site; friends who have died young are likely to be mourned on it. The website understands that nostalgia is a powerful marketing tool, and so it has released gimmicky tools, such as automated videos, to help people “look back”.

These new online forms of remembrance are becoming popular. On Instagram and Twitter it is common for users to post sentimental old snaps under the hashtag #tbt, which stands for “Throwback Thursday”. Every day, seven million people check Timehop, an app that says it “helps you see the best moments of your past” by showing you old tweets, photos and online messages. Such tools are presented as a way of enriching our ability to relive the past but they are limiting. We can use them to tell stories about our lives, but the pace and structure of the narrative is defined for us. Remembering is an imaginative act, but internet firms are selling nostalgia by algorithm – and we’re buying it.

At their most ambitious, tech companies are offering the possibility of objective and complete insight into our pasts. In the future, “digital memories” could “[enhance] personal reflection in much the same way as the internet has aided scientific investigations”, the computer scientists Gordon Bell and Jim Gemmell wrote in the magazine Scientific American in 2006. The assumption is that our complex, emotional autobiographic memories can be captured as data to be ordered, quantified and analysed – and that computer programs could make better sense of them than our own, flawed brains. The pair have been collaborating on a Microsoft Research “life-logging” project since 2001, in which Bell logs everything he has said, written, seen and heard into a specially designed database.

Bell understood that the greatest challenge is finding a way to make digital archives usable. Without a program to help us extract information, digital memories are virtually useless: imagine trying to retrieve a telephone number from a month’s worth of continuous video footage. In our increasingly life-logged futures, we will all depend on powerful computer programs to index, analyse, repackage and retrieve our digital memories for us. The act of remembering will become automated. We will no longer make our “own fictions”.

This might sound like a distant sci-fi fantasy, but we are already a long way there. Billions of people share their news and views by email or on social media daily, and unwittingly leave digital trails as they browse the web. The use of tracking devices to measure and record sleep, diet, exercise patterns, health and even mood is increasing. In the future, these comprehensive databases could prove very useful. When you go to the doctor, you might be able to provide details of your precise diet, exercise and sleep patterns. When a relationship breaks down you could be left with many gigabytes of digital memory to explore and make sense of. Did you really always put him down? Should you have broken up four years ago? In a few years’ time there could be an app for that.

Our reliance on digital memories is self-perpetuating: the more we depend on computer memories to provide us with detailed personal data, the more inadequate our own minds seem. Yet the fallibility of the human memory isn’t a design flaw, it is one of its best features. Recently, I typed the name of an ex-boyfriend into my Gmail search bar. This wasn’t like opening a box of old letters. For a start, I could access both sides of our email correspondence. Second, I could browse dozens of G-chats, instant messaging conversations so mundane and spontaneous that reading them can feel more like eavesdropping on a former self, or a stranger. The messages surprised me. I had remembered the relationship as short-lived and volatile but lacking any depth of feeling. So why had I sent those long, late-night emails? And what could explain his shorter, no less dramatic replies, “Will u ever speak to me again? You will ignore this I suspect but I love you.” Did he love me? Was I really so hurt? I barely recognise myself as the author of my messages; the feelings seem to belong to someone else.

My digital archives will offer a very different narrative from the half-truths and lies I tell myself, but I am more at home with my fictions. The “me” at the centre of my own memories is constantly evolving, but my digital identity is frozen in time. I feel a different person now; my computer suggests otherwise. Practically, this can pose problems (many of us are in possession of teenage social media posts we hope will never be made public) and psychologically it matters, too. To a greater or lesser extent, we all want to shed our former selves – but digital memories keep us firmly connected to our past. Forgetting and misremembering is a source of freedom: the freedom to reinvent oneself, to move on, to rewrite our stories. It means that old wounds need not hurt for ever, that love can be allowed to fade, that people can change.

With every passing year, we are shackling ourselves more tightly to our digital legacies, and relying more heavily on computer programs to narrate our personal histories for us. It is becoming ever harder to escape the past, or remake the future. Years from now, our digitally enhanced memories could allow us to have near-perfect recall, but who would want to live with their head in the cloud?

Sophie McBain is an NS contributing writer. This article was a runner-up in the 2015 Bodley Head/FT Essay Prize.

This article first appeared in the 18 February 2016 issue of the New Statesman, A storm is coming