
The hysteric moment

Novelists have increasingly faced the challenge of trying to compete with a culture that is a step ahead of them.

Just before Christmas five years ago, I spent an afternoon in the company of the novelist Zadie Smith and the literary critic James Wood (then of the New Republic, now of the New Yorker). I'd been asked by another magazine to oversee a conversation between Smith and Wood on "the future of the novel and the function of criticism". The idea was that they would continue a public colloquy that had begun in 2002, after Wood wrote a withering and pitiless review of Smith's second novel, The Autograph Man.

Wood had found the novel to be little more than a tissue of "smirking epigraphs" fatally in thrall to the example of American writers such as Dave Eggers and the late David Foster Wallace, of whom he mostly disapproved. Eggers and Wallace were practitioners of something he called "hysterical realism" and their novels burned brightly with an unnourishing sub-Dickensian dazzle. These were smart guys writing big, ambitious books that tried to do nothing less than pin down and analyse an entire culture. And while they were busy practising cultural theory by fictional means, the novel's traditional quarries of character and consciousness got left behind. (In fact, Wallace's case was much more complicated than Wood tended to make it seem, and he actually shared many of the critic's misgivings about the moral and aesthetic legacy of postmodernism, of which hysterical realism could be said to be a variant or tributary.)

By Wood's account, the "hysterical realist" novel - the novel of "information", which can't decide if its job is simply to reflect the cognitive superabundance of life under late capitalism or, as they say in seminar rooms from Berkeley to Bloomsbury, to critique it - had, by the early 2000s, become the dominant mode, or one of the dominant modes, in British and American fiction. And The Autograph Man, whose protagonist is a half-Chinese, half-Jewish dealer in the signatures of dead celebrities, faithfully mimics its most distinctive narrative tics - Smith is always pointing out, for instance, "that her characters, on the brink of a momentous access of feeling, are undermined by their sense that they are not being original, that TV has preceded them". An observation, Wood suggested, that it "may be time to retire".

That same afternoon, Smith told me that she had taken Wood's review "to heart" - and, indeed, you could see signs of this in several of the critical essays she wrote during this period for the Guardian and the New York Review of Books. These were much more likely to cite E M Forster than David Foster Wallace. There were further indications of this shift in her third novel, On Beauty, published in 2005, which was altogether more decorous than either The Autograph Man or her debut, White Teeth, and which she described as a "homage" to Forster (the book borrows its structure explicitly from Howards End). Now she and Wood were in agreement: "the culture [was] doing strange things to novels". Smith confessed that she found the "idea that you can't write a book without it being put through the processing machine of culture really quite frightening".

So, this was the sound of a generation discovering for itself a predicament described by Philip Roth in a celebrated essay published more than 40 years earlier, where he'd written that "the actuality is continually outdoing our talents, and the culture tosses up figures almost daily that are the envy of any novelist". For both Smith and Wood, none of their contemporaries had come closer to properly articulating these anxieties for the early 21st century than the American writer Jonathan Franzen. His sprawling third novel, The Corrections, published in 2001, was in part the product of several years' worth of agonised reflections on the place of fiction in a culture that was increasingly and aggressively indifferent to it.

In 1996, Franzen had written an essay for Harper's magazine, "Perchance to Dream", the arguments of which continued to reverberate in a certain stratum of the literary intelligentsia on both sides of the Atlantic in the early years of the new century. The "Harper's essay", as it became known, was both a 20-page howl of despair at the decline of the big, ambitious "social novel" that connects the personal with the societal and a kind of renunciation, in which Franzen declared that in fact the very idea of writing fiction which sought to "engage" with the culture should be given up, now that there are technologies - film and television, principally - that "do a much better job of social instruction".

That culture is so grossly productive of novelties that to engage with it, Franzen concluded, was to "risk writing fiction that [made] the same point over and over: technological consumerism is an infernal machine, technological consumerism is an infernal machine . . ." If the improving mission of the novel of social instruction was at an end, what was left was the solace of "sentences of such authenticity that refuge can be taken in them".

The Harper's essay wasn't merely programmatic, however. Much of its considerable interest lay in its account of the genesis of The Corrections (indeed, few reviewers were able to resist using the piece as a lens through which to view the novel). Franzen recalled being "paralysed" while writing what would become The Corrections. "I was torturing the story, stretching it to accommodate ever more of those things-in-the-world that impinge on the enterprise of fiction writing." He found that he couldn't help bulking up his "story" until it became "bloated with issues". Liberation, he implied, arrived once he realised he wasn't obliged to dramatise the "important issues of the day".

But The Corrections does not wholly succeed in extricating itself from this dilemma. Franzen found that it was much harder to give up the impulse to anatomise the culture than the Harper's essay had implied. And his failure to do so was symptomatic. "There are certain places in that novel," Smith said, "and I know I've written them myself in my novels, where the engagement is not with the novel as an organic form, with the characters, with the story, but is a matter of coming straight up to face the writer. It's not the novel I want to write and it's maybe not the novel a lot of people want to read any more. If the novel is going to stake its claim to being a separate part of the culture, then it needs not to be direct commentary."

It is tempting, in retrospect, to read those remarks of Smith's as setting out a programme - one that comes to fruition in "Two Directions for the Novel", an essay included in her most recent book, Changing My Mind. Here she describes Tom McCarthy's intricate, allegorical novel Remainder as an attempt to answer the question of how fiction might stake its claim to being a "separate part of the culture". But it is not clear from this how far she, and we, have travelled, because the dichotomy Smith presents - between the realist novel and the self-enclosed allegory - is pretty much the same one that Franzen was trying to think his way out of at the start of the decade.

Jonathan Derbyshire is culture editor of the New Statesman

This article first appeared in the 14 December 2009 issue of the New Statesman, The Muslim Jesus

Picture: Ralph Steadman

The age of disorder: why technology is the greatest threat to humankind

Disruptive technologies might change the very nature of humanity – and no nation can fight on its own.

Though human beings are social animals, for millions of years they lived in small, intimate communities numbering no more than a few dozen people. Even today, as the evolutionary biologist Robin Dunbar has shown, most human beings find it impossible properly to know more than 150 individuals, irrespective of how many Facebook “friends” they boast. Human beings easily develop loyalty to small, intimate groups such as a tribe, an infantry company or a family business, but it is hardly natural for them to be loyal to millions of strangers. Such mass loyalties have appeared only in the past few thousand years as a means of solving practical problems that no single tribe could solve by itself. Ancient Egypt was created to help human beings gain control of the River Nile, and ancient China coalesced to help the people restrain the turbulent Yellow River.

Nations solved some problems and created new ones. In particular, big nations led to big wars. Yet people were willing to pay the price in blood, because nations provided them with unprecedented levels of security and prosperity. In the 19th and early 20th centuries the nationalist deal still looked very attractive. Nationalism was leading to horrendous conflicts on an unprecedented scale, but modern nation states also built systems of health care, education and welfare. National health services made Passchendaele and Verdun seem worthwhile.

Yet the invention of nuclear weapons sharply tilted the balance of the deal. After Hiroshima, people no longer feared that nationalism would lead to mere war: they began to fear it would lead to nuclear war. Total annihilation has a way of sharpening people’s minds, and thanks in no small measure to the atomic bomb, the impossible happened and the nationalist genie was squeezed at least halfway back into its bottle. Just as the ancient villagers of the Yellow River Basin redirected some of their loyalty from local clans to a much bigger nation that restrained the dangerous river, so in the nuclear age a global community gradually developed over and above the various nations because only such a community could restrain the nuclear demon.

In the 1964 US presidential campaign, Lyndon B Johnson aired the “Daisy” advertisement, one of the most successful pieces of propaganda in the annals of television. The advert opens with a little girl picking and counting the petals of a daisy, but when she reaches ten, a metallic male voice takes over, counting back from ten to zero as in a missile launch countdown. When the count reaches zero, the bright flash of a nuclear explosion fills the screen, and Johnson addresses the American public: “These are the stakes – to make a world in which all of God’s children can live, or to go into the dark. We must either love each other. Or we must die.” We often associate the slogan “Make love, not war” with the late-1960s counterculture, but already in 1964 it was accepted wisdom, even among hard-nosed politicians such as Johnson.

During the Cold War, nationalism took a back seat to a more global approach to international politics, and when the Cold War ended, globalisation seemed to be the irresistible wave of the future. It was expected that humankind would leave nationalistic politics behind, as a relic of more primitive times that might appeal at most to the ill-informed inhabitants of a few underdeveloped countries. Events in 2016 proved, however, that nationalism still has a powerful hold even on the citizens of Europe and the United States, not to mention Russia, India and China. Alienated by the impersonal forces of global capitalism, and fearing for the fate of national systems of health, education and welfare, people all over the world seek reassurance and meaning in the bosom of the nation.

Yet the question raised by Johnson in the Daisy advertisement is even more pertinent today than it was in 1964. Will we make a world in which all human beings can live together, or will we all go into the dark? Can Donald Trump, Vladimir Putin and their like save the world by appealing to our national sentiments, or is the current nationalist spate a form of escapism from the intractable global problems we face?

***

Let’s start with nuclear war. When the Daisy advert aired, two years after the Cuban missile crisis, nuclear annihilation was a palpable threat. Pundits and laypeople alike feared that humankind did not have the wisdom to avert destruction, and that it was only a matter of time before the Cold War turned scorching hot. In fact, humankind successfully rose to the nuclear challenge. Americans, Soviets, Europeans and Chinese changed the way geopolitics had been conducted for millennia, so that the Cold War ended with little bloodshed, and a new internationalist world order fostered an era of unprecedented peace. Not only was nuclear war averted, but war of all kinds declined. Since 1945, surprisingly few borders have been redrawn through naked aggression, and most countries have ceased to use war as a standard political tool. In 2016, despite wars in Syria, Ukraine and other hot spots, fewer people died from human violence than from obesity, car accidents or suicide. This may well have been the greatest political and moral achievement of our times.

Unfortunately, we are so used to this achievement that we take it for granted. This is partly why people allow themselves to play with fire, and that includes not only the latest Russian adventures in eastern Europe and the Middle East, but also the choices made by European and American voters.

The Brexit debate in Britain revolved mainly around questions of economics and immigration, while the EU’s vital contribution to European and global peace has largely been ignored. After centuries of terrible bloodshed, the French, Germans, Italians and Britons have finally built a mechanism that ensures continental harmony – only to have the British public throw a wrench into the miracle machine. Meanwhile, Donald Trump mixes calls for US isolationism with plans to strengthen the country’s nuclear arsenal and reignite a nuclear arms race, thereby threatening to undo the hard-won gains of the past decades and bring us back to the brink of nuclear annihilation.

It was extremely difficult to construct the internationalist regime that prevented nuclear war and safeguarded global peace. No doubt we need to adapt this regime to changing conditions in the world: for example, by relying less on the United States and giving a greater role to non-Western powers such as China and India. But abandoning this regime altogether and reverting to nationalist power politics would be an irresponsible gamble.

True, in the past, countries played the nationalist politics game without destroying human civilisation. But that was in the pre-Hiroshima era. Since then, nuclear weapons have raised the stakes and changed the fundamental nature of war and politics. No matter whom American voters elect to the presidency, the atom bomb is still there and E still equals mc². As long as human beings know how to enrich uranium and plutonium, their survival will depend on privileging the prevention of nuclear war over the interests of any particular nation. Zealous nationalists should ask themselves whether their nation by itself, without a robust system of international co-operation, can protect the world – or even itself – from nuclear destruction.

On top of nuclear war, in the coming decades humankind will face a new threat to its existence that hardly registered on the political radar in 1964: climate change. If we continue with our present course it is likely that global warming, ocean acidification and ecological degradation will result in unprecedented economic, political and social problems, and might well destroy the foundations of human prosperity. What is the nationalist answer to climate change? How can any nation, however powerful, stop global warming on its own? Will the US build a wall against rising oceans?

When it comes to climate, countries are not sovereign, but are at the mercy of actions taken by governments on the other side of the planet. As long as 200 governments pursue 200 different ecological strategies, shaped by their unique needs and interests, none is likely to succeed. With present-day technology, any serious measures to stop global warming are likely to slow down economic growth. Such a policy carries an unbearable political price if it is undertaken by a single country while others continue with business as usual. Any US administration that deliberately slowed economic growth for environmental reasons would be bound to lose the next election; a Chinese administration that did so would court revolution tomorrow morning. In a nationalist and xenophobic world no government will sacrifice itself for the greater good of humanity, as Trump’s actions show.

***

Indeed, nationalism is even more dangerous in the context of climate change than in that of nuclear war. An atomic bomb is such an obvious and immediate threat that even the most ardent nationalist cannot ignore it. Global warming, by contrast, is a much vaguer and more protracted menace. Hence, whenever environmental considerations demand some painful sacrifice, nationalists will be tempted to put the national interest first, reassuring themselves that we can worry about the environment later, or just leave it to people elsewhere. Alternatively, as in the case of Trump, they may simply deny the problem. It isn’t a coincidence that scepticism about climate change is usually the preserve of nationalist politicians. They have no answer to the problem, and so they prefer to believe it does not exist.

The same dynamics are likely to spoil any nationalist antidote to the third large threat to human existence in the 21st century: technological disruption. New technologies, particularly in the fields of bioengineering and artificial intelligence, will soon give humankind unprecedented, godlike powers. Whereas previously human beings learned to produce food, weapons and vehicles, in the coming decades our main products will probably be bodies, brains and minds. However, it is extremely difficult to foresee the potential impact of such technologies. They open the door to an entire supermarket of doomsday scenarios.

If and when artificial intelligence (AI) surpasses human intelligence, it may be given control of weapon systems and crucial decisions, with potentially calamitous consequences. In addition, as AI outperforms human beings in ever more tasks, it might push billions of us out of the job market, creating a new “useless class” of people, devoid of both economic value and political power. Meanwhile, given enough biometric data and enough computing power, external algorithms could know us better than we know ourselves, and then governments and corporations could predict our decisions, manipulate our emotions and gain absolute control over our lives.

On the bioengineering front, breakthroughs in genetics, nanotechnology and direct brain/computer interfaces could unleash deadly new epidemics or disturb our internal mental balance. In past centuries we gained control of the world outside us and reshaped the planet, but because we didn’t understand the complexity of the global ecology, the changes we made inadvertently disrupted the entire ecological system. In the coming century we will gain control of the world inside us and reshape our bodies and brains, but because we don’t understand the complexity of our own minds, the changes we make might disrupt our mental system. In addition, bioengineering might for the first time in history translate economic inequality into biological inequality, creating an upper caste of enhanced superhumans, and relegating the poor to the dustbin of evolution.

What is the nationalist answer to these menaces? As in the case of global warming, so, too, with technological disruption: the nation state is the wrong framework to address the threat. Given that research and development are not the monopoly of any one country, even a superpower such as the US or China cannot restrict them by itself. If the US government forbids the genetic engineering of human embryos, it won’t prevent North Korean scientists from doing such work. And if the resulting developments confer on North Korea some crucial economic or military advantage, the US will be tempted to break its own ban. Particularly in a xenophobic, dog-eat-dog world, if even a single country chooses to pursue a high-risk, high-gain technological path, other countries will be forced to do the same, because nobody can afford to remain behind. In order to avoid such a race to the bottom, humankind will probably need some kind of global identity and loyalty.

Whereas nuclear war and climate change threaten only the physical survival of humankind, disruptive technologies might change the very nature of humanity, and are therefore entangled with human beings’ deepest ethical and religious beliefs. Although everyone agrees that we should avoid nuclear war and ecological meltdown, people have widely differing opinions about using bioengineering and AI to upgrade human beings and to create new life forms. If we fail to cobble together globally accepted ethical guidelines, it will be open season for Dr Frankenstein.

When it comes to formulating such ethical guidelines, nationalism suffers above all from a failure of the imagination. Nationalists think in terms of territorial conflicts lasting centuries, whereas the technological revolutions of the 21st century should be understood in cosmic terms. Ever since its appearance on Earth four billion years ago, life has been governed by the laws of natural selection: whether you were a virus or a dinosaur, you evolved according to its principles. And no matter what strange shapes life took, it remained confined to the organic realm – whether a cactus or a whale, you were made of organic compounds. Now science might replace natural selection with intelligent design, and might even start creating non-organic life forms, ushering in an era of inorganic life after four billion years of the organic kind. What has Israeli, Russian or French nationalism got to say about this? In order to make wise choices about the future of life we need to go way beyond the nationalist viewpoint and look at things from a much wider perspective.

***

The nationalist wave sweeping across the world cannot turn the clock back to 1939 or 1914. Technology has changed everything by creating a set of global threats to human existence that no nation can fight on its own. A common enemy is the best catalyst for forging a common identity, and humankind now has three such enemies – nuclear war, climate change and disruptive technology. If, despite these threats, we choose to privilege our particular national loyalties above everything else, the results may be far worse than in 1914 and 1939.

A much better path is the one outlined in the EU’s constitution, which states that “while remaining proud of their own national identities and history, the peoples of Europe are determined to transcend their former divisions and, united ever more closely, to forge a common destiny”. There is still plenty of room in the world for the kind of patriotism that celebrates the uniqueness of my nation and stresses my special obligations towards it. Yet, if we want to survive and flourish, humankind has little choice but to complement such local loyalties with substantial obligations towards a global community.

In previous centuries national identities were forged because human beings faced problems and discovered opportunities that went far beyond the scope of local tribes, and which only countrywide co-operation could hope to handle. In the 21st century, nations find themselves in the same situation as the old tribes.

We need a new global identity, because national institutions are incapable of managing a set of unprecedented global challenges. We now have a global ecology, a global economy and a global science – but we are still stuck with only national politics. This mismatch prevents the political system from countering our main problems effectively.

To have effective politics, we must either de-globalise the ecology, the economy and the march of science, or we must globalise our politics. As it is impossible to de-globalise the ecology and the march of science, and as the cost of de-globalising the economy will probably be ruinous, the only solution is to globalise politics.

Yuval Noah Harari lectures at the Hebrew University of Jerusalem. His latest book is “Homo Deus: a Brief History of Tomorrow” (Vintage)