“Towards the light, towards knowledge!” A 1960s Soviet propaganda poster advocates science over religion. (Bridgeman Art Library)

The ghost at the atheist feast: was Nietzsche right about religion?

John Gray reviews “The Age of Nothing” by Peter Watson and “Culture and the Death of God” by Terry Eagleton.

The Age of Nothing: How We Sought to Live Since the Death of God
Peter Watson
Weidenfeld & Nicolson, 624pp, £30

Culture and the Death of God
Terry Eagleton
Yale University Press, 264pp, £18.99

There can be little doubt that Nietzsche is the most important figure in modern atheism, but you would never know it from reading the current crop of unbelievers, who rarely cite his arguments or even mention him. Today’s atheists cultivate a broad ignorance of the history of the ideas they fervently preach, and there are many reasons why they might prefer that the 19th-century German thinker be consigned to the memory hole. With few exceptions, contemporary atheists are earnest and militant liberals. Awkwardly, Nietzsche pointed out that liberal values derive from Jewish and Christian monotheism, and rejected these values for that very reason. There is no basis – whether in logic or history – for the prevailing notion that atheism and liberalism go together. Illustrating this fact, Nietzsche can only be an embarrassment for atheists today. Worse, they can’t help dimly suspecting they embody precisely the kind of pious freethinker that Nietzsche despised and mocked: loud in their mawkish reverence for humanity, and stridently censorious of any criticism of liberal hopes.

Against this background, it is refreshing that Peter Watson and Terry Eagleton take Nietzsche as the central reference point for their inquiries into the retreat of theism. For Watson, an accomplished intellectual historian, Nietzsche diagnosed the “nihilist predicament” in which the high-bourgeois civilisation that preceded the Great War unwittingly found itself.

First published in 1882, Nietzsche’s dictum “God is dead” described a situation in which science (notably Darwinism) had revealed “a world with no inherent order or meaning”. With theism no longer credible, meaning would have to be made in future by human beings – but what kind of meaning, and by which human beings? In a vividly engaging conspectus of the formative ideas of the past century, The Age of Nothing shows how Nietzsche’s diagnosis evoked responses in many areas of cultural life, including some surprising parts of the political spectrum.

While it is widely known that Nietzsche’s ideas were used as a rationale for imperialism, and later fascism and Nazism, Watson recounts how Nietzsche had a great impact on Bolshevik thinking, too. The first Soviet director of education, Anatoly Lunacharsky (who was also in charge of state censorship of the arts and bore the delicious title of Commissar of Enlightenment), saw himself as promoting a communist version of the Superman. “In labour, in technology,” he wrote, in a passage cited by Watson, “[the new man] found himself to be a god and dictated his will to the world.”

Trotsky thought much the same, opining that socialism would create “a higher social-biologic type”. Lenin always resisted the importation of Nietzsche’s ideas into Bolshevism. But the Soviet leader kept a copy of Nietzsche’s Birth of Tragedy in his personal library and one of Zarathustra in his Kremlin office, and there is more than a hint of the cult of the will in Lenin’s decree ordering the building of “God-defying towers” throughout the new Soviet state.

It seems that few if any of these towers were constructed, the Soviet authorities devoting their energy instead to incessant anti-religion campaigns. A League of Militant Atheists was set up to spread the message that “religion was scientifically falsifiable”. Religious buildings were seized, looted and given over to other uses, or else razed. Hundreds of thousands of believers perished, but the new humanity that they and their admirers in western countries confidently anticipated has remained elusive. A Soviet census in 1937 showed that “religious belief and activity were still quite pervasive”. Indeed, just a few weeks ago, Vladimir Putin – scion of the KGB, the quintessential Soviet institution that is a product of over 70 years of “scientific atheism” – led the celebrations of Orthodox Christmas.

In many parts of the world at present, there is no sign of religion dying away: quite the reverse. Yet Watson is not mistaken in thinking that throughout much of the 20th century “the death of God” was a cultural fact, and he astutely follows up the various ways in which the Nietzschean imperative – the need to construct a system of values that does not rely on any form of transcendental belief – shaped thinking in many fields. A purely secular ethic had been attempted before (the utilitarian philosophies of Jeremy Bentham and John Stuart Mill are obvious examples) but Nietzsche made the task incomparably more difficult by identifying the theistic concepts and values on which these and other secular moralities relied. Ranging widely, Watson tracks the pursuit of a convincing response to Nietzsche in philosophers as various as Henri Bergson, William James and G E Moore, painters such as Matisse and Kandinsky, futurist composers and modernist poets (notably Mallarmé and Wallace Stevens), movements such as the Beats and the Sixties counterculture and a host of psychotherapeutic cults.

If Watson shows how Nietzsche’s challenge resonated throughout pretty well every area of cultural life, for Eagleton this focus on culture is a distraction, if not a crass mistake. Discussing Edmund Burke and T S Eliot, both of whom viewed religion largely in cultural terms even though they were believers, he asks rhetorically: “Might culture succeed in becoming the sacred discourse of a post-religious age, binding people and intelligentsia in spiritual union? Could it bring the most occult of truths to bear on everyday conduct, in the manner of religious faith?” Historically, the idea that religion is separate from culture is highly anomalous – a peculiarly Christian notion, with no counterpart in pre-Christian antiquity or non-western beliefs. But Eagleton isn’t much interested in other religions, and for him it is clear that the answer to his question must be “No”.

It’s not simply that culture lacks the emotional power of religion: “No symbolic form in history has matched religion’s ability to link the most exalted of truths to the daily existence of countless men and women.” More to the point, religion – particularly Christianity – embodies a sharp critique of culture. A standing protest against the repression that accompanies any social order, the Christian message brings “the grossly inconvenient news that our forms of life must undergo radical dissolution if they are to be reborn as just and compassionate communities”. In making this demand, Eagleton concludes, “Christianity is arguably a more tragic creed than Nietzsche’s own doctrine, precisely because it regards suffering as unacceptable.”

It’s an interesting suggestion, but neither the Christian religion nor Nietzsche’s philosophy can be said to express a tragic sense of life. If Yeshua (the Jewish prophet later known as Jesus) had died on the cross and stayed dead, that would have been a tragedy. In the Christian story, however, he was resurrected and came back into the world. Possibly this is why Dante’s great poem wasn’t called The Divine Tragedy. In the sense in which it was understood by the ancients, tragedy implies necessity and unalterable finality. According to Christianity, on the other hand, there is nothing that cannot be redeemed by divine grace and even death can be annulled.

Nor was Nietzsche, at bottom, a tragic thinker. His early work contained a profound interrogation of liberal rationalism, a modern view of things that contains no tragedies, only unfortunate mistakes and inspirational learning experiences. Against this banal creed, Nietzsche wanted to revive the tragic world-view of the ancient Greeks. But that world-view makes sense only if much that is important in life is fated. As understood in Greek religion and drama, tragedy requires a conflict of values that cannot be revoked by any act of will; in the mythology that Nietzsche concocted in his later writings, however, the godlike Superman, creating and destroying values as he pleases, can dissolve and nullify any tragic conflict.

As Eagleton puts it, “The autonomous, self-determining Superman is yet another piece of counterfeit theology.” Aiming to save the sense of tragedy, Nietzsche ended up producing another anti-tragic faith: a hyperbolic version of humanism.

The anti-tragic character of Christianity poses something of a problem for Eagleton. As he understands it, the Christian message calls for the radical dissolution of established forms of life – a revolutionary demand, but also a tragic one, as the kingdom of God and that of man will always be at odds. The trouble is that the historical Jesus seems not to have believed anything like this. His disdain for order in society rested on his conviction that the world was about to come to an end, not metaphorically, as Augustine would later suggest, but literally. In contrast, revolutionaries must act in the basic belief that history will continue, and when they manage to seize power they display an intense interest in maintaining order. Those who make revolutions have little interest in being figures in a tragic spectacle. Perhaps Eagleton should read a little more Lenin.

Although he fails to come up with anything resembling serious politics, Eagleton produces an account of the continuing power of religion that is rich and compelling. Open this book at random, and you will find on a single page more thought-stirring argument than can be gleaned from a dozen ponderous treatises on philosophy or sociology. Most of the critical turning points in modern thought are examined illuminatingly. Eagleton’s discussion of the religious dimensions of Romanticism is instructive, and his crisp deconstruction of postmodernism is a pleasure to read. He is exceptionally astute in his analysis of “the limits of Enlightenment” – nowadays a heavily mythologised movement, the popular conception of which bears almost no relation to the messy and often unpleasantly illiberal reality.

Evangelical rationalists would do well to study this book, but somehow I doubt that many of them will.

Was Nietzsche right in thinking that God is dead? Is it truly the case that – as the German sociologist Max Weber, who was strongly influenced by Nietzsche, believed – the modern world has lost the capacity for myth and mystery as a result of the rise of capitalism and secularisation? Or is it only the forms of enchantment that have changed? Importantly, it wasn’t only the Christian God that Nietzsche was talking about. He meant any kind of transcendence, in whatever form it might appear. In this sense, Nietzsche was simply wrong. The era of “the death of God” was a search for transcendence outside religion. Myths of world revolution and salvation through science continued the meaning-giving role of transcendental religion, as did Nietzsche’s own myth of the Superman.

Reared on a Christian hope of redemption (he was, after all, the son of a Lutheran minister), Nietzsche was unable, finally, to accept a tragic sense of life of the kind he tried to retrieve in his early work. Yet his critique of liberal rationalism remains as forceful as ever. As he argued with masterful irony, the belief that the world can be made fully intelligible is an article of faith: a metaphysical wager, rather than a premise of rational inquiry. It is a thought our pious unbelievers are unwilling to allow. The pivotal modern critic of religion, Friedrich Nietzsche will continue to be the ghost at the atheist feast.

John Gray is the New Statesman’s lead book reviewer. His latest book is “The Silence of Animals: On Progress and Other Modern Myths” (Allen Lane, £18.99)


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit: monbiot.com/music/

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood