Now let the real battle begin

We need new ways to decide ethical issues

"I have deep respect for those who do not agree with some of the provisions in the bill because of religious conviction," wrote Gordon Brown the day before Monday's Commons vote on hybrid embryos. "But I believe that we owe it to ourselves and future generations to introduce these measures and, in particular, to give our unequivocal backing, within the right framework of rules and standards, to stem-cell research."

In those two sentences Brown managed to capture all that is wrong in how we approach public debates about bioethics.

Brown's words are typical of the way religious views are "respected", only so that they can then be ignored. The debate then proceeds on fatalistic, utilitarian premises: there is no other option; we have to do this for the greater good. Both these moves are not so much steps forward as sideways, avoiding the tough issues and disagreements.

Take the engagement with religion first. There is a curious implicit pact here, whereby atheists, agnostics and believers alike all accept that faith stands somehow to one side of rationality. The devout gain respect and immunity from rational prosecution, at the price of being excluded from intellectual debate. Non-believers get to keep the civic sphere entirely secular, at the price of having to back off from believers. At one level, this is right. Much religious belief is a matter of faith, as impervious to rational scrutiny as the Vatican is to women. However, when it comes to specific matters of morality, the idea that religious convictions need respect, not interrogation and defence, is absurd. The world's major religious texts have nothing to say about stem cells, not least because those words do not appear in any of them. It may be a matter of faith that Christ rose from the dead, but Christians have to defend anything they say about the first stages of life.

For example, in his Easter Sunday sermon, Cardinal Keith O'Brien quoted from a letter he and several other church leaders had signed: "This bill goes against what most people, Christian or not, reckon is common sense. The idea of mixing human and animal genes is not just evil. It's crazy!" It is not good enough, on reading this, simply to nod sympathetically and say, "I respect your view." For one thing, the respect is not reciprocated: scientists and supporters of the bill are being accused of doing great evil. What we should do is demand that the central claims be substantiated, which, in this case, they are not. As a matter of fact, opinion polls repeatedly show that most members of the public do approve of embryo research, interspecies or otherwise. More importantly, if anyone other than a church leader accused something of being evil and crazy, we would want to see some reasons why we should agree. Instead, we smile, and move on.

Once religion is set aside, the debate then tends to proceed in a crassly simplistic way. Most of the time, the argument is no more than the claim that the benefits of the research will be enormous, and therefore we must do it. But this is far too quick. Using the terminally ill for experiments might teach us things future generations will benefit from, but that doesn't mean we should do it.

Yet it suits people to stop the debate here, because the real issue is much more complex: What is the moral status of embryos? Bishops simply assert they are as precious as full-grown human beings, scientists avoid answering the question altogether, and between the two camps, the fundamental issue is passed over in silence.

This fudge suits the religious lobby more, for it leaves unchallenged the view that cells from which human beings grow are precious. A similar silence has occluded the morality of abortion for decades. But if we thought 14-day-old embryos and aborted foetuses were as fully human as we are, then no appeal to the balance of costs and benefits could justify their routine killing. People talk as though foetal life has an important moral status, but act as though it does not.

Artificial divide

The contradiction can be resolved in one of only two ways: either we agree the bishops were right all along, or we face the facts squarely and stop the pretence that anything growing in the womb is as important, and as human, as a tiny baby. The latter need not lead us down a slippery slope where human life in general is granted less respect. Nor would it entail treating stem cells with no respect: it is good for us to practise reverence for life even if, on reflection, we do not always think it is worth preserving.

But how can we debate these deeply divisive issues, when people's fundamental convictions are so different? What is needed is a way to bring religious perspectives into public discourse without diluting the essentially secular nature of the public square. This might sound impossible, as it is too often assumed that a secular politics requires people to leave their religious beliefs behind them. But that is a mistake. Democratic politics in a pluralist age requires, not that people set aside their fundamental commitments, but that they discuss their differences in a common language. The absence of God will inform someone's opinions on morality, but one cannot expect arguments in public debate to carry any weight if they start with an assertion of atheism. Catholicism may inform someone's beliefs on birth control, for instance, but we cannot be expected to agree with them on the basis of what the Pope says.

What both sides must do is to make their case in terms the other can assess and understand. Arguments for stem-cell research need to appeal to facts about the actual, not imagined, nature of early embryos, as well as serious thought about the potential social consequences of entirely new ways of doing science. Arguments can also draw on religious insights, just as long as they do not assume any particular theological framework. One can talk about the need for humility, deep respect for human life and the dangers of hubris without invoking St John's Gospel.

The justifiable desire to keep religious dogma out of public life has led to an unjustifiable tendency to treat religious views as a whole as separable from civic life. It is in the interests of everyone, believer or not, to end this artificial divide and start a real intellectual tussle in which secular and sacred views battle it out, rationally and in the open.

Julian Baggini is editor of The Philosophers' Magazine

This article first appeared in the 26 May 2008 issue of the New Statesman, Moral crisis?


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated, by J S Mill and others, as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have neither fixed workplaces, regular colleagues nor regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.


My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood