Since the dawn of time

Two hundred years after Darwin's birth, scientists still can't agree on whether evolution and religion can coexist.

It has been the year of evolution. To coincide with the anniversaries of both Darwin's birth and the publication of On the Origin of Species, Richard Dawkins published The Greatest Show on Earth: the Evidence for Evolution. And Jerry Coyne (an eminent evolutionary biologist at the University of Chicago) wrote Why Evolution Is True. Yet, amid the ongoing celebrations, a new storm has erupted. This is not the usual battle between creationist fundamentalists and evolutionists. The latest ruckus has broken out among scientists and philosophers who accept evolutionary theory as the explanation for the emergence of life's diversity.

Where they differ is on the public communication of science and evolution. Dawkins in particular is being rebuked for doing more harm than good to the public face of science. The basic claim - spelled out by the journalist Chris Mooney and the biologist Sheril Kirshenbaum in their book Unscientific America, published in June - is that Dawkins presents an unnecessarily divisive choice: you can accept evolution and a scientific world-view more broadly, and therefore reject religion, or cling to religion and sacrifice scientific understanding.

This strategy, critics argue, alienates moderate religious people who might otherwise be receptive to scientific theory. Faced with a mutually exclusive choice between their private faith and the objective world-view of science, moderates will turn away from the latter. Science loses out.

It's not just Dawkins. Coyne and all the "new atheists" (including the Darwinian philosopher Daniel Dennett, the neuroscientist Sam Harris and the cultural commentator Christopher Hitchens) are charged with alienating people from science. Lining up against them is a group of "accommodationists", including Mooney, an atheist, and Kirshenbaum, an agnostic, who believe that evolution and religion can live happily side by side - at least under an entente cordiale, if not in a mutually supportive relationship.

Dawkins calls accommodationism "the Neville Chamberlain school" of evolution, and its proponents the appeasement lobby. Yet it is the official line of the American Association for the Advancement of Science, the US National Academy of Sciences and the National Center for Science Education, which is dedicated to promoting the teaching of evolution in American school curriculums.

Appeasement lobby

The accommodationist critique has at least two strands. One is the increasingly common criticism that the new atheists are excessively mean to people of faith, "militant" in tone, and ironically fundamentalist in their non-belief. The accommodationist philosopher Barbara Forrest chastises the new atheists for combining rudeness with arrogance and closed-mindedness. (Like Mooney and Kirshenbaum, Forrest is no friend of creationism; she was a critical witness at a 2005 trial in Dover, Pennsylvania, in which parents successfully sued to block the teaching of "intelligent design" in state-school curriculums.)

Forrest argues that new atheists should respect the personal nature of faith, and nurture a sense of humility by recognising that scientific evidence does not rule out the existence of the divine. They should accept that there is a wide range of views, she says, and stop insisting that everyone follow the "one true way" of atheism. Failing to do so only turns people off in droves.

Yet it seems unlikely that the new atheists have been this damaging. They have been an identifiable group and social force for only five years - starting with Harris's The End of Faith in 2004, followed by Dawkins's The God Delusion in 2006. More significantly, polls indicate that the proportion of the US public that subscribes to a creationist account of human origins has remained relatively constant for the past 25 years, hovering around 45 per cent. The earlier era of greater deference towards religion does not seem to have won over hearts or minds. So who is to say that taking the opposite approach will drive anyone away?

The second thread of the accommodationist argument is that science, in fact, need not be inimical to religious faith. Eminent scientists from Galileo to Newton had little trouble reconciling their personal faith with a scientific world-view. Perhaps the most prominent contemporary example is the geneticist Francis Collins, who ran the American arm of the Human Genome Project and was recently appointed head of the National Institutes of Health (NIH), the biggest funder of biomedical research in the US. Collins is also an evangelical Christian who speaks publicly about his faith and its relation to science. Exemplars of this sort show that a single human mind can hold two divergent world-views simultaneously, or at least accept the legitimacy of two very different ways of gaining knowledge about the world.

An interventionist God

But there is another side to this story. Steven Pinker, a Harvard psychologist and an atheist, has voiced grave misgivings over Collins's appointment - not just because of his religious beliefs, but because of his "public advocacy" that "atheistic materialism" must be resisted. Collins believes in an interventionist God who, in his own words, "gifted humanity with the knowledge of good and evil (the moral law), with free will, and with an immortal soul".

Although, in principle, religious beliefs need not affect one's day-to-day science, in practice, they might. Take research on the foundations of human sociality and ethics, currently one of the hottest areas in behavioural science. Researchers are probing these questions with evolutionary theory, comparative primate studies and neurobiology, among other approaches, but no one invokes non-natural or non-material explanations. Are these instances of atheistic materialism to be resisted?

How would Collins's views affect the priority he might give to funding such research, if his prime belief is that ethics and the moral law are God-given? It is perfectly possible that he would accept the materialistic explanation of morality, and just add that everything was set up by God in such a way that naturalistic processes were bound to produce a big-brained moral species. Time will tell if, and how, NIH funding changes under his leadership. It would be unfair to prejudge the case.

In the meantime, there is little reason to suppose that the world will reach any meaningful consensus on the question of how best to engage the public with science in general, and evolutionary theory in particular. Perhaps, in true Darwinian fashion, those arguments and ideas best adapted to the modern world will prevail. In an era of resurgent religion, it is far from clear which approach this will be.

“Unscientific America" by Chris Mooney and Sheril Kirshenbaum is published by Basic Books (£15.99)

Dan Jones's writing on science has appeared in Nature and New Scientist magazines

This article first appeared in the 09 November 2009 issue of the New Statesman, Castro

The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood