Mourners carry the body of a father killed by a drone strike in Gaza. Photograph: Getty Images

Drone attacks go against every human rights principle in the book

There is a sense that international law has failed.

It has for centuries been lawful to kill enemy commanders, on the principle that “a man who is dead renews no war”, a thought that comforted Cromwell as he viewed the body of Charles I. The outcry in the 1970s over comical CIA plots to murder Fidel Castro by sending him exploding cigars and poison pens led Congress to ban political assassinations under Executive Order 12333: “No person employed by or acting on behalf of the United States government shall engage in, or conspire to engage in, assassination.” This comports with the Fifth Amendment to the US constitution, which protects “any person” (not just US citizens) from being “deprived of life . . . without due process of law”.

Until 9/11, the legal position was clear: in war, active combatants could kill and be killed, subject to rules governing surrender, use of banned weapons, etc. But “war law” applied only to conflicts between armed forces of opposing states, invoking the right of self-defence. Confrontations with insurgents, rioters and terrorists were governed by human rights law, which requires state use of force against serious criminals to be reasonable in the circumstances. This is more restrictive – after three IRA bombers were shot dead on Gibraltar in 1988, the European Court held that the UK had denied them the right to life because MI5 had jumped to mistaken conclusions. In the case of known members of terrorist organisations, the “reasonable force” requirement exercises a necessary and humane restraint over the trigger-happiness of “special forces” and drone targeters. This is why the US, Russia and Israel pretend they are bound only by the law of war, which allows suspects to be killed without much compunction.

The states that deploy drones argue that they are operating under war law, where human rights are less relevant. As Harold Koh, legal adviser to the US state department, puts it: “The US is in an armed conflict with al-Qaeda . . . and may use force consistent with its inherent right to self-defence . . . including by targeting persons such as high-level al-Qaeda leaders who are planning to attack us.” This bald statement prompts many questions. How can you have “an armed conflict” without an enemy state? What criteria are used for putting names on the secret death list: is it enough to be sympathetic to terrorism, married to a terrorist, or anti-American? To provide shelter or give funds to terrorist groups? What is the required degree of proof? There are no accountability mechanisms – no inquests, sometimes not even a casualty list (although the US usually announces and celebrates when it hits a “high-value target”).

In drone warfare, there is no fairness or due process to enable the potential victim, his relatives or any outside body to challenge the accuracy of the information on which the targeting decision has been made. The Senate foreign relations committee reported in 2009 that the Pentagon’s approved list of “prioritised targets” contained 367 names and had been expanded to include 50 Afghan drug lords suspected of donating money to the Taliban. Suppose the suspicion was unreasonable, or the donation had been at gunpoint, or of a negligible amount? What the Pentagon is doing is secretly sentencing people to death for an unproven crime.

The Israeli Supreme Court is the only tribunal to have confronted the legality of targeted killing, at a time (2008) when 234 victims had been members of Hamas and a further 153 had been civilians who got in the way. The court contented itself with comments about limiting the targets to dangerous terrorists and issued Polonius-like precautionary precepts: “well-based information is needed”; “innocent civilians are not to be harmed”; “careful verification is needed before an attack is made”. In reality, innocent civilians very often are killed, and “verification” always seems careful to the minds of the targeters.

Israeli officials seem morally content to risk civilian lives: after a one-tonne bomb was dropped on Gaza City in 2002, killing many civilians in order to assassinate the Hamas military leader Salah Shehadeh, an inquiry merely noted “shortcomings” in evaluation of information. This was a case of manslaughter by gross negligence. The CIA’s anxiety to kill the al-Qaeda leader Ayman al-Zawahiri led to a drone attack in 2006 on a village in Pakistan where he was mistakenly thought to be hiding, and 18 civilians were killed. There was no explanation, no accountability and no compensation for what the CIA calls a “decapitation strike”.

Koh says that drone strikes are an exercise in self-defence under Article 51 of the UN Charter. But Article 51 applies only to attacks (or imminent attacks) by other states, not by terrorist groups. Nobody has yet noticed the irony of squeezing terrorism into this war-law paradigm. Because the Geneva Conventions and customary rights must apply to terrorist and law enforcer alike, if it is lawful to kill Osama Bin Laden, al-Zawahiri and Hamas commanders, then it must be lawful for them to kill their opposite numbers – Barack Obama and Binyamin Netanyahu, generals, allies. (Even the Queen, as head of a co-belligerent state, may qualify.) Those who take the lives of innocent civilians in order to spread terror deserve to be treated like dangerous criminals and shot down when necessity requires, not dignified in law as if they were warriors matched in combat with great states.

What is the position under human rights law? It would obviously be a breach of the right to life if terrorist sympathisers were targeted to deter others, or killed in circumstances where it was possible to arrest them. It would be reasonable to kill terrorists on missions to blow up civilians, or engaged in conspiracies to kill them. But the record of drone attacks demonstrates that often individuals are targeted when they constitute no clear or present danger.

Drone killings in tribal areas of Pakistan and in Yemen have taken the lives of targets who are armed and in conspiratorial meetings, but others have merely been attending weddings or funerals or emerging from hospitals or mosques. In Pakistan, there have been cases where pro-government leaders, their families and even army soldiers have been killed by mistake in drone attacks that have severely damaged US relations with a politically tense, nuclear-armed nation that is not at war with the US.

There was little protest in the US until last year, when a drone strike in Yemen targeted a US citizen, Anwar al-Awlaki, rumoured to be al-Qaeda’s leader in that area. The rockets were fired at his pick-up truck, in which he might have been picked up rather than bombed. Obama’s lawyers said that the Fifth Amendment could not avail a US citizen who joined an enemy force. This is correct as far as it goes, but the Fifth Amendment must entitle a citizen or his family to know whether he is on a death list and to apply to have himself taken off it. When al-Awlaki’s father sought judicial review, the judge told him he did not have standing. If a father does not have standing to challenge a targeted killing, who does?

The Obama administration seems to have given the CIA carte blanche to choose targets, subject to the approval of Koh, a law professor, now an executioner. Those who press the Hellfire buttons in Nevada do not pause to consider whether their targets are engaged in combatant missions or not. But there is no point speculating about the criteria for listing or executing: these are secret CIA prerogatives, beyond the jurisdiction of the courts or the provisions of the Freedom of Information Act.

The battlefield utility of drone technology is such that it will be used widely in future conflicts, and by states much less scrupulous than the US and Israel (Syria and Iran, for example). Drones will become more compact, and more difficult to detect or shoot down – already there are plans for bird- and even insect-sized drones, capable of crawling inside homes or squatting on window ledges to listen and send “kill” messages to their bigger brethren without any “pilot” in Nevada pressing a button.

There is an urgent need for the US to make its drone operations more principled, first, by moving responsibility from the CIA to the department of defence, which is more accountable and bound by the Geneva Conventions. Second, there must be transparency in respect of both the target list and criteria for listing, and an opportunity for those listed to surrender or seek judicial review of whether the evidence against them proves they are an active combatant. Third, rules of engagement must exclude any killing if civilians are likely to be present, and finally, rules must prevent killing of a target who can be captured or arrested.

There is a sense that international law has failed: the UN Charter, the conventions and the norms of the courts have not provided satisfactory guidance for waging asymmetric warfare. Hence the silence of states and the recent earnest request, by the UN’s human rights commissioner, for urgent clarification of the law. The way forward may be to find a way back, to reasonable force and proportionality. At present, many drone killings can only be described as summary executions – the punishment of the Red Queen (“sentence first, trial later”), which denies the right to life, the presumption of innocence and the right to a fair trial.

Geoffrey Robertson QC’s full legal analysis of drone warfare is in the fourth edition of his “Crimes Against Humanity” (Penguin, September 2012). Also in the New Statesman's Drones issue: Chris Woods on the legality of drones, Jemima Khan's interview with former Pakistani president Pervez Musharraf and Michael Brooks on the science that makes drones work

This article first appeared in the 18 June 2012 issue of the New Statesman, Drones: video game warfare


The age of loneliness

Profound changes in technology, work and community are transforming our ultrasocial species into a population of loners.

Our dominant ideology is based on a lie. A series of lies, in fact, but I’ll focus on just one. This is the claim that we are, above all else, self-interested – that we seek to enhance our own wealth and power with little regard for the impact on others.

Some economists use a term to describe this presumed state of being – Homo economicus, or self-maximising man. The concept was formulated by J S Mill and others as a thought experiment. Soon it became a modelling tool. Then it became an ideal. Then it evolved into a description of who we really are.

It could not be further from the truth. To study human behaviour is to become aware of how weird we are. Many species will go to great lengths to help and protect their close kin. One or two will show occasional altruism towards unrelated members of their kind. But no species possesses a capacity for general altruism that is anywhere close to our own.

With the possible exception of naked mole-rats, we have the most social minds of all mammals. These minds evolved as an essential means of survival. Slow, weak, armed with rounded teeth and flimsy nails in a world of fangs and claws and horns and tusks, we survived through co-operation, reciprocity and mutual defence, all of which developed to a remarkable degree.

A review paper in the journal Frontiers in Psychology observes that Homo economicus might be a reasonable description of chimpanzees. “Outsiders . . . would not expect to receive offers of food or solicitude; rather, they would be fiercely attacked . . . food is shared only under harassment; even mothers will not voluntarily offer novel foods to their own infants unless the infants beg for them.” But it is an unreasonable description of human beings.

How many of your friends, colleagues and neighbours behave like chimpanzees? A few, perhaps. If so, are they respected or reviled? Some people do appear to act as if they have no interests but their own – Philip Green and Mike Ashley strike me as possible examples – but their behaviour attracts general revulsion. The news is filled with spectacular instances of human viciousness: although psychopaths are rare, their deeds fill the papers. Daily acts of kindness are seldom reported, because they are everywhere.

Every day, I see people helping others with luggage, offering to cede their place in a queue, giving money to the homeless, setting aside time for others, volunteering for causes that offer no material reward. Alongside these quotidian instances are extreme and stunning cases. I think of my Dutch mother-in-law, whose family took in a six-year-old Jewish boy – a stranger – and hid him in their house for two years during the German occupation of the Netherlands. Had he been discovered, they would all have been sent to a concentration camp.

Studies suggest that altruistic tendencies are innate: from the age of 14 months, children try to help each other, attempting to hand over objects another child can’t reach. At the age of two, they start to share valued possessions. By the time they are three, they begin to protest against other people’s violation of moral norms.

Perhaps because we are told by the media, think tanks and politicians that competition and self-interest are the defining norms of human life, we disastrously mischaracterise the way in which other people behave. A survey commissioned by the Common Cause Foundation reported that 78 per cent of respondents believe others to be more selfish than they really are.

I do not wish to suggest that this mythology of selfishness is the sole or even principal cause of the epidemic of loneliness now sweeping the world. But it is likely to contribute to the plague by breeding suspicion and a sense of threat. It also appears to provide a doctrine of justification for those afflicted by isolation, a doctrine that sees individualism as a higher state of existence than community. Perhaps it is hardly surprising that Britain, the European nation in which neoliberalism is most advanced, is, according to government figures, the loneliness capital of Europe.

There are several possible reasons for the atomisation now suffered by the supremely social mammal. Work, which used to bring us together, now disperses us: many people have no fixed workplace, no regular colleagues and no regular hours. Our leisure time has undergone a similar transformation: cinema replaced by television, sport by computer games, time with friends by time on Facebook.

Social media seems to cut both ways: it brings us together and sets us apart. It helps us to stay in touch, but also cultivates a tendency that surely enhances other people’s sense of isolation: a determination to persuade your followers that you’re having a great time. FOMO – fear of missing out – seems, at least in my mind, to be closely associated with loneliness.

Children’s lives in particular have been transformed: since the 1970s, their unaccompanied home range (in other words, the area they roam without adult supervision) has declined in Britain by almost 90 per cent. Not only does this remove them from contact with the natural world, but it limits their contact with other children. When kids played out on the street or in the woods, they quickly formed their own tribes, learning the social skills that would see them through life.

An ageing population, family and community breakdown, the decline of institutions such as churches and trade unions, the switch from public transport to private, inequality, an alienating ethic of consumerism, the loss of common purpose: all these are likely to contribute to one of the most dangerous epidemics of our time.

Yes, I do mean dangerous. The stress response triggered by loneliness raises blood pressure and impairs the immune system. Loneliness enhances the risk of depression, paranoia, addiction, cognitive decline, dementia, heart disease, stroke, viral infection, accidents and suicide. It is as potent a cause of early death as smoking 15 cigarettes a day, and can be twice as deadly as obesity.

Perhaps because we are in thrall to the ideology that helps to cause the problem, we turn to the market to try to solve it. Over the past few weeks, the discovery of a new American profession, the people-walker (taking human beings for walks), has caused a small sensation in the media. In Japan there is a fully fledged market for friendship: you can hire friends by the hour with whom to chat and eat and watch TV; or, more disturbingly, to pose for pictures that you can post on social media. They are rented as mourners at funerals and guests at weddings. A recent article describes how a fake friend was used to replace a sister with whom the bride had fallen out. What would the bride’s mother make of it? No problem: she had been rented, too. In September we learned that similar customs have been followed in Britain for some time: an early foray into business for the Home Secretary, Amber Rudd, involved offering to lease her posh friends to underpopulated weddings.

My own experience fits the current pattern: the high incidence of loneliness suffered by people between the ages of 18 and 34. I have sometimes been lonely before and after that period, but it was during those years that I was most afflicted. The worst episode struck when I returned to Britain after six years working in West Papua, Brazil and East Africa. In those parts I sometimes felt like a ghost, drifting through societies to which I did not belong. I was often socially isolated, but I seldom felt lonely, perhaps because the issues I was investigating were so absorbing and the work so frightening that I was swept along by adrenalin and a sense of purpose.

When I came home, however, I fell into a mineshaft. My university friends, with their proper jobs, expensive mortgages and settled, prematurely aged lives, had become incomprehensible to me, and the life I had been leading seemed incomprehensible to everyone. Though feeling like a ghost abroad was in some ways liberating – a psychic decluttering that permitted an intense process of discovery – feeling like a ghost at home was terrifying. I existed, people acknowledged me, greeted me cordially, but I just could not connect. Wherever I went, I heard my own voice bouncing back at me.

Eventually I made new friends. But I still feel scarred by that time, and fearful that such desolation may recur, particularly in old age. These days, my loneliest moments come immediately after I’ve given a talk, when I’m surrounded by people congratulating me or asking questions. I often experience a falling sensation: their voices seem to recede above my head. I think it arises from the nature of the contact: because I can’t speak to anyone for more than a few seconds, it feels like social media brought to life.

The word “sullen” evolved from the Old French solain, which means “lonely”. Loneliness is associated with an enhanced perception of social threat, so one of its paradoxical consequences is a tendency to shut yourself off from strangers. When I was lonely, I felt like lashing out at the society from which I perceived myself excluded, as if the problem lay with other people. To read any comment thread is, I feel, to witness this tendency: you find people who are plainly making efforts to connect, but who do so by insulting and abusing, alienating the rest of the thread with their evident misanthropy. Perhaps some people really are rugged individualists. But others – especially online – appear to use that persona as a rationale for involuntary isolation.

Whatever the reasons might be, it is as if a spell had been cast on us, transforming this ultrasocial species into a population of loners. Like a parasite enhancing the conditions for its own survival, loneliness impedes its own cure by breeding shame and shyness. The work of groups such as Age UK, Mind, Positive Ageing and the Campaign to End Loneliness is life-saving.

When I first wrote about this subject, and the article went viral, several publishers urged me to write a book on the theme. Three years sitting at my desk, studying isolation: what’s the second prize? But I found another way of working on the issue, a way that engages me with others, rather than removing me from them. With the brilliant musician Ewan McLennan, I have written a concept album (I wrote the first draft of the lyrics; he refined them and wrote the music). Our aim is to use it to help break the spell, with performances of both music and the spoken word designed to bring people together – which, we hope, will end with a party at the nearest pub.

By itself, our work can make only a tiny contribution to addressing the epidemic. But I hope that, both by helping people to acknowledge it and by using the power of music to create common sentiment, we can at least begin to identify the barriers that separate us from others, and to remember that we are not the selfish, ruthless beings we are told we are.

“Breaking the Spell of Loneliness” by Ewan McLennan and George Monbiot is out now. For a full list of forthcoming gigs visit:

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood