
Leader: The Iraq war, the Arab spring and the limits of intervention

In the ten years since the Iraq war, the Arab spring has shown that regime change need not be synonymous with western military intervention.

It was the belief of those leaders who participated in the Iraq war that the intervention would serve as a model of noble and far-sighted foreign policy. The establishment of a “free Iraq at the heart of the Middle East”, George W Bush predicted in 2003, would be “a watershed event in the global democratic revolution”. A decade later, Iraq has become a model, but one of failure. The state that the Bush administration promised would become “a beacon of freedom, democracy and peace” today ranks as the fifth most at risk of terrorism and as the eighth most corrupt. So inured have we become to the near-daily acts of sectarian violence that the deaths of 150 civilians this month alone have passed largely without comment.

The arguments advanced for the war are now so discredited that it is easy to forget how widely accepted they were at the time. The invasion had the support of most Labour MPs and, with the notable exception of old-style Tory sceptics such as Kenneth Clarke and Malcolm Rifkind, almost the entire Conservative Parliamentary Party. In the US, 58 per cent of Democratic senators, including Joe Biden, Hillary Clinton and John Kerry, voted to approve the use of force.

In a leader published on 17 February 2003, we argued: “The case made for war must be like the case made for guilt in a court of law: it must be beyond all reasonable doubt. That case has not been made and it has patently failed before the jury of public opinion. Tony Blair should accept the verdict.” Ten years later, we see no reason to revise this judgement. A war fought to prevent the proliferation of weapons of mass destruction, to reduce the threat of terrorism and to protect civilian life achieved the reverse in each case.

It was the decision to disarm Saddam Hussein of his nonexistent arsenal of chemical and biological weapons that prompted the other two members of “the axis of evil” – Iran and North Korea – to seek nuclear weapons as the ultimate insurance against pre-emptive attack. As the former US secretary of state Madeleine Albright laconically observed, “The message out of Iraq is that if you don’t have nuclear weapons, you get invaded. If you do have nuclear weapons, you don’t get invaded.” There was, contrary to US propaganda, no evidence of any relationship between Iraq and al-Qaeda, at least until the invasion established one. The country became a haven for jihadists and the disastrous occupation the most powerful recruiting sergeant for their cause.

Shortly before the war began, Mr Blair promised MPs that although civilian casualties would undoubtedly occur, Saddam Hussein would be “responsible for many, many more deaths even in one year than we will be in any conflict”. Yet a 2006 study published in the Lancet medical journal, the most authoritative survey of post-invasion mortality in Iraq, estimated that nearly 655,000 more Iraqis died than would have been expected had the invasion not taken place. Though many of the deaths were attributable to al-Qaeda and its surrogates, the study found that 186,000 were due to action by coalition forces.

All of this was predictable and predicted, but we recognise that many of those on the left who supported the invasion did so in the sincere belief that even war was preferable to the survival of the murderous Ba’athist regime. As John Lloyd (who makes a welcome return to the NS this week) reminds us on page 27, Saddam Hussein was no mere “tinpot dictator” but a sadistic megalomaniac who invaded one of his neighbours (Iran), sought to annex another (Kuwait) and pursued terrifying campaigns of genocide against the Kurds and the marsh Arabs. That the west armed and supported him throughout the 1980s arguably only heightened the responsibility to cancel this moral debt by removing him from power.

But the lesson of the Arab spring, an authentic democratic surge of a kind not envisaged by Mr Bush and Mr Blair, is that regime change need not be synonymous with western military intervention. Tyrants such as Egypt’s Hosni Mubarak and Tunisia’s Zine el-Abidine Ben Ali were toppled from within by popular uprisings.

Those who supported the war argued that there was no reasonable alternative to a US-led invasion. “You go to war with the president you’ve got” was the common refrain. Yet while contending that the end justified the means, the pro-war left failed to recognise that the means change the end. It was foreseeable that an administration populated by small-government Republicans with little interest in state-building would prove unable to create a stable and democratic Iraq. As one of the anti-war playwright David Hare’s characters remarks, “When you knew what sort of butcher was the surgeon, there was no doubt about the outcome of the operation.”

Though the return of free elections in Iraq is no small achievement, it is disingenuous of the war’s supporters to call the country a flourishing democracy. It is not. Iraq has been torn apart by sectarianism; its prime minister, Nouri al-Maliki, a former Shia dissident, is not a Jeffersonian liberal but a sectarian autocrat who, as Adnan Hussein, editor of al-Mada newspaper, writes on page 31, has exploited the country’s young constitution to concentrate ever greater power in his hands.

Ten years on, Iraq, like Suez before it and now Afghanistan, has become a byword for western failure, a permanent warning of the perils of wrong-headed intervention. In the US and the UK, a new generation of centre-left leaders has sought to draw a line under the invasion. Barack Obama, who opposed the intervention, spoke of a “decade of war” coming to an end in his second inaugural address, while Ed Miliband used his first, cathartic speech as Labour leader to denounce the Iraq war as “wrong”.

Chastened by recession, the indebted west has neither the will nor the capability to mount interventions on the scale of Iraq. For that, perhaps, we should be grateful.

This article first appeared in the 18 February 2013 issue of the New Statesman, Iraq: ten years on

The Science & Society Picture Library

This Ada Lovelace Day, let’s celebrate women in tech while confronting its sexist culture

In an industry where men hold most of the jobs and write most of the code, celebrating women's contributions on one day a year isn't enough. 

Ada Lovelace wrote the world’s first computer program. In the 1840s, Charles Babbage, now known as the “father of the computer”, designed (though never built) the “Analytical Engine”, a machine which could accurately and reproducibly calculate the answers to maths problems. While translating an article by an Italian mathematician about the machine, Lovelace included a written algorithm which would allow the engine to calculate a sequence of Bernoulli numbers.
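Lovelace’s program walked the Engine through that calculation step by step. Purely as a modern illustration (not a transcription of her actual table of operations in Note G), the same sequence can be generated in a few lines of Python using the standard Bernoulli recurrence:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return exact Bernoulli numbers B_0..B_n (B_1 = -1/2 convention),
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence for B_m given B_0..B_{m-1}.
        B[m] = -Fraction(1, m + 1) * sum(comb(m + 1, k) * B[k] for k in range(m))
    return B

print(bernoulli(8))  # B_0=1, B_1=-1/2, B_2=1/6, B_4=-1/30, ...
```

A trivial script today; on the Analytical Engine, each multiplication and addition had to be laid out by hand as an individual card-driven operation, which is what made Lovelace’s work so remarkable.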

Around 170 years later, Whitney Wolfe, one of the founders of dating app Tinder, was allegedly forced to resign from the company. According to a lawsuit she later filed against the app and its parent company, she had her co-founder title removed because, the male founders argued, it would look “slutty”, and because “Facebook and Snapchat don’t have girl founders. It just makes it look like Tinder was some accident”. (They settled out of court.)

Today, 13 October, is Ada Lovelace Day – an international celebration of inspirational women in science, technology, engineering and mathematics (STEM). It’s lucky we have this day of remembrance, because, as Wolfe’s story demonstrates, we also spend a lot of time forgetting and sidelining women in tech. In the wash of pale male founders of the tech giants that rule the industry, we don’t often think about the women who shaped its foundations: Judith Estrin, one of the designers of TCP/IP, for example, or Radia Perlman, inventor of the spanning-tree protocol. Both inventions sound complicated, and they are – they’re some of the vital building blocks that allow the internet to function.

And yet David Streitfield, a Pulitzer Prize-winning journalist, somehow felt it accurate to write in 2012: “Men invented the internet. And not just any men. Men with pocket protectors. Men who idolised Mr Spock and cried when Steve Jobs died.”

Perhaps we forget about tech’s founding women because the needle has swung so far in the other direction. A huge proportion – perhaps even 90 per cent – of the world’s code is written by men. At Google, women fill 17 per cent of technical roles. At Facebook, 15 per cent. Over 90 per cent of the code repositories on GitHub, an online service used throughout the industry, are owned by men. Yet it’s also hard to believe that this erasure of women’s role in tech is completely accidental. As Elissa Shevinsky writes in the introduction to a collection of essays on gender in tech, Lean Out: “This myth of the nerdy male founder has been perpetuated by men who found this story favourable.”

Does it matter? It’s hard to believe that it doesn’t. Our society is increasingly defined and delineated by code and the things it builds. Small slip-ups, like the lack of a period tracker on the original Apple Watch, or fitness trackers too big for some women’s wrists, gesture to the fact that these technologies are built by male-dominated teams, for a male audience.

In Lean Out, one essay written by a Twitter-based “start-up dinosaur” (don’t ask) explains how dangerous it is to allow one small segment of society to build the future for the rest of us:

If you let someone else build tomorrow, tomorrow will belong to someone else. They will build a better tomorrow for everyone like them… For tomorrow to be for everyone, everyone needs to be the one [sic] that build it.

So where did all the women go? How did we get from a rash of female inventors to a situation where the major female presence at an Apple iPhone launch is a model’s face projected onto a screen and photoshopped into a smile by a male demonstrator? 

Photo: Apple.

The toxic culture of many tech workplaces could be a cause or an effect of the lack of women in the industry, but it certainly can’t make it easy to stay. Behaviours range from the ignorant – Martha Lane-Fox, founder of, was often asked “what happens if you get pregnant?” at investors’ meetings – to the much more sinister. An essay in Lean Out by Katy Levinson details her experiences of sexual harassment while working in tech:

I have had interviewers attempt to solicit sexual favors from me mid-interview and discuss in significant detail precisely what they would like to do. All of these things have happened either in Silicon Valley working in tech, in an educational institution to get me there, or in a technical internship.

Others featured in the book joined in with the low-level sexism and racism of their male colleagues in order to “fit in” and deflect negative attention. Erica Joy writes that while working in IT at the University of Alaska as the only woman (and only black person) on her team, she laughed at colleagues’ “terribly racist and sexist jokes” and “co-opted their negative attitudes”.

The casual culture and allegedly meritocratic hierarchies of tech companies may actually be encouraging this discriminatory atmosphere. HR and the strict reporting procedures of large corporates at least give those suffering from discrimination a place to go. A casual office environment can discourage reporting or calling out prejudiced humour or remarks. Brook Shelley, a woman who transitioned while working in tech, notes: “No one wants to be the office mother”. So instead, you join in and hope for the best.

And, of course, there’s no reason why people working in tech would have fewer issues with discrimination than those in other industries. A childhood spent as a “nerd” can also spawn its own brand of misogyny – Katherine Cross writes in Lean Out that “to many of these men [working in these fields] it is all too easy to subconsciously confound women who say ‘this is sexist’ with the young girls who said… ‘You’re gross and a creep and I’ll never date you’”. During GamerGate, Anita Sarkeesian was often called a “prom queen” by trolls.

When I spoke to Alexa Clay, entrepreneur and co-author of The Misfit Economy, she confirmed that there’s a strange, low-lurking sexism in the start-up economy: “They’re all very open and free, but underneath it there’s still something really patriarchal.” Start-ups, after all, are a culture which celebrates risk-taking, something which women are societally discouraged from doing. As Clay says:

“Men are allowed to fail in tech. You have these young guys who these old guys adopt and mentor. If his app doesn’t work, the mentor just shrugs it off. I would not be able to get away with that, and I think women and minorities aren’t allowed to take the same amount of risks, particularly in these communities. If you fail, no one’s saying that’s fine.”

The conclusion of Lean Out, and of women in tech I have spoken to, isn’t that more women, over time, will enter these industries and seamlessly integrate – it’s that tech culture needs to change, or its lack of diversity will become even more severe. Shevinsky writes:

The reason why we don't have more women in tech is not because of a lack of STEM education. It's because too many high profile and influential individuals and subcultures within the tech industry have ignored or outright mistreated women applicants and employees. To be succinct—the problem isn't women, it's tech culture.

Software engineer Kate Heddleston has a wonderful and chilling metaphor about the way we treat women in STEM. Women are, she writes, the “canary in the coal mine”. If one dies, surely you should take that as a sign that the mine is uninhabitable – that there’s something toxic in the air. “Instead, the industry is looking at the canary, wondering why it can’t breathe, saying ‘Lean in, canary, lean in!’. When one canary dies they get a new one because getting more canaries is how you fix the lack of canaries, right? Except the problem is that there isn’t enough oxygen in the coal mine, not that there are too few canaries.” We need more women in STEM, and, I’d argue, in tech in particular, but we need to make sure the air is breathable first.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.