
Perpetual warfare

As the ruins of the twin towers still smouldered, the west plunged into a series of conflicts it could not win. Can it now confront its diminished place in the world?

The 11 September 2001 attacks were a new kind of warfare. Waged by small, decentralised, highly mobile groups not identified with any state or government, this hypermodern type of conflict aims not to conquer territory or destroy the enemy's military forces, but to weaken the adversary's society internally. For all its medieval trappings, al-Qaeda is deeply modern: its ideology owes more to Lenin than to Islamic theology, while its organisation is that of a decentralised global franchise operation.

The US response was a variant of conventional warfare: a Vietnam-like counter-insurgency directed against the Taliban in Afghanistan - only incidentally connected with al-Qaeda but an equally elusive force - followed by an attack on the state of Iraq, the effect of which was to allow al-Qaeda to build a presence in the country that it had lacked when Saddam Hussein was in power. The new type of war was not understood, and the failure of the US-led riposte was preordained.

Terror is not a nebulous, all-pervading, demonic force. In more clear-thinking times, events that are now routinely described as acts of terrorism were seen as episodes in normal historical conflicts. Politicians and military people spoke of civil wars, insurrections and political assassinations rather than lumping together all forms of political violence into a single terrorist threat. It was also understood that political violence can never finally be eradicated. Today such sobriety is rare. Suicide bombing is interpreted as the expression of a religious culture of martyrdom, when it is a technique that was first developed by the Tamil Tigers, a Marxist-Leninist group.

The 1995 Oklahoma City bombing and the 22 July massacre on Utøya island show that indigenous, far-right ideas can also have deadly effects; the Aum Shinrikyo cult in Japan would have wreaked cataclysmic damage if it had been able to implement its plan to use anthrax against the population. In Britain, far more people have been killed and injured by offshoots of the Irish Republican Army than by Islamist groups. If we are to talk of terrorism, the intimidation and murder by some American fundamentalist Christians of doctors who perform abortions also falls into that category. The threats to peace and security that we face are more specific and more diverse than the global evil posited in the "war on terror".

When it launched the 9/11 attacks, al-Qaeda demonstrated a firm grip on strategic logic. Nothing could be better calculated to throw western governments into panic than an assault on the World Trade Center - a monument to faith in the civilising magic of affluence. Later attacks in Bali, Madrid, London and elsewhere demonstrated the capacity of the network to operate on a global scale. A sober response to 9/11 would have involved focusing resources on intelligence-gathering and using the results to deter and disable terrorist activity in the countries that al-Qaeda was targeting. Instead, the west's response has been much as al-Qaeda's strategists intended: a succession of costly, unwinnable conflicts that have eroded the west's freedom and diminished its security, while exacerbating the serious but not unmanageable threat posed by al-Qaeda itself. If it is true that the danger may now be receding, it is because new movements of change are making al-Qaeda increasingly irrelevant.

The conflicts triggered by 9/11 have all been fought on false premises. Bombing al-Qaeda bases in Afghanistan was a legitimate act of self-defence and, in the context of US politics, may have been inevitable, but it was not the only option. The relationship between the Taliban and al-Qaeda has never been simple or unproblematic, and there is evidence that the Taliban may have been considering expelling al-Qaeda from Afghanistan when the bombing campaign got under way. Whether an alternative strategy, focused on convincing the Taliban regime to enforce such a policy, could have been effective is uncertain. What is clear is that, ten years later, the US-led coalition has been exploring a similar scenario - tacitly recognising that the fundamental problem has never been military. Even more than the Soviets - whose ruthless occupation some Afghans now remember as being preferable to the chaos of the present conflict - western forces have fought a war that lacked any achievable political goals. Unfortunately, the prospect of an orderly exit may prove to be just another mirage.

Welcomed by many Afghans and by some of the Taliban, the initial objective of ejecting al-Qaeda from the country was soon achieved. It is doubtful how much western security was improved. Al-Qaeda does not need permanent bases and has moved on to Pakistan, Yemen and post-Saddam Iraq. As the US became ever more preoccupied with a non-existent threat from Iraq, Afghanistan was forgotten and the Taliban returned.

The war has continued, with a series of shifting goals - installing democracy, promoting economic and social development, battling the drug trade and the like - all of them unrealisable. Building schools and hospitals may be a fine thing, but it will count for nothing when teachers and doctors are terrorised and killed after allied forces make their inevitable withdrawal from much of the country.

Linking the Afghan mission with the nonsensical "war on drugs" has been predictably counterproductive. Destroying drug production - the Americans at one point thinking of spraying the whole of Helmand Valley with weedkiller to wipe out the opium fields - would also have destroyed much of the Afghan economy. There is constant talk of preparing government forces to take over responsibility for security, Bamiyan being the first province handed over, on 17 July. But where government is weak and lacking in legitimacy, and where allegiance to any authority has long been a tradable commodity, it should be obvious that improving the training of local forces will not ensure their loyalty. Presiding over a territory that has never been ruled by a modern state, the Afghan government is not much more than a funnel for endemic corruption. In the event of a full-scale pull-out of US-led forces, it would be lucky to survive for more than 48 hours.

In the blind rush to export an idealised version of western governance, it has been forgotten that democracy comes in several versions, some of them highly illiberal. If a functioning democracy were to develop in Afghanistan in the current conditions, it would most likely be a variant of the Rousseau type that exists in Iran. The effect could be to entrench the power of the Taliban.

Built up by elements in Pakistani intelligence and financed with Saudi money, the Taliban waged a pitiless war on Afghan culture and traditions. At the same time they flouted the most basic human values. Stoning gay people and women who are victims of rape is barbarism pure and simple. Rather than preventing such atrocities, an Iranian-style Afghan democracy could instead confer legitimacy on those who commit them.

It is hard to imagine any kind of democracy in Afghanistan in the foreseeable future. In the event of a full drawdown of western forces, a many-sided civil war would ensue and the hapless peoples of Afghanistan would face a future without effective government, democratic or otherwise. At this point, the analogy with Vietnam becomes misleading. In Vietnam, the US retreat allowed the well-organised and competent government in the North to take control of the country. In Afghanistan, departing US-led forces would leave an ungoverned space.

Again, the underlying problem is political rather than military. There can be no peace in Afghanistan for as long as it is used as a theatre to play out regional conflicts. Without a solution to the division of Kashmir, the Afghans will continue to be pawns in the struggle between India and Pakistan (both nuclear powers) while Iran, Russia and China watch alertly on the sidelines. Perhaps Washington could once have brokered a settlement in the region, but with President Barack Obama having declared victory 18 months in advance of a US retreat, that time is gone. A pull-out would create a geopolitical vacuum in the region. That is why - assuming a worsening economic crisis in America doesn't force the issue - US forces are unlikely to make anything like a total withdrawal any time soon.

In contrast to Afghanistan, where even the Soviets could not instal a modern state, Saddam's Iraq was a thoroughly modern despotism. If western intervention in these quite different countries has failed in similar ways, one reason is that, in both cases, the west was unprepared to deal with the condition of anarchy that it had created.

Regime change in Iraq was engineered in the belief that something like liberal democracy would emerge of its own accord. But after nearly a quarter-century in power, Saddam's dictatorship was practically coextensive with the Iraqi state, and toppling the tyrant meant destroying any kind of government in the country. US policies - such as disbanding the Iraqi army - hastened this outcome, but it was principally a consequence of regime change itself. As the scale of the disaster began to unfold, it became conventional wisdom to claim that insufficient thought had been given to post-invasion planning. But before the war started it was clear that no one had the skills required to govern the failed state that the overthrow of the regime would create. The result - the Kurds hiving off as a de facto independent state and the rest of what had been Iraq governed by a shifting coalition of sectarian parties, with Shia politicians increasingly under Iranian influence - was in no way surprising. If there had been any serious forethought, the invasion would not have been launched.

It is not often that foreign policies come to grief because of an intellectual error, but this has been the case ever since the idea of humanitarian war took hold during the 1990s. Semi-successful in the Balkans, humanitarian intervention fuelled the illusion that - with only a small dose of force - freedom and democracy could be implanted anywhere in the world. Since then, the western elite have been gripped by the idea that authoritarian regimes are atavistic relics that will soon be swept aside in the grand march of history. There is nothing atavistic about tyranny - Nazism and Stalinism were unequivocally modern, like al-Qaeda today - and freedom is not the same as democracy.

Neoconservatives talk sagely of being "on the right side of history" - as if a process of evolution had begun, at the end of which all of humankind will at last become like the neocons. Rather, what is happening is that the world is returning to the normality of only a few centuries ago, when power and wealth were more evenly distributed between east and west. There is nothing that need be feared in this shift, but it destroys the myth that the west is a model for the whole of humanity.

The notion that the Arab spring is a rerun of Europe's 1848 revolutions is an example of this kind of thinking. Those who make the comparison are asserting ownership of movements that owe very little to the west. When Tony Blair and his fellow neocons tell the Arab world that it must modernise, they assume that modernisation is a quick and peaceful process that ends with the adoption of "democratic capitalism". A little history shows a different picture. The popular protests of 1848 were soon defeated. Europe became democratic only after two world wars and a long cold war. Building a Europe of democratic nation states was a lengthy and violent business, involving ethnic cleansing between the two wars, and then again after the fall of Yugoslavia in 1991.

Nor is the global order that was then put in place in any sense stable. The European project is coming apart at the seams, while in the US - only a few years ago incessantly lecturing the world on the need to embrace the "Washington consensus" - the financial system has collapsed. Supposedly the end of history, "democratic capitalism" of the sort that prevailed over the past two decades now looks like a blind alley.

In this light, why should the peoples of the Arab world retrace the west's journey? They would be better off striking out on paths of their own. Western declarations of support for the new Arab protest movements are in any case selective. Not much outrage is voiced at torture and murder in Bahrain - home to a US navy base, and a vital link in the supply of oil from Saudi Arabia.

Lying behind these inconsistencies is an awkward geostrategic fact. When they give rhetorical backing to protest in the Middle East and North Africa, western governments are speaking as they did when they backed democracy in the Soviet bloc. Yet while the fall of communism seemed for a time to enlarge western power, the west now finds itself in the position of the former Soviet Union, losing control of events as popular uprisings threaten regimes it has kept in power for decades.

It is often claimed that the uprisings in the Arab world show that the west has been short-sighted in pursuing stability over more high-minded goals. It was not western realpolitik which triggered the protests, however. Much has been written about the role of social networks in powering the uprisings, and new media were undoubtedly an important factor. But, to an extent that has not been appreciated, the Arab protest movements emerged as an unintended consequence of western weakness. The demand for change had a specific cause: the steep rise in food prices that was produced by the liquidity released by Ben Bernanke, chairman of the US Federal Reserve, into global markets. Quantitative easing (QE) is, in effect, a policy of creating new money and, just like money-printing by governments and central banks in the past, it tends to produce inflation. In this case, the inflation showed up in asset prices - in stock markets and on the commodity markets.

The protests in Tunisia began as bread riots, and though graduate unemployment may have been a larger factor in Egypt, the protests there occurred against a background in which the country - one of the world's biggest importers of wheat - was facing a steep rise in the price of food. Not only in the Middle East but in the world as a whole, there is a looming problem of food scarcity, which is partly a result of the sheer growth in human numbers, projected to increase from roughly seven billion at present to more than nine billion in 2050. The sudden rise in the cost of food was not only a result of increasing demand, however. Another factor was the Federal Reserve's attempt to refloat the sinking American economy with a flood of cheap money, beginning with the ultra-low interest rates engineered by Alan Greenspan from 2001 onwards, which led to a speculative boom in commodity prices. Driving up living costs in poor countries that import much of their food, American monetary policy has been a potent force for regime change. In an ironic twist, US weakness has unwittingly sparked revolution in the Arab world as its blundering attempts to impose regime change by force have been swallowed by the sands.

There are some who see the entire war on terror as a cover for neo-colonialism. Behind all the proclamations about democracy and human rights, they say, the real goal was building pipelines in Afghanistan and seizing oilfields in Iraq. In fact, the course of events has been much more absurd. There is no evidence of consecutive thought of the kind required to make any conspiracy theory credible. Certainly there has been disinformation - plenty of it - but rather than concealing any covert strategy, it masked the lack of any strategy at all. Despite denials at the time, oil was a crucial factor in the decision to invade Iraq, but western companies cannot operate effectively in conditions of near-anarchy, and it was only at the start of this year that Iraqi oil production reached levels it achieved under Saddam. Again, there was never any realistic chance of western forces using Afghanistan as an energy corridor or of harvesting the country's abundant mineral wealth - if any country benefits, it will be China, which by standing aside from the conflict does not face the security problems of western businesses and has a better chance of establishing a long-term presence in Afghanistan. The wars of the past decade have been colossally expensive, costing billions of dollars and accelerating the US decline into national bankruptcy. As an exercise in neocolonialism, perpetual warfare has been strikingly unprofitable.

More than by disinformation, the decade of war has been shaped by delusion. Today, for western leaders, the utility of force is not so much to achieve any specific goal as to preserve a sense of their importance in the world. Wealth and power are flowing to the east and south, but Europe and the US still claim global leadership. More than by any humanitarian impulse, it seems to have been this need to reaffirm a distinctive western destiny that motivated the Libyan adventure.

Fearful of being dragged into the chaos that will ensue if Libya fragments, the Obama administration has not been a cheerleader for this intervention, which is primarily a European folly. The commonplace that Nato forces lack a clear exit strategy misses the point. How could they have such a strategy, when they have no rationale for being in Libya in the first place? Intervention might have been justified if the objective had been simply to prevent carnage in Benghazi - though the risk of killing on the scale that is happening in Syria, where the west has shown no interest in intervening, seems to have been small.

But an end to violence could be secured only by negotiating with Muammar al-Gaddafi and leaving him in power, an outcome unacceptable to David Cameron and Nicolas Sarkozy, insecure and impulsive leaders anxious to make their mark on the international scene. So, Britain and France have opted for regime change, risking creating another failed state. Libya does not have the religious divisions of Iraq, but now that the fragmented opposition finds itself struggling to govern a still tribal and fractured country, it must be an open question whether a tolerable level of order can be maintained without further engagement by the west - including boots on the ground.

The posturing that has surrounded the Libyan adventure highlights the contradictions of humanitarian warfare. Its advocates declare that the west has a duty to protect universal values, with neoconservatives railing against critics as feeble moral relativists. Coming from neocons, who more than anyone else undermined the ban on torture - one of the fixed points in any civilised ethics - the assault on relativism has a hollow ring. However, the contradictions of humanitarian warfare affect its more principled advocates as well. Contrary to postmodern relativists, some values are humanly universal. The trouble is that these values are often in conflict. Peace and justice are universal goods, but they are at odds in Libya. Branding Gaddafi a war criminal (as the International Criminal Court did on 27 June) may have been right in terms of justice. Whether he would have chosen to leave if the way had been smoothed for him (as some in the Obama administration seem to have wanted at one time) cannot be known. But closing off any exit for the Libyan tyrant could only have had the effect of prolonging the war. Humanitarian military intervention is exposed to these conflicts of values just like any other kind of warfare.

The illusions of liberal intervention are screening out the risks faced by western countries. One comes from upheaval in the Gulf. Peak oil leaves Saudi Arabia the world's pivotal producer. Any disruption in production resulting from conflict in the Gulf would detonate an oil shock bigger than any other in the past. Contrary to what some on the left believe, the greatest danger of war may not come from the US or Israel. Upheaval in Bahrain illustrates the mounting risk of conflict between the Saudis and the Iranians, which Olivier Roy ("The long war between Sunni and Shia", New Statesman, 20 June 2011) has described as the defining schism in the Middle East - a schism whose depth was revealed when a former head of Saudi intelligence warned Nato officials in June this year that the kingdom would build nuclear weapons once Iran acquired them. As it drifts away from Europe, Turkey, too, is becoming an increasingly powerful player in the region.

The west would be wise to curb its dependency on oil, but that will not remove the risk of resource wars. The coming conflicts will not be mainly between the west and the rest. Advancing industrialisation has set in motion a new Great Game in which western states are not the most important players. China is the world's largest energy consumer after the US and will soon be first; but its fiercest rival for oil in future is likely to be India, rather than the US.

The danger comes not only from peak oil. Peaking minerals, arable land and fresh water are likely to inflame existing conflicts and spark new ones in many parts of the world. As Mark Lynas has noted ("Panic stations", New Statesman, 21 March 2011), countries that reject nuclear power are likely to turn to coal and gas, speeding up global warming as a result. Some countries may well try to control the climate through geo-engineering, and it would not be surprising if weather-modifying technologies were turned to military use.

If a new pattern of conflict is developing around natural resources, another is emerging in cyberspace. There are those who argue that the danger is being exaggerated, but there can be no doubt that as the economy and infrastructure become more reliant on computers, they become more vulnerable to cyber-attack. Practically every part of an advanced modern society can be disabled in this way - power supply, airports, banks, companies, television stations and personal computers, for example. Cyber-attack already occurs frequently, with episodes reported in the Baltic countries and the Middle East, among other places. Touted as a realm of freedom and transparency, cyberspace has become another site of conflict.

Our present insecurity is not a passing phase - a station on the way to a state of peace and stability. Insecurity will be the common condition in any future that is realistically imaginable. Our leaders should be looking for intelligent ways of adjusting to this state of affairs. But it is precisely the capacity for realistic thinking that is lacking. Talk of victory in Afghanistan is delusional - just as the idea that liberal democracy would follow regime change in Iraq was delusional. Yet the role of such discourse is not to represent things as they are, nor even as they might some day become. It is to create a pseudo-reality that insulates rulers and those they rule from painful facts.

The September 2001 attacks succeeded in producing what their perpetrators intended: a suspension of rational thought. Beginning as an ill-considered response to a new type of conflict, the permanent warfare that followed became a displacement activity, the function of which has been to distract attention from the west's problems - declining skills, falling living standards, debt and festering unrest.

Sooner or later, the cost of maintaining the west's illusions will become prohibitive. Countries whose economies are floundering cannot for long sustain vast, costly and ineffective military-industrial complexes. To be sure, the retreat of western power will not usher in any age of peace. War will not cease, if only because conflicts over natural resources are certain to increase. The normal conflicts of history - including many types of political violence - will continue. But the curtain is about to fall on the absurd and gruesome spectacle of the past decade, when the west waged unceasing war in order to avoid confronting its true position in the world.

John Gray is the NS lead book reviewer. His most recent book is "The Immortalization Commission: Science and the Strange Quest to Defeat Death" (Allen Lane, £18.99)


This article first appeared in the 05 September 2011 issue of the New Statesman, 9/11


This Ada Lovelace Day, let’s celebrate women in tech while confronting its sexist culture

In an industry where men hold most of the jobs and write most of the code, celebrating women's contributions on one day a year isn't enough. 

Ada Lovelace wrote the world’s first computer program. In the 1840s Charles Babbage, now known as the “father of the computer”, designed (though never built) the “Analytical Engine”, a machine which could accurately and reproducibly calculate the answers to maths problems. While translating an article by an Italian mathematician about the machine, Lovelace included a written algorithm which would allow the engine to calculate a sequence of Bernoulli numbers.
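The numbers Lovelace's program targeted obey a simple recurrence, so each one can be computed from those before it. The short Python sketch below is a modern rendering of that recurrence, not a transcription of Lovelace's Note G, but it captures the kind of step-by-step computation she laid out for the engine:

```python
from fractions import Fraction
from math import comb


def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions.

    Uses the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1
    (the convention with B_1 = -1/2), so each B_m is computed from
    the values already in hand.
    """
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B


print(bernoulli(4))  # fractions equal to 1, -1/2, 1/6, 0, -1/30
```

Running it by hand for small cases shows why a machine helps: each new value needs every earlier one, and the bookkeeping grows quickly.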

Around 170 years later, Whitney Wolfe, one of the founders of dating app Tinder, was allegedly forced to resign from the company. According to a lawsuit she later filed against the app and its parent company, she had her co-founder title removed because, the male founders argued, it would look “slutty”, and because “Facebook and Snapchat don’t have girl founders. It just makes it look like Tinder was some accident”. (They settled out of court.)

Today, 13 October, is Ada Lovelace Day – an international celebration of inspirational women in science, technology, engineering and mathematics (STEM). It’s lucky we have this day of remembrance, because, as Wolfe’s story demonstrates, we also spend a lot of time forgetting and sidelining women in tech. In the wash of pale male founders of the tech giants that rule the industry, we don't often think about the women who shaped its foundations: Judith Estrin, one of the designers of TCP/IP, for example, or Radia Perlman, inventor of the spanning-tree protocol. Both inventions sound complicated, and they are – they’re some of the vital building blocks that allow the internet to function.

And yet David Streitfeld, a Pulitzer Prize-winning journalist, somehow felt it accurate to write in 2012: “Men invented the internet. And not just any men. Men with pocket protectors. Men who idolised Mr Spock and cried when Steve Jobs died.”

Perhaps we forget about tech's founding women because the needle has swung so far in the other direction. A huge proportion – perhaps even 90 per cent – of the world’s code is written by men. At Google, women fill 17 per cent of technical roles. At Facebook, 15 per cent. Over 90 per cent of the code repositories on Github, an online service used throughout the industry, are owned by men. Yet it's also hard to believe that this erasure of women's role in tech is completely accidental. As Elissa Shevinsky writes in the introduction to a collection of essays on gender in tech, Lean Out: “This myth of the nerdy male founder has been perpetuated by men who found this story favourable.”

Does it matter? It’s hard to believe that it doesn’t. Our society is increasingly defined and delineated by code and the things it builds. Small slip-ups, like the lack of a period tracker on the original Apple Watch, or fitness trackers too big for some women’s wrists, gesture to the fact that these technologies are built by male-dominated teams, for a male audience.

In Lean Out, one essay written by a Twitter-based “start-up dinosaur” (don’t ask) explains how dangerous it is to allow one small segment of society to build the future for the rest of us:

If you let someone else build tomorrow, tomorrow will belong to someone else. They will build a better tomorrow for everyone like them… For tomorrow to be for everyone, everyone needs to be the one [sic] that build it.

So where did all the women go? How did we get from a rash of female inventors to a situation where the major female presence at an Apple iPhone launch is a model’s face projected onto a screen and photoshopped into a smile by a male demonstrator? 


The toxic culture of many tech workplaces could be a cause or an effect of the lack of women in the industry, but it certainly can’t make it easy to stay. Behaviours range from the ignorant - Martha Lane-Fox, co-founder of lastminute.com, was often asked “what happens if you get pregnant?” at investors' meetings - to the much more sinister. An essay in Lean Out by Katy Levinson details her experiences of sexual harassment while working in tech:

I have had interviewers attempt to solicit sexual favors from me mid-interview and discuss in significant detail precisely what they would like to do. All of these things have happened either in Silicon Valley working in tech, in an educational institution to get me there, or in a technical internship.

Others featured in the book joined in with the low-level sexism and racism of their male colleagues in order to "fit in" and deflect negative attention. Erica Joy writes that while working in IT at the University of Alaska as the only woman (and only black person) on her team, she laughed at colleagues' "terribly racist and sexist jokes" and "co-opted their negative attitudes".

The casual culture and allegedly meritocratic hierarchies of tech companies may actually be encouraging this discriminatory atmosphere. HR and the strict reporting procedures of large corporates at least give those suffering from discrimination a place to go. A casual office environment can discourage reporting or calling out prejudiced humour or remarks. Brook Shelley, a woman who transitioned while working in tech, notes: "No one wants to be the office mother". So instead, you join in and hope for the best. 

And, of course, there's no reason why people working in tech would have fewer issues with discrimination than those in other industries. A childhood spent as a "nerd" can also spawn its own brand of misogyny - Katherine Cross writes in Lean Out that “to many of these men [working in these fields it] is all too easy to subconsciously confound women who say ‘this is sexist’ with the young girls who said… ‘You’re gross and a creep and I’ll never date you’”. During GamerGate, Anita Sarkeesian was often called a "prom queen" by trolls.

When I spoke to Alexa Clay, entrepreneur and co-author of The Misfit Economy, she confirmed that there's a strange, low-lurking sexism in the start-up economy: “They're all very open and free, but underneath it there's still something really patriarchal.” Start-ups, after all, are a culture which celebrates risk-taking, something which women are societally discouraged from doing. As Clay says,

“Men are allowed to fail in tech. You have these young guys who these old guys adopt and mentor. If his app doesn’t work, the mentor just shrugs it off. I would not be able to get away with that, and I think women and minorities aren't allowed to take the same amount of risks, particularly in these communities. If you fail, no one's saying that's fine.”

The conclusion of Lean Out, and of women in tech I have spoken to, isn’t that more women, over time, will enter these industries and seamlessly integrate – it’s that tech culture needs to change, or its lack of diversity will become even more severe. Shevinsky writes:

The reason why we don't have more women in tech is not because of a lack of STEM education. It's because too many high profile and influential individuals and subcultures within the tech industry have ignored or outright mistreated women applicants and employees. To be succinct—the problem isn't women, it's tech culture.

Software engineer Kate Heddleston has a wonderful and chilling metaphor about the way we treat women in STEM. Women are, she writes, the “canary in the coal mine”. If one dies, surely you should take that as a sign that the mine is uninhabitable – that there’s something toxic in the air. “Instead, the industry is looking at the canary, wondering why it can’t breathe, saying ‘Lean in, canary, lean in!’. When one canary dies they get a new one because getting more canaries is how you fix the lack of canaries, right? Except the problem is that there isn't enough oxygen in the coal mine, not that there are too few canaries.” We need more women in STEM, and, I’d argue, in tech in particular, but we need to make sure the air is breathable first.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.