We can't crowdsource the right to free speech

The BBFC's plan to put content flags on online video could work – but crowd-sourcing censorship isn't the right way to do it.

The debate over protecting children from unsuitable web content has given rise to a novel proposal: content rated by its users, with the resulting votes determining its suitability. Plans are reported to be underway for a traffic-light age rating system for user-generated videos, on which the British Board of Film Classification and its Dutch equivalent are working with service providers and government. How this will work in practice has not yet been announced, but dangers for freedom of expression lurk in relying too heavily on the wisdom of the crowd.

The timing suggests that the idea may be related to the Prime Minister's proposal that households should be able to control their access to adult content online by switching on a simple filter. One of the criticisms levelled at filtering is that, to be effective, it will have to be a relatively blunt instrument, blocking the inoffensive as well as the inappropriate, with a potential impact on freedom of expression. Crowd-sourced age rating of content is at first sight both appealingly simple and potentially better, allowing greater discernment between content which really is adult and that which a machine might consider so. Red, amber and green ratings will reportedly be arrived at through a combination of the rating applied by the work's contributor and how the audience reacts.

The web inevitably makes available some content which is unsuitable or inappropriate for children to access. Some of this will be illegal, but much more will not be, or may be suitable only for, say, over-13s or over-16s. A traffic-light system may therefore struggle to distinguish between these and runs the risk of imposing the strictest warning on masses of content by default.

A greater concern, however, is how the new system will guard against becoming a tool for playing out prejudices of one kind or another. The system can only operate if it is the crowd's decision which counts – the reason it is even being considered is that there is too much content for a regulator or platform to review. Relying on the crowd assumes that a collective consciousness emerges from the great mass of web users and their shared values, rather than a set of subjective reactions. This is a dangerous assumption. As a recent MIT study reported in Science suggests, the "wisdom" of the crowd may be a myth, its mentality more akin to that of a mob or herd.

A huge amount of content to which some viewers may be strongly, even violently, opposed can be found online. However, such content may well not be illegal, or even the sort of content that a body such as the BBFC would normally feel the need to give an adult age rating – religious teachings, for example. Once the crowd, or mob, has control, how will the system ensure it cannot be hijacked to serve the values of one interest group over another? Very few votes may be enough for any piece of video content to be tagged as unsuitable.
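How the contributor's own rating and the crowd's votes would actually be combined has not been published. Purely to illustrate the concern, here is a toy sketch in Python – the threshold, the combination rule and the function name are all invented for the sake of argument, not a description of any real scheme – showing how a naive vote-counting rule could let a small, organised group re-rate content that a far larger audience found unobjectionable.

```python
# A toy illustration only: the BBFC/industry scheme has not been published,
# so the threshold and combination rule below are invented.
from collections import Counter

FLAG_THRESHOLD = 5  # hypothetical number of "red" votes needed to trip the rating

def crowd_rating(contributor_rating, votes):
    """Combine the uploader's own rating with audience votes using a
    naive threshold rule (purely hypothetical)."""
    counts = Counter(votes)
    if counts["red"] >= FLAG_THRESHOLD:
        return "red"            # a handful of coordinated votes is enough
    if counts["amber"] >= FLAG_THRESHOLD:
        return "amber"
    return contributor_rating   # otherwise, trust the uploader

# A small, organised group can re-rate a video that thousands rated green:
print(crowd_rating("green", ["green"] * 10_000 + ["red"] * 5))  # prints "red"
```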

Even so, merely adding a red traffic-light rating to a piece of content may not by itself do much harm. But what if the ratings are not a simple visual warning but information which determines whether that piece of content is made available or not?

In controlling what content is made available, European governments' room for manoeuvre is limited. EU law enshrines protection for freedom of expression. Where Member States take measures which affect users' access to and use of services and applications over electronic networks, they have to respect fundamental human rights and freedoms. Any restrictions need to satisfy tests of being appropriate, proportionate and necessary in a democratic society. Determining the suitability of content has, until now, been the preserve of carefully chosen, neutral regulators, applying a set of agreed principles. Would mandating a system of crowd-sourced suitability ratings from anonymous web users around the world satisfy these tests? Without being able to ensure that the system could not be hijacked, it may struggle to do so.

So encouraging ISPs to take voluntary steps may help governments assuage the most vocal demands for action, while avoiding a difficult debate over internet regulation. But any approved scheme will need safeguards to ensure that the traffic lights do not become the basis for automated blocking of content which a household or ISP can apply at the flick of a switch. Once an appealingly simple idea like this takes hold, it may not be readily dropped, and it may go on to have profound effects on what content is made available in the majority of households in this country.


Mark Owen is a partner at international law firm Taylor Wessing. He writes here in a personal capacity.


This Ada Lovelace Day, let’s celebrate women in tech while confronting its sexist culture

In an industry where men hold most of the jobs and write most of the code, celebrating women's contributions on one day a year isn't enough. 

Ada Lovelace wrote the world’s first computer program. In the 1840s Charles Babbage, now known as the “father of the computer”, designed (though never built) the “Analytical Engine”, a machine which could accurately and reproducibly calculate the answers to maths problems. While translating an article by an Italian mathematician about the machine, Lovelace included a written algorithm which would allow the engine to calculate a sequence of Bernoulli numbers.
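Note G did not look like modern code: it set the calculation out as a table of operations for the engine, and its indexing of the Bernoulli numbers differs from today's convention. Still, a minimal sketch of the same calculation in present-day Python – using the standard recurrence rather than Lovelace's own layout – gives a sense of what the engine was being asked to do.

```python
# A minimal modern sketch of the calculation Lovelace described in Note G.
# Not her program: her notation and indexing differ; this uses the standard
# recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0, with B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    b = [Fraction(1)]                         # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * b[k] for k in range(m))
        b.append(-acc / (m + 1))              # solve the recurrence for B_m
    return b

for i, value in enumerate(bernoulli(8)):
    print(f"B_{i} = {value}")                 # B_0 = 1, B_1 = -1/2, B_2 = 1/6, ...
```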

Around 170 years later, Whitney Wolfe, one of the founders of dating app Tinder, was allegedly forced to resign from the company. According to a lawsuit she later filed against the app and its parent company, she had her co-founder title removed because, the male founders argued, it would look “slutty”, and because “Facebook and Snapchat don’t have girl founders. It just makes it look like Tinder was some accident". (They settled out of court.)

Today, 13 October, is Ada Lovelace Day – an international celebration of inspirational women in science, technology, engineering and mathematics (STEM). It’s lucky we have this day of remembrance, because, as Wolfe’s story demonstrates, we also spend a lot of time forgetting and sidelining women in tech. In the wash of pale male founders of the tech giants that rule the industry, we don't often think about the women who shaped its foundations: Judith Estrin, one of the designers of TCP/IP, for example, or Radia Perlman, inventor of the spanning-tree protocol. Both inventions sound complicated, and they are – they’re some of the vital building blocks that allow the internet to function.

And yet David Streitfeld, a Pulitzer Prize-winning journalist, somehow felt it accurate to write in 2012: “Men invented the internet. And not just any men. Men with pocket protectors. Men who idolised Mr Spock and cried when Steve Jobs died.”

Perhaps we forget about tech's founding women because the needle has swung so far in the other direction. A huge proportion – perhaps even 90 per cent – of the world’s code is written by men. At Google, women fill 17 per cent of technical roles. At Facebook, 15 per cent. Over 90 per cent of the code repositories on GitHub, an online service used throughout the industry, are owned by men. Yet it's also hard to believe that this erasure of women's role in tech is completely accidental. As Elissa Shevinsky writes in the introduction to a collection of essays on gender in tech, Lean Out: “This myth of the nerdy male founder has been perpetuated by men who found this story favourable."

Does it matter? It’s hard to believe that it doesn’t. Our society is increasingly defined and delineated by code and the things it builds. Small slip-ups, like the lack of a period tracker on the original Apple Watch, or fitness trackers too big for some women’s wrists, gesture to the fact that these technologies are built by male-dominated teams, for a male audience.

In Lean Out, one essay written by a Twitter-based “start-up dinosaur” (don’t ask) explains how dangerous it is to allow one small segment of society to build the future for the rest of us:

If you let someone else build tomorrow, tomorrow will belong to someone else. They will build a better tomorrow for everyone like them… For tomorrow to be for everyone, everyone needs to be the one [sic] that build it.

So where did all the women go? How did we get from a rash of female inventors to a situation where the major female presence at an Apple iPhone launch is a model’s face projected onto a screen and photoshopped into a smile by a male demonstrator? 


The toxic culture of many tech workplaces could be a cause or an effect of the lack of women in the industry, but it certainly can't make it easy to stay. Behaviours range from the ignorant – Martha Lane-Fox was often asked “what happens if you get pregnant?” at investors' meetings – to the much more sinister. An essay in Lean Out by Katy Levinson details her experiences of sexual harassment while working in tech:

I have had interviewers attempt to solicit sexual favors from me mid-interview and discuss in significant detail precisely what they would like to do. All of these things have happened either in Silicon Valley working in tech, in an educational institution to get me there, or in a technical internship.

Others featured in the book joined in with the low-level sexism and racism of their male colleagues in order to "fit in" and deflect negative attention. Erica Joy writes that while working in IT at the University of Alaska as the only woman (and only black person) on her team, she laughed at colleagues' "terribly racist and sexist jokes" and "co-opted their negative attitudes".

The casual culture and allegedly meritocratic hierarchies of tech companies may actually be encouraging this discriminatory atmosphere. HR and the strict reporting procedures of large corporates at least give those suffering from discrimination a place to go. A casual office environment can discourage reporting or calling out prejudiced humour or remarks. Brook Shelley, a woman who transitioned while working in tech, notes: "No one wants to be the office mother". So instead, you join in and hope for the best. 

And, of course, there's no reason why people working in tech would have fewer issues with discrimination than those in other industries. A childhood spent as a "nerd" can also spawn its own brand of misogyny – Katherine Cross writes in Lean Out that “to many of these men [working in these fields] [it] is all too easy to subconsciously confound women who say ‘this is sexist’ with the young girls who said… ‘You’re gross and a creep and I’ll never date you'". During GamerGate, Anita Sarkeesian was often called a "prom queen" by trolls.

When I spoke to Alexa Clay, entrepreneur and co-author of The Misfit Economy, she confirmed that there's a strange, low-lurking sexism in the start-up economy: “They have [it] all very open and free, but underneath it there's still something really patriarchal.” Start-ups, after all, are a culture which celebrates risk-taking, something women are societally discouraged from doing. As Clay says:

“Men are allowed to fail in tech. You have these young guys who these old guys adopt and mentor. If his app doesn’t work, the mentor just shrugs it off. I would not be able to get away with that, and I think women and minorities aren't allowed to take the same amount of risks, particularly in these communities. If you fail, no one's saying that's fine.”

The conclusion of Lean Out, and of women in tech I have spoken to, isn’t that more women, over time, will enter these industries and seamlessly integrate – it’s that tech culture needs to change, or its lack of diversity will become even more severe. Shevinsky writes:

The reason why we don't have more women in tech is not because of a lack of STEM education. It's because too many high profile and influential individuals and subcultures within the tech industry have ignored or outright mistreated women applicants and employees. To be succinct—the problem isn't women, it's tech culture.

Software engineer Kate Heddleston has a wonderful and chilling metaphor about the way we treat women in STEM. Women are, she writes, the “canary in the coal mine”. If one dies, surely you should take that as a sign that the mine is uninhabitable – that there’s something toxic in the air. “Instead, the industry is looking at the canary, wondering why it can’t breathe, saying ‘Lean in, canary, lean in!’. When one canary dies they get a new one because getting more canaries is how you fix the lack of canaries, right? Except the problem is that there isn't enough oxygen in the coal mine, not that there are too few canaries.” We need more women in STEM, and, I’d argue, in tech in particular, but we need to make sure the air is breathable first.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.