New web security system tests computers' emotions

Sorting the men from the replicants.

A new Captcha system seeks to separate humans from computers by testing empathy – and spreading awareness of human rights abuses at the same time.

A Captcha – which stands for Completely Automated Public Turing test to tell Computers and Humans Apart – is the test used when logging into many sites to distinguish between real people and malicious programs, which may attempt to log into many thousands of accounts at the same time. You've all used one – signing up for a New Statesman commenting account, if nowhere else – and they are ripe for being put to good use.

ReCAPTCHA was the first socially-beneficial captcha, and still the most popular. It uses the combined might of all the human brain power wasted on Captchas to transcribe scanned books:

reCAPTCHA improves the process of digitizing books by sending words that cannot be read by computers to the Web in the form of CAPTCHAs for humans to decipher. More specifically, each word that cannot be read correctly by OCR is placed on an image and used as a CAPTCHA. This is possible because most OCR programs alert you when a word cannot be read correctly.
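A scheme like this only works if the server can grade an answer to a word it does not itself know; classic reCAPTCHA solved that by pairing each unknown word with a control word whose transcription the server already has, and treating answers to the unknown word as votes toward its eventual transcription. A rough sketch of that grading logic (all names here are hypothetical, not reCAPTCHA's actual API):

```python
from collections import Counter

class Challenge:
    def __init__(self, known_word, known_answer, unknown_word_id):
        self.known_word = known_word            # image the server can grade
        self.known_answer = known_answer        # its correct transcription
        self.unknown_word_id = unknown_word_id  # the OCR-defeating image

votes = {}  # unknown_word_id -> Counter of human transcriptions

def grade(challenge, answer_known, answer_unknown):
    """Pass/fail on the control word; log a vote for the unknown word."""
    if answer_known.strip().lower() != challenge.known_answer.lower():
        return False  # likely a bot (or a typo): reject, record nothing
    tally = votes.setdefault(challenge.unknown_word_id, Counter())
    tally[answer_unknown.strip().lower()] += 1
    return True

def transcription(word_id, min_votes=3):
    """Accept the majority transcription once enough humans agree."""
    tally = votes.get(word_id)
    if tally:
        word, count = tally.most_common(1)[0]
        if count >= min_votes:
            return word
    return None
```

The key property is that a bot which fails the control word contributes nothing to the corpus, while agreement among several humans gradually turns unreadable scans into text.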

Since it took off, ReCAPTCHA has been used on innumerable sites, and is now displayed over 100 million times a day. But that success comes at a price. Now that the low-hanging fruit has been plucked, fewer and fewer easily-transcribable words remain in its corpus, meaning that the system regularly throws up completely unintelligible words, words in other scripts, or things which just aren't language at all.

The Civil Rights Captcha wants to be the replacement. Rather than using the captcha to perform useful work, like reCAPTCHA, it uses it to raise awareness about important issues:

Instead of visually decoding an image of distorted letters, the user has to take a stand regarding facts about human rights. Depending on whether the described situation is positively or negatively charged, the CAPTCHA generates three random words from a database. These words describe positive and negative emotions. The user selects the word that best matches how they feel about the situation, and writes the word in the CAPTCHA. Only one answer is correct, the answer showing compassion and empathy.
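The flow described above can be sketched in a few lines. This is a minimal illustration of the described logic, not the Civil Rights Captcha's actual implementation; the word lists and function names are hypothetical:

```python
import random

# Hypothetical emotion word lists; the real system draws from a database.
POSITIVE_WORDS = ["hopeful", "relieved", "glad"]
NEGATIVE_WORDS = ["angry", "upset", "distressed"]

def make_challenge(scenario_is_negative, rng=random):
    """Pick three candidate words; only the empathetic one is correct."""
    if scenario_is_negative:
        correct = rng.choice(NEGATIVE_WORDS)
        decoys = rng.sample(POSITIVE_WORDS, 2)
    else:
        correct = rng.choice(POSITIVE_WORDS)
        decoys = rng.sample(NEGATIVE_WORDS, 2)
    options = decoys + [correct]
    rng.shuffle(options)
    return options, correct

def check_answer(user_word, correct):
    """The user passes only by typing the empathetic word."""
    return user_word.strip().lower() == correct.lower()
```

Note that the server always knows the right answer in advance, so unlike reCAPTCHA no crowdsourced grading is needed; the difficulty for a bot lies entirely in inferring which emotion a human would feel.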

As well as being important socially – example questions include "The parliament in St. Petersburg recently passed a law that forbids 'homosexual propaganda'. How does that make you feel?" – the Civil Rights Captcha is also more resistant to attack. It includes the same visual element as a reCAPTCHA, requiring potential attackers to decipher obfuscated words, but also requires any automated attack to parse a complex question, pick the right emotion, and only then work out which of the proffered words matches that emotion.

The whole thing is rather reminiscent of Blade Runner:

We'll catch those pesky replicants yet.

Rutger Hauer, in the film Blade Runner.

Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


Anita Sarkeesian: “I don’t like the words ‘troll’ and ‘bully’”

The media critic and GamerGate target tells the Guardian that online harassers need a rebrand.

Anita Sarkeesian has been under attack for an entire year. She has received bomb threats, rape threats, gun threats, and threats that events she was due to speak at would be attacked. Her home address was circulated in online gaming communities. Her crime? She started a Kickstarter campaign for her YouTube channel, Feminist Frequency, to fund a series called “Tropes vs Women in Video Games”, which catalogues the sexist stereotypes and attitudes in gaming. 

So overall, it's pretty unsurprising that Sarkeesian doesn't call her attackers "trolls" or "bullies", with their comfy associations of schoolyards and fairytale bridges. 

Speaking in an interview with Jessica Valenti, published in the Guardian this weekend, Sarkeesian explains her reasoning:

“I don’t like the words ‘troll’ and ‘bully’ – it feels too childish. This is harassment and abuse."

She also implies that these words tie into a delusion entertained by some of the men themselves – that the abuse is just a bit of fun. Yet whatever the intent, Sarkeesian argues, “it still perpetuates all of the harmful myths attached to that language and those words”.

The interview also covers the GamerGate controversy as a whole and Sarkeesian’s rise to prominence as someone willing to speak publicly about the abuse she has received. As she points out, however,

“There are a lot of people who are being targeted who don’t get the attention I do. Women of colour and trans women, in particular, are not getting media attention and not getting the support they need.”

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.