New web security system tests computers' emotions

Sorting the men from the replicants.

A new Captcha system seeks to separate humans from computers by testing empathy – and spreading awareness of human rights abuses at the same time.

A Captcha – which stands for Completely Automated Public Turing test to tell Computers and Humans Apart – is the test used when logging into many sites to distinguish between real people and malicious programs, which may attempt to log into many thousands of accounts at the same time. You've all used one – signing up for a New Statesman commenting account, if nowhere else – and they are ripe for being put to good use.

reCAPTCHA was the first socially beneficial Captcha, and is still the most popular. It uses the combined might of all the human brain power otherwise wasted on Captchas to transcribe scanned books:

reCAPTCHA improves the process of digitizing books by sending words that cannot be read by computers to the Web in the form of CAPTCHAs for humans to decipher. More specifically, each word that cannot be read correctly by OCR is placed on an image and used as a CAPTCHA. This is possible because most OCR programs alert you when a word cannot be read correctly.
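The trick that makes this work is pairing each unknown word with a control word whose answer is already known: the control word verifies that you're human, and your answer to the unknown word is logged as a candidate transcription. A rough sketch of that pairing logic, in Python (a hypothetical illustration with made-up names, not Google's actual implementation):

```python
# Sketch of the reCAPTCHA pairing idea (hypothetical, not the real system).
import random

# Control words whose correct transcription is already verified.
KNOWN_WORDS = {"upon": "upon", "morning": "morning"}
# Candidate transcriptions for words OCR could not read.
unknown_votes = {}

def make_challenge(unknown_word):
    """Pair a control word (answer known) with an OCR-failed word (answer unknown)."""
    control = random.choice(list(KNOWN_WORDS))
    return control, unknown_word

def check_answer(control, unknown_word, answers):
    """Verify the user on the control word; if they pass, record their
    guess for the unknown word as a vote toward its transcription."""
    if answers.get(control) != KNOWN_WORDS[control]:
        return False  # failed the control word: treat as a bot
    unknown_votes.setdefault(unknown_word, []).append(answers[unknown_word])
    return True

control, unknown = make_challenge("quixotic")
ok = check_answer(control, unknown, {control: KNOWN_WORDS[control], unknown: "quixotic"})
```

Once enough users agree on the same guess for an unknown word, it can be promoted into the verified pool – which is how the system turns idle typing into a transcription pipeline.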

Since it took off, reCAPTCHA has been used on innumerable sites, and is now displayed over 100 million times a day. But that success comes at a price. Now that the low-hanging fruit has been plucked, fewer and fewer easily transcribable words remain in its corpus, meaning that the system regularly throws up completely unintelligible words, words in other scripts, or things which just aren't language at all.

The Civil Rights Captcha wants to be its replacement. Rather than using the captcha to perform useful work, as reCAPTCHA does, it uses it to raise awareness of important issues:

Instead of visually decoding an image of distorted letters, the user has to take a stand regarding facts about human rights. Depending on whether the described situation is positively or negatively charged, the CAPTCHA generates three random words from a database. These words describe positive and negative emotions. The user selects the word that best matches how they feel about the situation, and writes the word in the CAPTCHA. Only one answer is correct, the answer showing compassion and empathy.

As well as being socially important – example questions include "The parliament in St. Petersburg recently passed a law that forbids 'homosexual propaganda'. How does that make you feel?" – the Civil Rights Captcha is stronger against attack. It includes the same visual element as a reCAPTCHA, requiring potential attackers to decipher obfuscated words, but it also requires any automated attack to parse a complex question, pick the right emotion, and only then work out which of the proffered words matches that emotion.
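The checking step described above can be sketched in a few lines of Python. This is a guess at the shape of the logic, with invented question text and word lists – the real service generates its challenges server-side:

```python
# Rough sketch of the Civil Rights Captcha check (hypothetical names and data).

QUESTION = {
    "text": "A country has just abolished the death penalty. How does that make you feel?",
    "sentiment": "positive",  # the situation described is positively charged
}

# Each candidate word carries an emotional charge.
WORD_FEELINGS = {"happy": "positive", "angry": "negative", "bored": "neutral"}

def options_for(question):
    """Offer three words with mixed emotional charges; only one matches."""
    return ["happy", "angry", "bored"]

def is_empathetic(question, typed_word):
    """Accept only the word whose emotional charge matches the situation."""
    return WORD_FEELINGS.get(typed_word) == question["sentiment"]
```

A bot that can read the distorted words still has to understand the question well enough to call `is_empathetic` correctly – which is the extra hurdle the designers are counting on.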

The whole thing is rather reminiscent of Blade Runner. We'll catch those pesky replicants yet.

Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


7 problems with the Snooper’s Charter, according to the experts

In short: it was written by people who "do not know how the internet works".

A group of representatives from the UK Internet Services Providers' Association (ISPA) headed to the Home Office on Tuesday to point out a long list of problems with the proposed Investigatory Powers Bill (that's the Snooper's Charter to you and me). Below are simplified summaries of some of the objections submitted to the department after the meeting by Adrian Kennard, of Andrews and Arnold, a small internet service provider.

1. The types of records the government wants collected aren’t that useful

The IP Bill places a lot of emphasis on “Internet Connection Records”: a list of the domains you’ve visited, but not the specific pages visited or messages sent.
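To see how little that actually captures, here's a toy illustration (the URLs and function names are mine, not from the bill) of what reducing a full browsing history to domain-only records looks like:

```python
# Illustration: a full browsing log versus a domain-only "connection record".
from urllib.parse import urlparse

visits = [
    "https://twitter.com/alexhern/status/12345",
    "https://en.wikipedia.org/wiki/Blade_Runner",
]

def to_record(url):
    """Keep only the domain; the page path and query string are dropped."""
    return urlparse(url).hostname

records = [to_record(u) for u in visits]
# records == ["twitter.com", "en.wikipedia.org"]
```

Everything after the domain name – which tweet, which article – is thrown away, which is exactly why the records are both less useful and still revealing, as the next two points argue.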

But in an age of apps and social media, where we view vast amounts of information through single domains like Twitter or Facebook, this information might not even help investigators much, as connections can last for days, or even months. Kennard gives the example of a missing girl, used as a hypothetical case by the security services to argue for greater powers:

“If the mobile provider was even able to tell that she had used twitter at all (which is not as easy as it sounds), it would show that the phone had been connected to twitter 24 hours a day, and probably Facebook as well… this emotive example is seriously flawed”

And these connection records are only going to get less relevant over time: an increasing number of websites, including Facebook and Google, encrypt their traffic under "https", which would make finding the name of the website visited far more difficult.

2. …but they’re still a massive invasion of privacy

Even though these records may be useless when someone needs to be found or monitored, the retention of Internet Connection Records (ICRs) is still very invasive – and can actually yield more information than call records, which Theresa May has repeatedly claimed are their non-digital equivalent.

Kennard notes: “[These records] can be used to profile [individuals] and identify preferences, political views, sexual orientation, spending habits and much more. It is useful to criminals as it would easily confirm the bank used, and the time people leave the house, and so on”. 

This information might not help find a missing girl, but could build a profile of her which could be used by criminals, or for over-invasive state surveillance. 

3. "Internet Connection Records" aren’t actually a thing

The “Internet Connection Record” – a list of the domain names visited by a user – is actually a new term coined for the bill, derived from the “Call Data Records” collected by phone companies. Compiling them is possible, but won't be an easy or automatic process.

Again, this strongly implies that those writing the bill were drawing on their knowledge of telecommunications surveillance, not information appropriate to the internet era. Kennard calls for the term to be removed from the bill, or at least for its “vague and nondescript nature” to be made clear.

4. The surveillance won’t be consistent and could be easy to dodge

In its meeting with the ISPA, the Home Office implied that smaller internet service providers won't be forced to collect ICRs, as it's a costly process. But this means those seeking to avoid surveillance could simply move to a smaller provider. Bit of a loophole there.

5. Conservative spin is dictating the way we view the bill 

May and the Home Office are keen for us to see the surveillance in the bill as passive: internet service providers must simply log the domains we visit, which will be looked at in the event that we are the subject of an investigation. But as Kennard notes, “I am quite sure the same argument would not work if, for example, the law required a camera in every room in your house”. This is a vast new power the government is asking for – we shouldn’t allow politicians to play it down.

6. The bill would allow our devices to be bugged

Or, in the jargon used in the draft bill, subjected to “equipment interference”. This could include surveillance of all use of a phone or laptop, or even the ability to turn on its camera or webcam to watch someone. The bill actually calls for “bulk equipment interference” – when surely, as Kennard notes, “this power…should only be targeted at the most serious of criminal suspects” at most.

7. The ability to bug devices would make them less secure

Devices can only be subject to “equipment interference”, or bugging, if they have existing vulnerabilities, which could also be exploited by criminals and hackers. If security services know about these vulnerabilities, they should tell the manufacturer about them. As Kennard writes, allowing equipment interference "encourages the intelligence services to keep vulnerabilities secret” so they don't lose their own access to our devices. Meanwhile, though, they're laying the population open to hacks from cyber criminals. 


So there you have it – a compelling soup of misused and made-up terms, and ethically concerning new powers.


This piece was updated on 1 December to reflect the fact that the written evidence contained the opinions of Adrian Kennard, and not necessarily those of the ISPA.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.