New web security system tests computers' emotions

Sorting the men from the replicants.

A new Captcha system seeks to separate humans from computers by testing empathy – and spreading awareness of human rights abuses at the same time.

A Captcha – which stands for Completely Automated Public Turing test to tell Computers and Humans Apart – is the test used when logging into many sites to distinguish between real people and malicious programs, which may attempt to log into many thousands of accounts at the same time. You've all used one – signing up for a New Statesman commenting account, if nowhere else – and they are ripe for being put to good use.

reCAPTCHA was the first socially beneficial captcha, and is still the most popular. It uses the combined might of all the human brain power wasted on Captchas to transcribe scanned books:

reCAPTCHA improves the process of digitizing books by sending words that cannot be read by computers to the Web in the form of CAPTCHAs for humans to decipher. More specifically, each word that cannot be read correctly by OCR is placed on an image and used as a CAPTCHA. This is possible because most OCR programs alert you when a word cannot be read correctly.
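In practice, each reCAPTCHA challenge pairs an OCR-failed word with a control word whose answer is already known: the control word verifies that the solver is human, and agreement between several humans' answers to the unknown word yields a trusted transcription. The tallying step can be sketched roughly as follows – the image IDs, vote data, and threshold here are illustrative, not reCAPTCHA's actual code:

```python
from collections import Counter

# Answers collected from humans for an OCR-failed word, keyed by a
# (hypothetical) scanned-image id. In the real system each solver must
# also pass a known control word before their answer is counted.
votes = {"scan_0042": ["morrow", "morrow", "sorrow", "morrow"]}

def transcription(image_id, min_votes=3):
    """Accept a transcription once enough humans agree on the same word."""
    tally = Counter(votes.get(image_id, []))
    if not tally:
        return None  # no answers collected yet
    word, count = tally.most_common(1)[0]
    return word if count >= min_votes else None
```

Here "morrow" has three matching answers, so it clears the (assumed) agreement threshold and would be written back to the digitised book.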

Since it took off, reCAPTCHA has been used on innumerable sites, and is now displayed over 100 million times a day. But that success comes at a price. Now that the low-hanging fruit has been plucked, fewer and fewer easily transcribable words remain in its corpus, meaning that the system regularly throws up completely unintelligible words, words in other scripts, or things which just aren't language at all.

The Civil Rights Captcha wants to be the replacement. Rather than using the captcha to perform useful work, like reCAPTCHA, it uses it to raise awareness about important issues:

Instead of visually decoding an image of distorted letters, the user has to take a stand regarding facts about human rights. Depending on whether the described situation is positively or negatively charged, the CAPTCHA generates three random words from a database. These words describe positive and negative emotions. The user selects the word that best matches how they feel about the situation, and writes the word in the CAPTCHA. Only one answer is correct, the answer showing compassion and empathy.

As well as being important socially – example questions include "The parliament in St. Petersburg recently passed a law that forbids 'homosexual propaganda'. How does that make you feel?" – the Civil Rights Captcha is stronger against attack as well. It includes the same visual element as a reCAPTCHA, requiring potential attackers to decipher obfuscated words, but also requires any automated attack to parse a complex question, pick the right emotion, and only then work out which of the proffered words matches that emotion.
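The challenge-and-check flow described above can be sketched in a few lines. Everything here is an assumption for illustration – the statements, the emotion words, and the function names are invented, not the Civil Rights Captcha's real database or API:

```python
import secrets

# Hypothetical challenge data: each entry pairs a human-rights statement
# with exactly one empathetic (correct) word and two distractors.
CHALLENGES = [
    {
        "statement": ("The parliament in St. Petersburg recently passed a law "
                      "that forbids 'homosexual propaganda'. "
                      "How does that make you feel?"),
        "correct": "upset",                      # the empathetic answer
        "distractors": ["delighted", "indifferent"],
    },
]

def make_challenge():
    """Pick a challenge and shuffle its three candidate emotion words."""
    challenge = secrets.choice(CHALLENGES)
    words = [challenge["correct"], *challenge["distractors"]]
    # Shuffle so the correct word's position gives nothing away.
    words = sorted(words, key=lambda _: secrets.randbits(32))
    return challenge["statement"], words, challenge["correct"]

def check_answer(submitted, correct):
    """A solver passes only by typing the empathetic word."""
    return submitted.strip().lower() == correct.lower()
```

Note that, as the article says, an automated attacker would have to do all three steps: read the obfuscated words, understand the statement, and map it to the right emotion – the last step being the hard one for a machine.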

The whole thing is rather reminiscent of Blade Runner.

We'll catch those pesky replicants yet.


Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


Is Apple Music really deleting users’ songs without their consent?

It's hard to tell – but the iTunes Terms and Conditions seem to cover the company even if it does.

Musician James Pinkstone was a new Apple Music user when he realised that 122GB of music was missing from his computer.

According to a long blogpost he published on Wednesday, Apple Music attempted to “match” his music with songs in its online library via a function called “iMatch”. It then, Pinkstone claims, deleted all 122GB of his original files – collected from CDs, bought, and even created himself over a lifetime – from his hard drive.  

Luckily, Pinkstone was able to restore his library from a backup, but if what he says is true, it’s outrageous for a number of reasons. Apple Music streams music to users, meaning you need to be connected to Wi-Fi while you’re listening, so it isn’t the same as having an iTunes library of songs you actually own. You can download individual songs from the service to your device, but as Pinkstone writes, “it would take around 30 hours to get my music back” in this way. Your music and playlists also disappear if you stop paying your Apple Music subscription fee.

Meanwhile, iMatch has been notoriously rubbish at matching your files with music library entries, sparking lots of user complaints already. Pinkstone says a Fountains of Wayne song was replaced by a later version, for example, so he would have been unable to get the original song back.

So is it true? It’s not totally clear what happened to Pinkstone’s library, but here’s what we know so far.

Apple has said it doesn’t delete users’ music without their consent

Apple declined to give me a statement, but referred me to the piece “No, Apple Music is not deleting tracks off your hard drive – unless you tell it to” on the site iMore, which is not affiliated with the company but which the spokesperson described as “accurate background”.

Its author, Serenity Caldwell, explains that you have “primary” and “secondary” devices on Apple Music, and that on secondary devices (usually phones or tablets) in particular it’s advisable to delete your physical copies of songs to free up space – after all, you can stream everything via Apple Music anyway or download individual songs if you need them.

However, users should never delete files from their “primary” device (usually your desktop or laptop computer) because they’d lose the master copy of their songs forever.

…But customers might be giving that consent by accident

Jason Snell, a writer, speculated on Twitter that a misleading dialogue box may have caused Pinkstone's problems.

When you delete a song on any device, a dialogue box pops up offering to "delete" the song from "your iCloud Music Library and from your other devices" (emphasis mine). It's more than possible that users would click this "delete" button rather than the less obvious "remove download" option, which removes the song only from that device.

Apple Music’s terms and conditions cover it if it does delete your songs

Pinkstone seems to argue that he did no such thing, however, and it’s possible that there’s a bug as yet undiscovered by Apple which is deleting songs at will.

However, as Pinkstone points out, the iTunes terms of use actually do cover Apple in the event that the program damages your files, or your property in general.

One section reads:

“IN NO CASE SHALL APPLE, ITS DIRECTORS, OFFICERS, EMPLOYEES, AFFILIATES, AGENTS, CONTRACTORS, OR LICENSORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, PUNITIVE, SPECIAL, OR CONSEQUENTIAL DAMAGES ARISING FROM YOUR USE OF THE APPLE MUSIC SERVICE OR FOR ANY OTHER CLAIM RELATED IN ANY WAY TO YOUR USE OF THE APPLE MUSIC SERVICE, INCLUDING, BUT NOT LIMITED TO, ANY ERRORS OR OMISSIONS IN ANY CONTENT OR APPLE MUSIC PRODUCTS, OR ANY LOSS OR DAMAGE OF ANY KIND INCURRED AS A RESULT OF THE USE OF ANY CONTENT OR APPLE MUSIC PRODUCTS POSTED, TRANSMITTED, OR OTHERWISE MADE AVAILABLE VIA THE APPLE MUSIC SERVICE, EVEN IF ADVISED OF THEIR POSSIBILITY.”

Elsewhere, the terms defend Apple's right to withdraw access to its products at will, including songs and albums you're under the impression you bought from it outright:

Apple and its principals reserve the right to change, suspend, remove, or disable access to any iTunes Products, content, or other materials comprising a part of the iTunes Service at any time without notice. In no event will Apple be liable for making these changes.

Tl;dr: Until there's some explanation for Pinkstone's lost library, it might be a good idea to avoid using the iMatch function, or even Apple Music altogether. It seems very unlikely that the software would be able to delete files without your consent, but given you aren't covered if it does, it's better to be safe than sorry.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.