Wartime cryptographer Alan Turing’s iconic question – “can machines think?” – was put to the test once again at the Royal Society’s Turing Test 2014 competition in London. The chatbot “Eugene Goostman” managed to fool 33 per cent of judges into thinking it was human.
The test, proposed by Turing in his 1950 paper Computing Machinery and Intelligence, sidesteps a difficult problem: defining “think” is no easy task. Turing instead replaced the question with something more tangible – can a computer convince an observer that it is human?
Eugene did just that. In a five-minute question-and-answer text chat – with no restrictions on topic – a third of the judges believed the program to be a real human. Until now, no computer had managed to reach the 30 per cent benchmark set by Turing.
The chatbot, the brainchild of Russian computer scientist Vladimir Veselov, has an important advantage over its competitors – its “personality” is that of a 13-year-old Ukrainian boy. After winning the competition, Veselov explained his team’s intentions. “Eugene was ‘born’ in 2001,” he said. “Our main idea was that he can claim that he knows anything, but his age also makes it perfectly reasonable that he doesn’t know everything. We spent a lot of time developing a character with a believable personality.”
Subject-specific knowledge is rarely a strong point for 13-year-olds, and Eugene has just enough for a brief chat about a wide range of topics. In addition, the bot’s grammatical errors can be put down to speaking English as a second language. With these caveats in mind, it is perhaps unsurprising that it managed to convince a third of the judges.
The achievement was hailed as a landmark. Roboticist and cybernetics researcher Kevin Warwick, of the University of Reading, which organises the competition, said: “There is no more iconic and controversial milestone than the Turing Test... This milestone will go down in history as one of the most exciting.”
The practical implications of this are ominous. Face-to-face conversations are being progressively replaced by social media – Channel 4 found that the average Briton texts friends and family more regularly than seeing them in person. If the move to digital media is accompanied by increasingly sophisticated computers, then we need to be sure who we’re talking to.
Warwick warned: “Having a computer that can trick a human into thinking that someone, or even something, is a person we trust is a wake-up call to cybercrime. The Turing Test is a vital tool for combating that threat.”
Though Matrix-style scenarios of machine domination are still a long way off, it doesn’t take much to imagine the potential for misuse of such machines. Already our online presences give away a huge part of our personalities. A future version of Eugene could plausibly analyse our social media profiles en masse and conjure up a passable imitation of a loved one – enough, say, to start sending requests for PINs and passwords.