There are a few certainties in this world: death, taxes, and that our jobs will eventually be taken by robots. However, some professions are under greater threat than others. Accountants and couriers should probably worry. But doctors and teachers will be fine, surely? Even the most sophisticated algorithm can’t capture the nuanced skills these professions require: compassion, common sense and emotional intelligence.
According to Sir Anthony Seldon, former headmaster of public school Wellington College and current vice-chancellor of Buckingham University, teachers might have reason to be concerned after all. Last month he said he believed “extraordinarily inspirational” robots would begin taking on the work of teachers over the next ten years.
“It will open up the possibility of an Eton or Wellington-style education for all,” he said. “Everyone can have the very best teacher and it’s completely personalised; the software you’re working with will be with you throughout your education journey.”
He said he expected teaching unions to be “alarmed” by the prospect, but that the impact would be “beyond anything that we’ve seen in the industrial revolution or since with any other new technology”. He does not believe teachers will be totally replaced by robots, but that the two will work together. Teachers’ roles will become more pastoral, less focused on the repetitive tasks of imparting information, testing and marking.
“I don’t believe that AI should replace teachers,” agrees Professor Rose Luckin, chair of Learning with Digital Technologies at UCL’s Institute of Education. “But I think that if we don’t plan for the oncoming AI revolution then there is a risk that, because of the massive shortage of teachers, it will be seen to be an economically viable alternative to use artificially intelligent systems to do a significant amount of the work that teachers do. I don’t think that would be a good thing.”
When we think of artificially intelligent teachers, we tend to picture a human-like physical robot standing at the front of the class. Such technologies are among those being developed, particularly for very young children. Pepper and Nao, two humanoid robots made by Japanese company SoftBank Robotics, were trialled in two Singapore pre-schools last year with encouraging results. Pepper was able to question children about a story they had just heard, for example, offering multiple-choice answers for them to select on a screen.
However, Luckin believes the first AI technology to be adopted by schools in the UK will be more similar to a chatbot application, or a virtual assistant such as Amazon’s Alexa that can recognise and respond to speech.
Such technology already exists in the education world: Professor Ashok Goel, of the Georgia Institute of Technology in the US, managed to fool computer science students with “Jill Watson”, a bot that responded to their email queries about class assignments. Students could not tell the difference between Jill’s emails and those from real teaching assistants – although the impressive promptness of her responses aroused some suspicion. Goel gave a TED Talk about the project’s success.
Where AI really excels is in teaching Stem subjects such as maths and science, at least at school level. These tend to have a clear right answer, unlike subjects such as English where topics are open to different interpretations.
Will a robot ever be able to teach literature? “I think they can certainly support,” says Luckin. “It is possible now to capture a lot of data about people’s interactions with technology, but also as they’re wandering around in the world, so data about social media, about their physical wellbeing, about brain function.
“That can be processed using AI algorithms to find out a lot more about that person’s emotional wellbeing, physical wellbeing, even cognitive wellbeing. And some of that could be used to engage students in some of those more subjective areas.”
Seldon himself agrees that AI’s potential goes beyond just basic maths and science. “I think it will transform Stem,” he tells the New Statesman. “In science, it will liberate young people to take part in experiments way beyond anything that we can do at the moment.
“But I think that social sciences will follow five to ten years later, and the arts and humanities ten to 15 years later, because there is learning which can be related by algorithms in the arts.
“Young people will be able to look at a scene from Macbeth in three dimensions, see holograms of actors performing in the middle of the class. Machines will be able to ask probing questions of students to test their understanding of what’s happening, and their responses, using voice recognition.”
This certainly sounds impressive. But can artificial intelligence really tackle the huge gap between the best and worst schools – or the state and the private sector – when this technology will undoubtedly be expensive to begin with?
“I think there will be [a gap between state and private] initially, and then I think there won’t be,” says Seldon, whose upcoming book The Fourth Education Revolution explores the rise of learning with AI. “As we all know, new technology is very expensive because you’re paying for the research and development and the uncertainty, [but] I think the price will come down very quickly.
“The software you’d be getting in a top grammar school or a top independent school would be the same as in the most deprived region of, say, rural Wales, or the north-west coast. Schools where they might find it hard to get the best maths and physics teachers.”
He continues: “Teachers will become much more the overall organisers, the explainers, and ultimate evaluators of progress. They will become pastoral leaders. A lot of the heavy lifting of the primary work of teaching will take place on a one-to-one instructional basis, between the individual and the machine.
“At the moment we could do with about three times the number of teachers in schools… they are desperately short of time to do the job properly. They just get by. And so if we had computers doing a lot of the repetitive teaching work it would mean that teachers would be able to do the job so much better.”
He says experience from the US, where some of this technology is being developed, is that covering basic material on a computer using AI takes up around 30 per cent of a student’s day: “That leaves 70 per cent of the time for teaching staff to organise discussions, activities, one-to-one sessions with students, assemblies, cultural sessions.”
Luckin believes artificial intelligence could also help with an increasingly worrying problem facing schools: students’ mental health. A recent government-funded study found as many as one in four teenage girls now suffer from depression by the age of 14. “We could be building systems to help people understand themselves better – to see the signals of problems coming their way in terms of their mental wellbeing, and flag them up to somebody who can help them before it gets too bad,” Luckin says.
But of course, there are huge ethical concerns surrounding the collection of such sensitive data, particularly about children. “I think there is some great potential, but there is a huge risk that these things will be shot down before they have even started, because people understandably will be very worried,” Luckin adds.
The Samaritans caused controversy three years ago with a Twitter app aiming to monitor users for signs of depression through their posts. Samaritans Radar looked for phrases such as “tired of being alone” and “hate myself” among publicly available tweets. But critics said it was poorly designed and that vulnerable Twitter users could potentially be monitored without their consent, as anyone could sign up to receive an email alert when someone they followed was posting worrying messages. The app was later pulled.
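The mechanism the article describes – scanning public posts for phrases associated with distress – can be illustrated in principle with a few lines of code. This is a minimal, hypothetical sketch: the two phrases come from the article, but the matching logic is an illustrative assumption, not Samaritans Radar’s actual implementation.

```python
# Illustrative sketch of phrase-based flagging, as described in the article.
# The watch list and matching logic are assumptions for illustration only;
# they do not reflect how Samaritans Radar actually worked.

WORRYING_PHRASES = ["tired of being alone", "hate myself"]

def flag_post(text: str) -> bool:
    """Return True if the post contains any phrase on the watch list."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WORRYING_PHRASES)

posts = [
    "Just so tired of being alone these days",
    "Great match last night!",
]
flagged = [p for p in posts if flag_post(p)]
```

Even this toy version makes the critics’ point visible: simple substring matching has no sense of context or consent, yet it is enough to single out individual users for monitoring.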
Seldon describes the privacy concerns surrounding AI in schools as “massive”, adding: “Think about who knows you best. Now imagine this software is going to know you, certainly the cognitive tasks but also the emotional aspects, as well as the person who knows you best in life… I think we have to be very, very careful.”
Could the shift to learning from machines also harm the very human connection that exists between a teacher and a student? Because teachers have something to gain or lose from a child’s success or failure, they are motivated to push them academically – can a robot ever have the same effect? “I think that whereas the machines will be able to feign empathy in a very convincing way,” Seldon says, “the real emotional empathy will [still] be in the relationship with the teacher.”
“It’s a fascinating space with huge potential,” Luckin concludes, “but it’s also a fascinating space with huge problems in terms of getting it right.”