
AI on the NHS: how machine intelligence could save the eyesight of thousands

Google DeepMind's artificial intelligence can spot the early signs of serious eye diseases

On 15 March last year, the 18-time world champion Go player Lee Sedol conceded a fourth defeat to his opponent, a computer program called AlphaGo.

To a casual observer, a computer beating a human at a board game is not unusual – the IBM supercomputer Deep Blue beat the world chess champion, Garry Kasparov, almost 20 years ago – but to people who understood the challenge, it came as a surprise. It had not been expected that a computer would beat a human at Go for another decade, and a decade is a very, very long time in modern computing.

The reason Go presented such a challenge is that computing every possibility in a game is a task of almost unimaginable complexity. For a sense of the sheer, dizzying scale of it, try this: a single grain of sand contains, very roughly, 50 million million million atoms.

A game on a 19x19 Go board contains more legal positions than there are atoms in the whole of the observable universe.
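The scale gap can be sanity-checked with a couple of order-of-magnitude figures (the position count is Tromp and Farnebäck's published result; the atom count is a commonly cited estimate, not an exact value):

```python
# Rough magnitudes only -- these are estimates, not exact figures.
legal_positions_19x19 = 2.08e170  # Tromp & Farneback: ~2.08 * 10^170 legal 19x19 positions
atoms_in_universe = 1e80          # common order-of-magnitude estimate
atoms_per_sand_grain = 5e19       # "50 million million million" from the text

# Legal Go positions outnumber atoms in the observable universe by ~90 orders of magnitude
print(legal_positions_19x19 / atoms_in_universe)  # ~2e90
```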

Because mastery of Go is not currently possible through brute computational power, AlphaGo won using a different approach. Its developers, the London-based company DeepMind, wrote an algorithm that simulates human learning. The algorithm was ‘trained’ to mimic expert human players, then ‘practised’ against versions of itself until it was able to play at the highest grandmaster level.

AlphaGo’s victory demonstrated the power of a new kind of computing: artificial intelligence. The significance of AI is that the DeepMind algorithm was not written to play Go – it was trained to play Go. And it could be trained to do other things.

The person in charge of deciding which other things it will do is DeepMind’s Head of Applied AI, Mustafa Suleyman. “We’re inundated with opportunities,” says Suleyman.

“So we prioritise which areas to focus on by finding opportunities to make a very meaningful difference. Not just something that’s incremental, but has the potential to be transformative. We look for the opportunity to have a meaningful social impact, so we want to work on product areas that can deliver sustainable business models, but in equal measure actually make the world a better place.”

This is not the first time Suleyman has spoken to the press about his altruistic aims for AI. In July 2015, Suleyman speculated in an interview with Wired magazine that DeepMind’s technology could have applications in healthcare. Among those reading that piece was Pearse Keane, an academic ophthalmologist from Moorfields Eye Hospital in London.

“I’m a bit of a tech nerd,” Keane admits, saying that he had been aware of DeepMind’s research for a couple of years. Reading the interview was the moment “a lightbulb went on in my head, that this should be applied to ophthalmology, and in particular to the type of imaging of the eye that I specialise in, which is called OCT – optical coherence tomography.”

The scanning of patients’ eyes using OCT – three-dimensional scans of the retina that are much better at revealing eye disease than traditional retina photography – is one of the biggest developments in modern ophthalmology.

However, as Keane explains, it is actually a growing impediment to detecting serious eye diseases in the NHS. “Approximately 5-10 per cent of high-street opticians now have OCT scanners in the UK. It’s not like having an MRI scanner – it’s about the size of a desktop computer. They’re usually pretty easy to use and it’s very quick and safe to acquire the scans.

“The problem is that they’ll offer to do the scans, but in many cases they don’t have the training or the experience to interpret them. So what they do is, if there’s any deviation whatsoever from the norm on the scan, they refer the patient urgently into Moorfields or other NHS hospitals to be seen.”

The result, says Keane, is “a huge number of false positive referrals. The people who actually do have sight-threatening disease are then delayed in getting in to be seen, because the system is overflowing.”

This swamping of services could not be happening at a worse time. Ophthalmology is already the second-busiest speciality in the NHS, with more than 9 million outpatient appointments per year.

What’s particularly frightening for Keane is that among the huge number of referrals produced by the roll-out of improved scanning are people who have recently developed a disease that will blind them if they are not treated in time.

“Often, someone could potentially have developed severe eye disease, and they could not get an appointment - even if it’s an urgent referral - for weeks, or sometimes longer. If someone’s in a situation where they’ve already lost their sight in one eye, and they’ve started to develop a problem in their other eye - you can imagine, psychologically, what that would be like.”

Keane also points out that this is not happening to an unlucky few, but to a horrifying number of people. “The most important disease, to my mind, is age-related macular degeneration, or AMD, and in particular the more severe form, which is known as ‘wet’ AMD – due to the leakage of fluid at the back of the eye.

Wet AMD is the most common cause of blindness in the UK, Europe and North America. It’s a massive problem. The Macular Society says that nearly 200 people develop the severe, blinding form of AMD every single day in the UK.”

These people need treating quickly. “If we intervene earlier, we have much better outcomes. If it was a family member of mine, I would want them to receive treatment within 48 hours. The national standard for wet AMD is that patients should be seen and treated within two weeks.

“The reality is that across the NHS, that target is not being met, and people are often waiting much, much longer than two weeks to actually receive treatment. The reason for that is that the system is being overwhelmed, and in particular by so many false-positive referrals.”

To make matters even worse, Keane foresaw an even greater inundation of OCT scans. “The big optician chains are talking about rolling out OCT scans across their whole chains – thousands of optometry practices. If there’s no way for us to deal with that, we’re in very, very big trouble.

“It’s as if every GP in the country was given an MRI scanner, but had very limited ability to interpret the scans. Every person who went in with a headache would get an MRI scan, and then they’d have to refer every single patient into the hospital.”

Fortunately, as a follower of DeepMind’s work, Keane knew that the vast amounts of data the OCT scanners were producing were, like moves on a Go board, exactly the kind of thing that can be used to train an AI. “The techniques that DeepMind uses, these deep reinforcement techniques, are successful in the context of large data sets, and Moorfields has probably the largest data sets in the world for many ophthalmic settings.

For just one of our devices, we had about 1.2 million OCT scans.” The AI pioneers were also, helpfully, just around the corner: “DeepMind is based in King’s Cross, and two of the co-founders were UCL alumni – Moorfields is affiliated with the Institute of Ophthalmology at UCL – so I thought I would be a fool not to capitalise on this. I contacted Mustafa through LinkedIn, and he, to my great delight, emailed back within an hour.”

For Suleyman, too, the problem arrived at an opportune moment. “In the last five years, we have made a lot of progress on some of the big milestones in AI. We now have very good speech recognition, very good translation, very good image labelling and image recognition. Many of the things that we try now seem to be working. We have much improved machine learning models, we’ve got access to very large-scale computers, and there’s increasingly enough training data to help us build effective models.”

The millions of OCT scans held by Moorfields presented the ideal dataset for DeepMind to apply its research. “If you think about the number of cases that each of the world’s very best ophthalmologists have seen during their careers,” says Suleyman, “aggregate all those cases in one place and show them to a machine, the machine learning system is going to have the benefit of a much, much wider set of experiences than any single human, or collection of the best humans, could have had during their career.”

While Suleyman says he finds the term ‘artificial intelligence’ unhelpful – “it’s imbued with a lot of anthropomorphic projection, it tends to conjure up the sense that this is a single coherent system that’s doing lots of different things, just like a human” – he is comfortable describing the AI’s interactions with its data as “experiences.”

“In some sense you can think of us replaying all of the scans to our machine learning system in the same way that an expert human might sit in front of their computer and watch scans and case studies over and over again. It’s what we call experience replay.”
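The "experience replay" Suleyman describes can be sketched as a buffer of stored examples that the model revisits many times, in random order, rather than seeing each one once. This is an illustrative sketch only, not DeepMind's implementation; all names here are hypothetical:

```python
import random
from collections import deque

class ReplayBuffer:
    """Minimal experience-replay buffer: stores past examples and
    serves random mini-batches, so the same 'experience' can be
    revisited across many training steps -- like an expert reviewing
    old case studies over and over again."""

    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)  # oldest entries evicted first

    def add(self, experience):
        self.buffer.append(experience)

    def sample(self, batch_size):
        # Draw a random mini-batch without replacement
        return random.sample(self.buffer, min(batch_size, len(self.buffer)))

# Hypothetical usage: replaying labelled scans to a learner
buffer = ReplayBuffer()
for scan_id in range(100):
    buffer.add({"scan": scan_id, "label": "AMD" if scan_id % 2 else "healthy"})
batch = buffer.sample(8)  # one mini-batch of stored experiences
```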

Suleyman says the AI also recalls or imagines things in a way that’s analogous to a human mind. “An ophthalmologist doesn’t recall a specific case study that she saw seven years ago. She has an abstracted, conceptual representation of, say, diabetic retinopathy or glaucoma - and that representation is built up through many, many examples of experience and teaching throughout her career.

Those things combine to create a short-form conceptual representation of the idea of the particular diagnosis. We do a very similar thing with our machine learning models - we replay, many times, lots of training instances of positive examples of the pathology that we’re trying to teach the system, and then over time it builds an abstract representation of that pathology and uses that to identify new pathologies when it encounters a new case.”

Just as AlphaGo used human-like judgement to master Go, the system being used with the Moorfields data is ‘imagining’ an abstract form of the disease it looks for, seeing it in its ‘mind’s eye’. “I think seeing it in its mind’s eye is a fair description. It’s generalising from past experiences and making an inference about the new example that it’s seeing at that moment.”

Keane says this is similar to the technology used “to look at photographs on Google Photos or Image Search, or Facebook, to recognise faces in the photos or to be able to recognise that there’s a cat or a dog or a man on a skateboard in the photo.

“The way that the neural networks work is, the raw data from the photograph or the OCT scan – the pixels – are fed into the neural network, and each layer within the network extracts different features from the picture or the OCT scan.

“So, for example, the lower layers of the network will extract very simple features - they might pick out edges, or contrasts between black and white, or other very low-level features – and as you rise up through the network, more abstract features are picked out, so it might recognise that two eyes and a nose indicate a face. And then finally, the output from the network is some type of classification, so it will say that it is 99 per cent certain that there’s a dog in the photo, and one per cent sure there’s a wolf in the photo.”
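The layered pipeline Keane describes can be caricatured in a few lines: each "layer" transforms the previous layer's features, and a final softmax turns class scores into the percentages he mentions. This is a toy sketch, nothing like the real network's scale; the layer functions and weights are invented for illustration:

```python
import math

def softmax(scores):
    # Convert raw class scores into probabilities that sum to 1
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def edge_layer(pixels):
    # Lowest layer: crude "edge" features = differences between neighbours
    return [abs(a - b) for a, b in zip(pixels, pixels[1:])]

def abstract_layer(features):
    # Higher layer: pool local features into coarser summaries
    return [sum(features[i:i + 4]) for i in range(0, len(features), 4)]

def classify(pixels, weights):
    # Raw pixels in, class probabilities out -- e.g. [P(dog), P(wolf)]
    feats = abstract_layer(edge_layer(pixels))
    scores = [sum(w * f for w, f in zip(ws, feats)) for ws in weights]
    return softmax(scores)

pixels = [0, 0, 9, 9, 0, 0, 9, 9, 0]        # a tiny "image" with strong edges
weights = [[0.5, 0.5], [-0.5, -0.5]]        # toy weights for two classes
probs = classify(pixels, weights)
```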

“We train the neural network using a huge number of examples that have labels; this is called supervised learning. We’re able to give it many thousands of OCTs that have diabetic retinopathy or AMD or other retinal diseases, and then we tweak the parameters of the network so that it can accurately recognise those diseases again.

“We then test the network on a dataset of fresh scans where it doesn’t know the label, and then it will tell us if it classes a scan as having diabetic retinopathy, or AMD.”
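The workflow Keane outlines - fit parameters on labelled examples, then evaluate on fresh scans whose labels are withheld - is the standard supervised-learning loop. A deliberately tiny sketch, with a single learned threshold standing in for a real network and invented feature values:

```python
def train(examples):
    # examples: (feature, label) pairs; label 1 = disease, 0 = healthy.
    diseased = [f for f, y in examples if y == 1]
    healthy = [f for f, y in examples if y == 0]
    # "Tweak the parameters": here, just pick the midpoint between class means
    return (sum(diseased) / len(diseased) + sum(healthy) / len(healthy)) / 2

def predict(threshold, feature):
    # Classify a fresh scan the model has never seen
    return 1 if feature > threshold else 0

# Labelled training set, then a held-out test set (labels hidden from the model)
train_set = [(0.9, 1), (0.8, 1), (0.2, 0), (0.1, 0)]
test_set = [(0.85, 1), (0.15, 0)]

threshold = train(train_set)
accuracy = sum(predict(threshold, f) == y for f, y in test_set) / len(test_set)
```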

Can the system spot eye diseases better than a human? Both Suleyman and Keane say that while it is currently very much a research project, they are optimistic that it will soon be able to ‘grade’ eye scans more effectively – and much more quickly and cheaply – than a human.

Keane says he expects people will be able to walk into a high-street optician, have an OCT scan and have it graded by an AI in “two or three years. I don’t think this is more than five years”, while Suleyman says mass adoption is “a reasonable thing to expect over a five-year period.”

DeepMind and Moorfields are not only breaking new ground in technological terms; the advent of AI in healthcare will require new regulation, too. If the eye diseases Keane is hunting were identified by a chemical indicator, that indicator would be subject to approval by the Medicines and Healthcare products Regulatory Agency in the UK, the FDA in the US and other bodies around the world.

And while the use of machine learning is physically non-invasive, the huge reserves of data that the NHS has to offer AI companies are the private property of millions of individuals. It is this data that gives machine learning its formidable power, and the NHS is in a unique position to offer huge, well-labelled datasets; how it is shared, who gets to use it and who gets to profit from it are questions that could fail to be properly answered in the rush to implement this important new technology.

There is no doubt, however, that these questions will need to be answered, because AI is coming to healthcare, soon, and in a very big way. Suleyman predicts that machine learning will become hugely valuable in diagnosing conditions earlier and planning treatment – DeepMind is also working on a separate project that could “massively speed up the process of planning for radiotherapy” – but he says doctors are not the only ones who may find themselves working alongside AI. Managers, too, could be disrupted.

“The hospital environment is such an expensive and complex system. One of the reasons why I think it’s reaching breaking point is that humans are simply overwhelmed by the scale and complexity of managing so many patients who are on so many different pathways, who need so many different tests and interventions. It becomes a massive co-ordination exercise. So, one of the things we’re increasingly thinking about is how we efficiently and speedily prioritise the tasks that get done in different areas of the hospital.”

Few would dispute that the NHS is beginning to creak under its own complexity; AI promises to parse this tangled problem with fast and tireless concentration.

For Keane, this is an opportunity to be seized. “In the next couple of years, we need to work to build on those advantages, because we might have a head start, but that might not be there indefinitely.”

Will Dunn is the New Statesman's Special Projects Editor. 


Jonathan Ashworth: “If you take £4.5bn out of social care, it will hit the NHS”

The new shadow health secretary discusses Labour's health policy

Immediately to the left of Jonathan Ashworth’s desk in Portcullis House are a large flipchart and a poster.

The flipchart details the places Ashworth will visit in the coming week, when he tours different towns and cities discussing the NHS with “people who work in the NHS, patient groups, royal colleges, staff groups, people who aren’t part of a group but have an interest in the future of the NHS.”

The poster shows what was known, for a week or so in 2015, as the ‘Maggie Simpson map’: a Britain colour-coded by general election results, with an almost entirely Conservative-blue body and an SNP-yellow head.

In the centre of the coffee table is a copy of The House magazine, with Yvette Cooper on the cover. “That’s a coincidence!” laughs Ashworth, who nominated Cooper in the leadership election in 2015. He didn’t declare support for either Jeremy Corbyn or Owen Smith in last year’s rematch. Though he would reject the comparison, both Ashworth and Theresa May carried off a very successful 2016 by standing back while their parties fought bitterly amongst themselves. Previously a shadow minister without portfolio, Ashworth was promoted in October to one of the biggest roles on the opposition front bench, following in the steps of Andy Burnham and Diane Abbott.

It is a good week to be shadow health secretary. Three days before we meet, Jeremy Hunt was forced to redraw the four-hour waiting threshold for A&E as a target for “urgent health problems… but not all health problems”, while the British Red Cross chief executive described NHS hospital and ambulance services as a “humanitarian crisis”. The following day, Hunt was forced to admit to the Commons that the NHS had failed to provide mental health support to the 18-year-old daughter of one of its own nurses. The day we meet, the National Audit Office has reported that Hunt’s plans for seven-day GP access were made “despite not having evaluated the cost-effectiveness of their proposals and without having consistently provided value for money from the existing services”. But Ashworth says he doesn’t take any satisfaction from his opponent’s tough week.

“I’m not one for demonising Jeremy; I rather like Jeremy. I’ll probably get lots of criticism on social media for saying that. But if we get into a debate about personalities, we’re missing the bigger picture – the systematic underfunding of the NHS, and deep cuts to social care.” Does Ashworth agree with the Red Cross? Is the NHS at breaking point? 

“I think we have to be responsible in the language we use. It is certainly a winter crisis. A&E targets have been missed again, 20 hospitals have had to declare black alert. In the last few weeks, 50 trusts have put out messages saying they can’t cope, several hospitals say they don’t think they can offer comprehensive care and 140 hospitals, at least, are effectively turning people away from A&E. It is certainly a crisis this winter. What is outrageous is that the government were continually warned about this. They’ve been consistently warned that unless they do something to solve the social care crisis in this country, it’ll continue to put undue pressures on the wider NHS. And that is what we are seeing now. We’ve had six years of multi-billion pound cuts in social care. If you take £4.5 billion out of the social care sector, that is a lot of elderly and very vulnerable people not getting the support they deserve and need. It’s inevitable that it’s going to impact on the NHS.”

Funding is the primary issue for Ashworth. “The government insists it's given the NHS an extra £10bn. Those claims have been eviscerated this week by Simon Stevens at the Public Accounts Committee. The reality is that it’s the biggest financial squeeze in its history. It’s effectively flat-lining, and indeed in 2018, as Simon Stevens said, head-for-head expenditure in this country will actually be cut in the NHS. Whenever you put this to Theresa May, she just stubbornly refuses to listen. She keeps saying we’ve had £10bn, even though expert after expert after expert has knocked this down."

Ashworth says there needs to be “objectivity in the debate about the finances of the NHS” and that a new body, similar to the Office for Budget Responsibility, might offer this. By making reports to the health secretary as the OBR does to the chancellor, such a body would, says Ashworth, “allow parliament to scrutinise the judgements they are making about the allocation of resources. Government ministers have to be answerable to the people, ultimately, for the decisions they make for NHS financing.”

Few would argue that NHS spending is as well-planned as it could be, but the money does have to exist for it to be allocated. Where would Labour find the funds to save the NHS?

“This government has chosen a multi-billion pound tax cut for corporations and a big capital gains tax cut for share transactions. It’s chosen to give the very wealthiest estates in the country a £1bn inheritance tax cut. In the most recent Autumn Statement, there was nothing for social care and yet there were millions allocated in capital expenditure for new grammar schools. And yet we have the NHS having to raid its capital budgets. So in the immediate term, I would say they do not need to make that set of decisions.”

So Labour would increase taxes and reduce spending on education?  “We don’t know what we would inherit in 2020. These will be the decisions for the chancellor, for his or her first budget. But if we were in government now, we would be making a different set of decisions now, to help the health service.”

Previous Labour governments – for which Ashworth worked as a special advisor and Gordon Brown’s deputy political secretary – found the money for over 100 new hospitals in Private Finance Initiative (PFI) schemes. Many of these schemes involve repaying three to five times the build cost of the hospital – and in some cases, up to seven times the cost – over decades. The last scheme begun by Labour will not be paid off until 2049. Would Ashworth consider using PFI again?

“I think the days of PFI are over,” he says. “I think Jeremy Corbyn has been pretty clear on that.” Does he defend PFI? “The PFI contracts delivered huge numbers of new buildings, and we wanted new buildings. There are a handful of contracts that were not negotiated well. In fairness, a lot of those were inherited from the previous John Major government. But Jeremy Corbyn has said that he would want to have a look at these contracts, if he can, in government. It’s something that I know he and John McDonnell are looking at carefully. But we’re not going to have PFI contracts in the future.”

Opening the NHS up to more outsourcing is also off the table, says Ashworth. “The Health and Social Care Act pushed us in the direction of greater fragmentation, of greater outsourcing to private companies. I think it is a damaging piece of legislation, and we are committed to reversing that. We don’t see why these companies should be able to come in – they try to make a profit, they don’t, and then they leave. And then sometimes they rely on the public sector to pick up the pieces again.”

Some recent health policy is worth keeping, however. Ashworth says the sustainability and transformation plans (STPs) introduced in 2015 “could mean more planning at a local level. They could mean a more strategic delivery of local health services. That is something we would be in favour of, and it’s why we used to have strategic health authorities. The problem is, what started off as a way to get greater collaboration in the system now increasingly looks like a way of filling financial gaps. If you look at all of the STPs now, and you look at their ‘do-nothing’ proposals - maintaining services as they are now - it adds up to £21.8bn. That is the gap they’ve got to find. So they’re proposing cutting beds, downgrading hospitals or maternity wings or A&E departments, while not explaining how there will be greater provision in the wider community, not explaining what social care provision will be put in place, not explaining what acute sector provision will be in place. It’s a cart-before-the-horse argument.”

Ashworth is at his most emotive when discussing Labour’s electoral ambitions. “I sit on the opposition benches, and you see casual dismissal from the Tories of the condition of Britain. That does make me angry. I make no apologies for wanting Labour to win. It isn’t a game. It is the ultimate betrayal of the people we’re in politics to represent if we’re not completely focused, laser-like, on winning elections.”

