The day Watson the supercomputer learned to swear

Could this actually be a breakthrough?

IBM has been trying to get a supercomputer (Watson) to pass the Turing test - a test that gauges how intelligent - or human-like - a machine is.

But when they tried to teach Watson some phrases from Urban Dictionary, they accidentally taught him how to swear. Then they couldn't get him to stop. Here's an extract from an interview with Ed Brown that appeared in Fortune magazine.

But Watson couldn't distinguish between polite language and profanity -- which the Urban Dictionary is full of. Watson picked up some bad habits from reading Wikipedia as well. In tests it even used the word "bullshit" in an answer to a researcher's query.

Ultimately, Brown's 35-person team developed a filter to keep Watson from swearing and wiped the Urban Dictionary from its memory. But the trial proves just how thorny it will be to get artificial intelligence to communicate naturally. Brown is now training Watson as a diagnostic tool for hospitals. No knowledge of OMG required.
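For the curious, the simplest version of such a filter is just a blocklist check run over the system's output before anyone sees it. The Python sketch below is purely illustrative - the blocklist, the filter_profanity function and the placeholder text are my own inventions, and IBM has never published how Watson's actual filter works.

```python
import re

# Toy blocklist - illustrative entries only; the real list is not public.
BLOCKLIST = {"bullshit", "damn", "crap"}

# One regex that matches any blocklisted term as a whole word.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(word) for word in sorted(BLOCKLIST)) + r")\b",
    re.IGNORECASE,
)

def filter_profanity(text: str, replacement: str = "[bleep]") -> str:
    """Replace any blocklisted word in the output with a placeholder."""
    return PATTERN.sub(replacement, text)

print(filter_profanity("That answer is bullshit."))
# prints: That answer is [bleep].
```

A production system would need far more than this - misspellings, new coinages and context all defeat a plain wordlist - which is presumably part of why the team found the problem so thorny.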

It sounds like they made a breakthrough. The Turing test supposedly tests artificial intelligence - but this is a misnomer, as it actually tests for all human behaviours, not just "intelligent" ones. Points are given for idiosyncratic behaviour: susceptibility to insults, temptation to lie, and even typing errors (the first Loebner winner's victory was due partly to its ability to "imitate human typing errors").
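To see how trivially that last trick can be imitated, here is a toy Python sketch that randomly transposes adjacent letters in a reply before sending it. It is a guess at the general idea only - the function name and error rate are invented, and the first Loebner winner's actual method is not documented here.

```python
import random

def imitate_typing_errors(text: str, rate: float = 0.05) -> str:
    """Occasionally swap adjacent letters to mimic a human mistyping."""
    chars = list(text)
    i = 0
    while i < len(chars) - 1:
        if chars[i].isalpha() and chars[i + 1].isalpha() and random.random() < rate:
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
            i += 2  # skip past the swapped pair
        else:
            i += 1
    return "".join(chars)

random.seed(42)  # fixed seed so the example is reproducible
print(imitate_typing_errors("I am definitely a human being, not a computer."))
```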

Picking up swearing a little too easily? Responding to a supposedly straight question with "bullshit"? Sounds like Watson's becoming more human by the minute.

[Image: A computer that swears back. Photograph: Getty Images]

The surprising truth about ingrowing toenails (and other medical myths)


From time to time, I remove patients’ ingrowing toenails. This is done to help – the condition can be intractably painful – but it would be barbaric were it not for anaesthesia. A toe or finger can be rendered completely numb by a ring block – local anaesthetic injected either side of the base of the digit, knocking out the nerves that supply sensation.

The local anaesthetic I use for most surgical procedures is ready-mixed with adrenalin, which constricts the arteries and thereby reduces bleeding in the surgical field, but ever since medical school I’ve had it drummed into me that using adrenalin is a complete no-no when it comes to ring blocks. The adrenalin cuts off the blood supply to the end of the digit (so the story goes), resulting in tissue death and gangrene.

So, before performing any ring block, my practice nurse and I go through an elaborate double-check procedure to ensure that the injection I’m about to use is “plain” local anaesthetic with no adrenalin. This same ritual is observed in hospitals and doctors’ surgeries around the world.

So, imagine my surprise to learn recently that this is a myth. The idea dates back at least a century, to when doctors frequently found digits turning gangrenous after ring blocks. The obvious conclusion – that artery-constricting adrenalin was responsible – dictates practice to this day. In recent years, however, the dogma has been questioned. The effect of adrenalin is partial and short-lived; could it really be causing such catastrophic outcomes?

Retrospective studies of digital gangrene after ring block found that adrenalin had actually been used in fewer than half of the cases. Rather, other factors, including the drastic measures employed to try to prevent infection in the pre-antibiotic era, seem likely to have been the culprits. Emboldened by these findings, surgeons in America undertook cautious trials to investigate using adrenalin in ring blocks. They found that it caused no tissue damage, and made surgery technically easier.

Those trials date back 15 years, yet they've only just filtered through, which illustrates how long it takes for new thinking to be disseminated. So far, a few doctors, mainly those in the field of plastic surgery, have changed their practice, but most of us continue to eschew adrenalin.

Medicine is littered with such myths. For years we doled out antibiotics for minor infections, thinking we were speeding recovery. Until the mid-1970s, breast cancer was routinely treated with radical mastectomy, a disfiguring operation that removed huge quantities of tissue, in the belief that this produced the greatest chance of cure. These days, we know that conservative surgery is at least as effective, and causes far less psychological trauma. Seizures can happen in young children with feverish illnesses, so for decades we placed great emphasis on keeping the patient’s temperature down. We now know that controlling fever makes no difference: the fits are caused by other chemicals released during an infection.

Myths arise when something appears to make sense according to the best understanding we have at the time. In all cases, practice has run far ahead of objective, repeatable science. It is only years after a myth has taken hold that scientific evaluation shows us to have charged off down a blind alley.

Myths are powerful and hard to uproot, even once the science is established. I operated on a toenail just the other week and still baulked at using adrenalin – partly my own superstition, and partly to save my practice nurse from a heart attack. What would it have been like as a pioneering surgeon in the 1970s, treating breast cancer with a simple lumpectomy while most of your colleagues believed you were being reckless with your patients’ future health? Decades of dire warnings create a hefty weight to overturn.

Only once a good proportion of the medical herd has changed course do most of us feel confident to follow suit. 

This article first appeared in the 20 April 2017 issue of the New Statesman, May's gamble
